Jul 15 05:16:53.849988 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Jul 15 03:28:48 -00 2025
Jul 15 05:16:53.850011 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:16:53.850019 kernel: BIOS-provided physical RAM map:
Jul 15 05:16:53.850025 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jul 15 05:16:53.850031 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jul 15 05:16:53.850036 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jul 15 05:16:53.850044 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Jul 15 05:16:53.850049 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Jul 15 05:16:53.850055 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jul 15 05:16:53.850060 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jul 15 05:16:53.850066 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jul 15 05:16:53.850071 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jul 15 05:16:53.850076 kernel: NX (Execute Disable) protection: active
Jul 15 05:16:53.850082 kernel: APIC: Static calls initialized
Jul 15 05:16:53.850090 kernel: SMBIOS 2.8 present.
Jul 15 05:16:53.850096 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Jul 15 05:16:53.850102 kernel: DMI: Memory slots populated: 1/1
Jul 15 05:16:53.850108 kernel: Hypervisor detected: KVM
Jul 15 05:16:53.850114 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jul 15 05:16:53.850120 kernel: kvm-clock: using sched offset of 4334179119 cycles
Jul 15 05:16:53.850126 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jul 15 05:16:53.850132 kernel: tsc: Detected 2445.406 MHz processor
Jul 15 05:16:53.850140 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jul 15 05:16:53.850146 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jul 15 05:16:53.850152 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Jul 15 05:16:53.850158 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jul 15 05:16:53.850164 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jul 15 05:16:53.850170 kernel: Using GB pages for direct mapping
Jul 15 05:16:53.850189 kernel: ACPI: Early table checksum verification disabled
Jul 15 05:16:53.850195 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Jul 15 05:16:53.850201 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:16:53.850209 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:16:53.850215 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:16:53.850221 kernel: ACPI: FACS 0x000000007CFE0000 000040
Jul 15 05:16:53.850227 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:16:53.850233 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:16:53.850262 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:16:53.850268 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:16:53.850274 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Jul 15 05:16:53.850283 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Jul 15 05:16:53.850291 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Jul 15 05:16:53.850298 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Jul 15 05:16:53.850304 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Jul 15 05:16:53.850310 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Jul 15 05:16:53.850316 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Jul 15 05:16:53.850324 kernel: No NUMA configuration found
Jul 15 05:16:53.850331 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Jul 15 05:16:53.850337 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Jul 15 05:16:53.850343 kernel: Zone ranges:
Jul 15 05:16:53.850349 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jul 15 05:16:53.850356 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Jul 15 05:16:53.850362 kernel: Normal empty
Jul 15 05:16:53.850368 kernel: Device empty
Jul 15 05:16:53.850374 kernel: Movable zone start for each node
Jul 15 05:16:53.850382 kernel: Early memory node ranges
Jul 15 05:16:53.850388 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jul 15 05:16:53.850394 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Jul 15 05:16:53.850401 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Jul 15 05:16:53.850407 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jul 15 05:16:53.850413 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jul 15 05:16:53.850419 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Jul 15 05:16:53.850425 kernel: ACPI: PM-Timer IO Port: 0x608
Jul 15 05:16:53.850431 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jul 15 05:16:53.850437 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jul 15 05:16:53.850445 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jul 15 05:16:53.850452 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jul 15 05:16:53.850458 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jul 15 05:16:53.850464 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jul 15 05:16:53.850470 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jul 15 05:16:53.850476 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jul 15 05:16:53.850482 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Jul 15 05:16:53.850489 kernel: CPU topo: Max. logical packages: 1
Jul 15 05:16:53.850495 kernel: CPU topo: Max. logical dies: 1
Jul 15 05:16:53.850502 kernel: CPU topo: Max. dies per package: 1
Jul 15 05:16:53.850509 kernel: CPU topo: Max. threads per core: 1
Jul 15 05:16:53.850515 kernel: CPU topo: Num. cores per package: 2
Jul 15 05:16:53.850521 kernel: CPU topo: Num. threads per package: 2
Jul 15 05:16:53.850528 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jul 15 05:16:53.850534 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jul 15 05:16:53.850540 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jul 15 05:16:53.850546 kernel: Booting paravirtualized kernel on KVM
Jul 15 05:16:53.850552 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jul 15 05:16:53.850560 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jul 15 05:16:53.850566 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jul 15 05:16:53.850573 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jul 15 05:16:53.850579 kernel: pcpu-alloc: [0] 0 1
Jul 15 05:16:53.850585 kernel: kvm-guest: PV spinlocks disabled, no host support
Jul 15 05:16:53.850592 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:16:53.850599 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jul 15 05:16:53.850605 kernel: random: crng init done
Jul 15 05:16:53.850613 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jul 15 05:16:53.850619 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jul 15 05:16:53.850625 kernel: Fallback order for Node 0: 0
Jul 15 05:16:53.850631 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Jul 15 05:16:53.850637 kernel: Policy zone: DMA32
Jul 15 05:16:53.850643 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jul 15 05:16:53.850650 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jul 15 05:16:53.850656 kernel: ftrace: allocating 40097 entries in 157 pages
Jul 15 05:16:53.850662 kernel: ftrace: allocated 157 pages with 5 groups
Jul 15 05:16:53.850670 kernel: Dynamic Preempt: voluntary
Jul 15 05:16:53.850676 kernel: rcu: Preemptible hierarchical RCU implementation.
Jul 15 05:16:53.850683 kernel: rcu: RCU event tracing is enabled.
Jul 15 05:16:53.850689 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jul 15 05:16:53.850696 kernel: Trampoline variant of Tasks RCU enabled.
Jul 15 05:16:53.850702 kernel: Rude variant of Tasks RCU enabled.
Jul 15 05:16:53.850708 kernel: Tracing variant of Tasks RCU enabled.
Jul 15 05:16:53.850715 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jul 15 05:16:53.850721 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jul 15 05:16:53.850729 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 05:16:53.850735 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 05:16:53.850741 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 05:16:53.850748 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jul 15 05:16:53.850754 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jul 15 05:16:53.850760 kernel: Console: colour VGA+ 80x25
Jul 15 05:16:53.850766 kernel: printk: legacy console [tty0] enabled
Jul 15 05:16:53.850772 kernel: printk: legacy console [ttyS0] enabled
Jul 15 05:16:53.850778 kernel: ACPI: Core revision 20240827
Jul 15 05:16:53.850787 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Jul 15 05:16:53.850798 kernel: APIC: Switch to symmetric I/O mode setup
Jul 15 05:16:53.850805 kernel: x2apic enabled
Jul 15 05:16:53.850813 kernel: APIC: Switched APIC routing to: physical x2apic
Jul 15 05:16:53.850819 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jul 15 05:16:53.850826 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns
Jul 15 05:16:53.850833 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Jul 15 05:16:53.850839 kernel: AMD Zen1 DIV0 bug detected. Disable SMT for full protection.
Jul 15 05:16:53.850845 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jul 15 05:16:53.850852 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jul 15 05:16:53.850858 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jul 15 05:16:53.850867 kernel: Spectre V2 : Mitigation: Retpolines
Jul 15 05:16:53.850873 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jul 15 05:16:53.850880 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jul 15 05:16:53.850886 kernel: RETBleed: Mitigation: untrained return thunk
Jul 15 05:16:53.850893 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jul 15 05:16:53.850899 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jul 15 05:16:53.850907 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jul 15 05:16:53.850915 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jul 15 05:16:53.850921 kernel: x86/bugs: return thunk changed
Jul 15 05:16:53.850927 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jul 15 05:16:53.850934 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jul 15 05:16:53.850940 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jul 15 05:16:53.850947 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jul 15 05:16:53.850953 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jul 15 05:16:53.850962 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jul 15 05:16:53.850968 kernel: Freeing SMP alternatives memory: 32K
Jul 15 05:16:53.850974 kernel: pid_max: default: 32768 minimum: 301
Jul 15 05:16:53.850981 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jul 15 05:16:53.850987 kernel: landlock: Up and running.
Jul 15 05:16:53.850994 kernel: SELinux: Initializing.
Jul 15 05:16:53.851003 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 15 05:16:53.851016 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 15 05:16:53.851033 kernel: smpboot: CPU0: AMD EPYC Processor (with IBPB) (family: 0x17, model: 0x1, stepping: 0x2)
Jul 15 05:16:53.851048 kernel: Performance Events: AMD PMU driver.
Jul 15 05:16:53.851059 kernel: ... version: 0
Jul 15 05:16:53.851069 kernel: ... bit width: 48
Jul 15 05:16:53.851079 kernel: ... generic registers: 4
Jul 15 05:16:53.851090 kernel: ... value mask: 0000ffffffffffff
Jul 15 05:16:53.851101 kernel: ... max period: 00007fffffffffff
Jul 15 05:16:53.851110 kernel: ... fixed-purpose events: 0
Jul 15 05:16:53.851117 kernel: ... event mask: 000000000000000f
Jul 15 05:16:53.851123 kernel: signal: max sigframe size: 1776
Jul 15 05:16:53.851132 kernel: rcu: Hierarchical SRCU implementation.
Jul 15 05:16:53.851139 kernel: rcu: Max phase no-delay instances is 400.
Jul 15 05:16:53.851151 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jul 15 05:16:53.851162 kernel: smp: Bringing up secondary CPUs ...
Jul 15 05:16:53.851170 kernel: smpboot: x86: Booting SMP configuration:
Jul 15 05:16:53.851190 kernel: .... node #0, CPUs: #1
Jul 15 05:16:53.851197 kernel: smp: Brought up 1 node, 2 CPUs
Jul 15 05:16:53.851203 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Jul 15 05:16:53.851210 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54608K init, 2360K bss, 125140K reserved, 0K cma-reserved)
Jul 15 05:16:53.851219 kernel: devtmpfs: initialized
Jul 15 05:16:53.851226 kernel: x86/mm: Memory block size: 128MB
Jul 15 05:16:53.851232 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jul 15 05:16:53.851275 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jul 15 05:16:53.851282 kernel: pinctrl core: initialized pinctrl subsystem
Jul 15 05:16:53.851289 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jul 15 05:16:53.851295 kernel: audit: initializing netlink subsys (disabled)
Jul 15 05:16:53.851302 kernel: audit: type=2000 audit(1752556610.692:1): state=initialized audit_enabled=0 res=1
Jul 15 05:16:53.851309 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jul 15 05:16:53.851318 kernel: thermal_sys: Registered thermal governor 'user_space'
Jul 15 05:16:53.851324 kernel: cpuidle: using governor menu
Jul 15 05:16:53.851332 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jul 15 05:16:53.851343 kernel: dca service started, version 1.12.1
Jul 15 05:16:53.851354 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Jul 15 05:16:53.851365 kernel: PCI: Using configuration type 1 for base access
Jul 15 05:16:53.851374 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jul 15 05:16:53.851381 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jul 15 05:16:53.851388 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jul 15 05:16:53.851396 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jul 15 05:16:53.851403 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jul 15 05:16:53.851409 kernel: ACPI: Added _OSI(Module Device)
Jul 15 05:16:53.851416 kernel: ACPI: Added _OSI(Processor Device)
Jul 15 05:16:53.851422 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jul 15 05:16:53.851429 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jul 15 05:16:53.851435 kernel: ACPI: Interpreter enabled
Jul 15 05:16:53.851441 kernel: ACPI: PM: (supports S0 S5)
Jul 15 05:16:53.851448 kernel: ACPI: Using IOAPIC for interrupt routing
Jul 15 05:16:53.851456 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jul 15 05:16:53.851463 kernel: PCI: Using E820 reservations for host bridge windows
Jul 15 05:16:53.851469 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jul 15 05:16:53.851476 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jul 15 05:16:53.851678 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jul 15 05:16:53.851801 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jul 15 05:16:53.851905 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jul 15 05:16:53.851917 kernel: PCI host bridge to bus 0000:00
Jul 15 05:16:53.852026 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jul 15 05:16:53.852119 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jul 15 05:16:53.852260 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jul 15 05:16:53.852363 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Jul 15 05:16:53.852455 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jul 15 05:16:53.852546 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Jul 15 05:16:53.852641 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jul 15 05:16:53.852789 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jul 15 05:16:53.852966 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jul 15 05:16:53.853122 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Jul 15 05:16:53.853330 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Jul 15 05:16:53.853484 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Jul 15 05:16:53.853629 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Jul 15 05:16:53.853770 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jul 15 05:16:53.853931 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.854092 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Jul 15 05:16:53.854280 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 05:16:53.854440 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Jul 15 05:16:53.854594 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Jul 15 05:16:53.854838 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.855002 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Jul 15 05:16:53.855152 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 05:16:53.856420 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Jul 15 05:16:53.856587 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jul 15 05:16:53.856751 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.856908 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Jul 15 05:16:53.857109 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 05:16:53.859335 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Jul 15 05:16:53.859510 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jul 15 05:16:53.859696 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.859866 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Jul 15 05:16:53.860078 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 05:16:53.860545 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Jul 15 05:16:53.860713 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jul 15 05:16:53.860883 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.861053 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Jul 15 05:16:53.862267 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 05:16:53.862453 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Jul 15 05:16:53.862627 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jul 15 05:16:53.862795 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.862973 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Jul 15 05:16:53.863142 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 05:16:53.864104 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Jul 15 05:16:53.864321 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jul 15 05:16:53.864508 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.864682 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Jul 15 05:16:53.864847 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 05:16:53.865017 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Jul 15 05:16:53.865215 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jul 15 05:16:53.865416 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.865584 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Jul 15 05:16:53.865745 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 05:16:53.865892 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Jul 15 05:16:53.866409 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jul 15 05:16:53.866595 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:16:53.866765 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Jul 15 05:16:53.866932 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 05:16:53.867074 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jul 15 05:16:53.868276 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jul 15 05:16:53.868468 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jul 15 05:16:53.868629 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jul 15 05:16:53.868806 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jul 15 05:16:53.868976 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Jul 15 05:16:53.869145 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Jul 15 05:16:53.870373 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jul 15 05:16:53.870552 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Jul 15 05:16:53.870750 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 05:16:53.870921 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Jul 15 05:16:53.871082 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Jul 15 05:16:53.873301 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Jul 15 05:16:53.873478 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 05:16:53.873672 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jul 15 05:16:53.873844 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Jul 15 05:16:53.874014 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 05:16:53.874226 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Jul 15 05:16:53.874430 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Jul 15 05:16:53.874604 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Jul 15 05:16:53.874769 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 05:16:53.874955 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 05:16:53.875135 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Jul 15 05:16:53.876373 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 05:16:53.876573 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 05:16:53.876751 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Jul 15 05:16:53.876917 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 05:16:53.877113 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Jul 15 05:16:53.877333 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Jul 15 05:16:53.877499 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Jul 15 05:16:53.877672 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 05:16:53.877691 kernel: acpiphp: Slot [0] registered
Jul 15 05:16:53.877858 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 05:16:53.878016 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Jul 15 05:16:53.878193 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Jul 15 05:16:53.880411 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Jul 15 05:16:53.880578 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 05:16:53.880603 kernel: acpiphp: Slot [0-2] registered
Jul 15 05:16:53.880754 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 05:16:53.880771 kernel: acpiphp: Slot [0-3] registered
Jul 15 05:16:53.880923 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 05:16:53.880943 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jul 15 05:16:53.880956 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jul 15 05:16:53.880969 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jul 15 05:16:53.880981 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jul 15 05:16:53.880993 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jul 15 05:16:53.881013 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jul 15 05:16:53.881025 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jul 15 05:16:53.881037 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jul 15 05:16:53.881049 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jul 15 05:16:53.881061 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jul 15 05:16:53.881074 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jul 15 05:16:53.881087 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jul 15 05:16:53.881099 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jul 15 05:16:53.881111 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jul 15 05:16:53.881127 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jul 15 05:16:53.881139 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jul 15 05:16:53.881151 kernel: iommu: Default domain type: Translated
Jul 15 05:16:53.881163 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jul 15 05:16:53.881193 kernel: PCI: Using ACPI for IRQ routing
Jul 15 05:16:53.881205 kernel: PCI: pci_cache_line_size set to 64 bytes
Jul 15 05:16:53.881217 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jul 15 05:16:53.881229 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Jul 15 05:16:53.881419 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jul 15 05:16:53.881579 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jul 15 05:16:53.881740 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jul 15 05:16:53.881758 kernel: vgaarb: loaded
Jul 15 05:16:53.881769 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Jul 15 05:16:53.881781 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Jul 15 05:16:53.881793 kernel: clocksource: Switched to clocksource kvm-clock
Jul 15 05:16:53.881804 kernel: VFS: Disk quotas dquot_6.6.0
Jul 15 05:16:53.881817 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jul 15 05:16:53.881833 kernel: pnp: PnP ACPI init
Jul 15 05:16:53.882008 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jul 15 05:16:53.882027 kernel: pnp: PnP ACPI: found 5 devices
Jul 15 05:16:53.882039 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jul 15 05:16:53.882051 kernel: NET: Registered PF_INET protocol family
Jul 15 05:16:53.882063 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jul 15 05:16:53.882075 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jul 15 05:16:53.882087 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jul 15 05:16:53.882099 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jul 15 05:16:53.882115 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jul 15 05:16:53.882127 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jul 15 05:16:53.882139 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jul 15 05:16:53.882150 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jul 15 05:16:53.882162 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jul 15 05:16:53.882174 kernel: NET: Registered PF_XDP protocol family
Jul 15 05:16:53.884401 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jul 15 05:16:53.884572 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jul 15 05:16:53.884740 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jul 15 05:16:53.884911 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Jul 15 05:16:53.885097 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Jul 15 05:16:53.885310 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Jul 15 05:16:53.885476 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 05:16:53.885638 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Jul 15 05:16:53.885842 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Jul 15 05:16:53.887469 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 05:16:53.887639 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Jul 15 05:16:53.887796 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jul 15 05:16:53.887941 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 05:16:53.888086 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Jul 15 05:16:53.888263 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jul 15 05:16:53.888413 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 05:16:53.888554 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Jul 15 05:16:53.888714 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jul 15 05:16:53.888878 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 05:16:53.889040 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Jul 15 05:16:53.889216 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jul 15 05:16:53.889425 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 05:16:53.889593 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Jul 15 05:16:53.889753 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jul 15 05:16:53.889912 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 05:16:53.890070 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Jul 15 05:16:53.890267 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Jul 15 05:16:53.890437 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jul 15 05:16:53.890597 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 05:16:53.890755 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Jul 15 05:16:53.890912 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Jul 15 05:16:53.891071 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jul 15 05:16:53.891275 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 05:16:53.891439 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Jul 15 05:16:53.891610 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jul 15 05:16:53.891781 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jul 15 05:16:53.891936 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jul 15 05:16:53.892077 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jul 15 05:16:53.892262 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jul 15 05:16:53.892400 kernel: pci_bus 
0000:00: resource 7 [mem 0x7d000000-0xafffffff window] Jul 15 05:16:53.892551 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Jul 15 05:16:53.892702 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] Jul 15 05:16:53.892862 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff] Jul 15 05:16:53.893018 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref] Jul 15 05:16:53.893212 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff] Jul 15 05:16:53.893396 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Jul 15 05:16:53.893550 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff] Jul 15 05:16:53.893707 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Jul 15 05:16:53.893867 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff] Jul 15 05:16:53.894017 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Jul 15 05:16:53.894191 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff] Jul 15 05:16:53.894398 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Jul 15 05:16:53.894553 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff] Jul 15 05:16:53.894706 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Jul 15 05:16:53.894862 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Jul 15 05:16:53.895016 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff] Jul 15 05:16:53.895170 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Jul 15 05:16:53.895396 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Jul 15 05:16:53.895553 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff] Jul 15 05:16:53.895711 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Jul 15 05:16:53.895890 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Jul 15 05:16:53.896053 kernel: pci_bus 
0000:09: resource 1 [mem 0xfd800000-0xfd9fffff] Jul 15 05:16:53.896271 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Jul 15 05:16:53.896295 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jul 15 05:16:53.896308 kernel: PCI: CLS 0 bytes, default 64 Jul 15 05:16:53.896321 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns Jul 15 05:16:53.896333 kernel: Initialise system trusted keyrings Jul 15 05:16:53.896349 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jul 15 05:16:53.896361 kernel: Key type asymmetric registered Jul 15 05:16:53.896373 kernel: Asymmetric key parser 'x509' registered Jul 15 05:16:53.896384 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jul 15 05:16:53.896396 kernel: io scheduler mq-deadline registered Jul 15 05:16:53.896408 kernel: io scheduler kyber registered Jul 15 05:16:53.896420 kernel: io scheduler bfq registered Jul 15 05:16:53.896593 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jul 15 05:16:53.896767 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jul 15 05:16:53.896951 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jul 15 05:16:53.897125 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jul 15 05:16:53.897348 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jul 15 05:16:53.897514 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jul 15 05:16:53.897677 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jul 15 05:16:53.897844 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jul 15 05:16:53.898017 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jul 15 05:16:53.898193 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jul 15 05:16:53.898385 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jul 15 05:16:53.898558 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jul 15 05:16:53.898728 kernel: pcieport 0000:00:02.6: PME: 
Signaling with IRQ 30 Jul 15 05:16:53.898898 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jul 15 05:16:53.899076 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jul 15 05:16:53.899288 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jul 15 05:16:53.899310 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jul 15 05:16:53.899478 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jul 15 05:16:53.899642 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jul 15 05:16:53.899664 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jul 15 05:16:53.899678 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jul 15 05:16:53.899691 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 15 05:16:53.899704 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jul 15 05:16:53.899716 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jul 15 05:16:53.899727 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jul 15 05:16:53.899743 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jul 15 05:16:53.899916 kernel: rtc_cmos 00:03: RTC can wake from S4 Jul 15 05:16:53.899938 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jul 15 05:16:53.900092 kernel: rtc_cmos 00:03: registered as rtc0 Jul 15 05:16:53.900299 kernel: rtc_cmos 00:03: setting system clock to 2025-07-15T05:16:53 UTC (1752556613) Jul 15 05:16:53.900459 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jul 15 05:16:53.900479 kernel: NET: Registered PF_INET6 protocol family Jul 15 05:16:53.900491 kernel: Segment Routing with IPv6 Jul 15 05:16:53.900510 kernel: In-situ OAM (IOAM) with IPv6 Jul 15 05:16:53.900522 kernel: NET: Registered PF_PACKET protocol family Jul 15 05:16:53.900535 kernel: Key type dns_resolver registered Jul 15 05:16:53.900547 kernel: IPI shorthand broadcast: enabled Jul 15 05:16:53.900559 kernel: sched_clock: Marking stable (3098006737, 
166826428)->(3273292076, -8458911) Jul 15 05:16:53.900571 kernel: registered taskstats version 1 Jul 15 05:16:53.900583 kernel: Loading compiled-in X.509 certificates Jul 15 05:16:53.900596 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: a24478b628e55368911ce1800a2bd6bc158938c7' Jul 15 05:16:53.900608 kernel: Demotion targets for Node 0: null Jul 15 05:16:53.900624 kernel: Key type .fscrypt registered Jul 15 05:16:53.900636 kernel: Key type fscrypt-provisioning registered Jul 15 05:16:53.900647 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 15 05:16:53.900659 kernel: ima: Allocated hash algorithm: sha1 Jul 15 05:16:53.900671 kernel: ima: No architecture policies found Jul 15 05:16:53.900683 kernel: clk: Disabling unused clocks Jul 15 05:16:53.900694 kernel: Warning: unable to open an initial console. Jul 15 05:16:53.900706 kernel: Freeing unused kernel image (initmem) memory: 54608K Jul 15 05:16:53.900723 kernel: Write protecting the kernel read-only data: 24576k Jul 15 05:16:53.900734 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jul 15 05:16:53.900747 kernel: Run /init as init process Jul 15 05:16:53.900759 kernel: with arguments: Jul 15 05:16:53.900772 kernel: /init Jul 15 05:16:53.900784 kernel: with environment: Jul 15 05:16:53.900796 kernel: HOME=/ Jul 15 05:16:53.900808 kernel: TERM=linux Jul 15 05:16:53.900820 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 15 05:16:53.900834 systemd[1]: Successfully made /usr/ read-only. Jul 15 05:16:53.900856 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 15 05:16:53.900868 systemd[1]: Detected virtualization kvm. 
Jul 15 05:16:53.900879 systemd[1]: Detected architecture x86-64. Jul 15 05:16:53.900890 systemd[1]: Running in initrd. Jul 15 05:16:53.900901 systemd[1]: No hostname configured, using default hostname. Jul 15 05:16:53.900914 systemd[1]: Hostname set to . Jul 15 05:16:53.900929 systemd[1]: Initializing machine ID from VM UUID. Jul 15 05:16:53.900942 systemd[1]: Queued start job for default target initrd.target. Jul 15 05:16:53.900954 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 05:16:53.900967 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 05:16:53.900981 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 15 05:16:53.900995 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 15 05:16:53.901008 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 15 05:16:53.901022 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 15 05:16:53.901042 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 15 05:16:53.901056 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 15 05:16:53.901068 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 15 05:16:53.901081 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 15 05:16:53.901094 systemd[1]: Reached target paths.target - Path Units. Jul 15 05:16:53.901107 systemd[1]: Reached target slices.target - Slice Units. Jul 15 05:16:53.901119 systemd[1]: Reached target swap.target - Swaps. Jul 15 05:16:53.901131 systemd[1]: Reached target timers.target - Timer Units. 
Jul 15 05:16:53.901148 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 15 05:16:53.901161 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 15 05:16:53.901173 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 15 05:16:53.901207 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 15 05:16:53.901220 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 15 05:16:53.901233 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 15 05:16:53.901274 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 05:16:53.901287 systemd[1]: Reached target sockets.target - Socket Units. Jul 15 05:16:53.901304 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 15 05:16:53.901316 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 15 05:16:53.901328 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 15 05:16:53.901344 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 15 05:16:53.901357 systemd[1]: Starting systemd-fsck-usr.service... Jul 15 05:16:53.901369 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 15 05:16:53.901381 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 15 05:16:53.901393 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:16:53.901404 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 15 05:16:53.901420 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 05:16:53.901432 systemd[1]: Finished systemd-fsck-usr.service. 
Jul 15 05:16:53.901445 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 15 05:16:53.901497 systemd-journald[216]: Collecting audit messages is disabled. Jul 15 05:16:53.901532 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 05:16:53.901544 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 15 05:16:53.901555 kernel: Bridge firewalling registered Jul 15 05:16:53.901568 systemd-journald[216]: Journal started Jul 15 05:16:53.901596 systemd-journald[216]: Runtime Journal (/run/log/journal/e6c29a0c4d3341c486cd19696f964d79) is 4.8M, max 38.6M, 33.7M free. Jul 15 05:16:53.856081 systemd-modules-load[217]: Inserted module 'overlay' Jul 15 05:16:53.922414 systemd[1]: Started systemd-journald.service - Journal Service. Jul 15 05:16:53.887385 systemd-modules-load[217]: Inserted module 'br_netfilter' Jul 15 05:16:53.923014 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 15 05:16:53.924049 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:16:53.927398 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 15 05:16:53.933372 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 15 05:16:53.940351 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 15 05:16:53.943367 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 15 05:16:53.953409 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 15 05:16:53.960107 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 15 05:16:53.963414 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Jul 15 05:16:53.967427 systemd-tmpfiles[233]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 15 05:16:53.970621 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 15 05:16:53.974075 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 05:16:53.977350 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 15 05:16:53.983219 dracut-cmdline[251]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb Jul 15 05:16:54.024748 systemd-resolved[257]: Positive Trust Anchors: Jul 15 05:16:54.025625 systemd-resolved[257]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 15 05:16:54.026451 systemd-resolved[257]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 15 05:16:54.032940 systemd-resolved[257]: Defaulting to hostname 'linux'. Jul 15 05:16:54.034844 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Jul 15 05:16:54.035561 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 15 05:16:54.069290 kernel: SCSI subsystem initialized Jul 15 05:16:54.080277 kernel: Loading iSCSI transport class v2.0-870. Jul 15 05:16:54.090273 kernel: iscsi: registered transport (tcp) Jul 15 05:16:54.108515 kernel: iscsi: registered transport (qla4xxx) Jul 15 05:16:54.108562 kernel: QLogic iSCSI HBA Driver Jul 15 05:16:54.126556 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 15 05:16:54.146165 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 15 05:16:54.148445 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 15 05:16:54.188478 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 15 05:16:54.190331 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jul 15 05:16:54.237270 kernel: raid6: avx2x4 gen() 33093 MB/s Jul 15 05:16:54.254263 kernel: raid6: avx2x2 gen() 32110 MB/s Jul 15 05:16:54.271394 kernel: raid6: avx2x1 gen() 22697 MB/s Jul 15 05:16:54.271442 kernel: raid6: using algorithm avx2x4 gen() 33093 MB/s Jul 15 05:16:54.289459 kernel: raid6: .... xor() 4854 MB/s, rmw enabled Jul 15 05:16:54.289507 kernel: raid6: using avx2x2 recovery algorithm Jul 15 05:16:54.308269 kernel: xor: automatically using best checksumming function avx Jul 15 05:16:54.434286 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 15 05:16:54.441594 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 15 05:16:54.443500 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 15 05:16:54.475642 systemd-udevd[463]: Using default interface naming scheme 'v255'. Jul 15 05:16:54.480740 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jul 15 05:16:54.483417 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 15 05:16:54.505571 dracut-pre-trigger[469]: rd.md=0: removing MD RAID activation Jul 15 05:16:54.532223 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 15 05:16:54.534517 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 15 05:16:54.604571 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 15 05:16:54.611131 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 15 05:16:54.727105 kernel: cryptd: max_cpu_qlen set to 1000 Jul 15 05:16:54.732284 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jul 15 05:16:54.732544 kernel: ACPI: bus type USB registered Jul 15 05:16:54.737316 kernel: usbcore: registered new interface driver usbfs Jul 15 05:16:54.739286 kernel: usbcore: registered new interface driver hub Jul 15 05:16:54.747278 kernel: usbcore: registered new device driver usb Jul 15 05:16:54.748017 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 05:16:54.749293 kernel: scsi host0: Virtio SCSI HBA Jul 15 05:16:54.749468 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:16:54.750786 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:16:54.753439 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:16:54.759338 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jul 15 05:16:54.768312 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Jul 15 05:16:54.771277 kernel: AES CTR mode by8 optimization enabled Jul 15 05:16:54.778303 kernel: libata version 3.00 loaded. 
Jul 15 05:16:54.826855 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 05:16:54.827270 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jul 15 05:16:54.827486 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jul 15 05:16:54.827696 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 05:16:54.827907 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jul 15 05:16:54.828114 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jul 15 05:16:54.829533 kernel: hub 1-0:1.0: USB hub found Jul 15 05:16:54.829843 kernel: hub 1-0:1.0: 4 ports detected Jul 15 05:16:54.830043 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jul 15 05:16:54.831901 kernel: sd 0:0:0:0: Power-on or device reset occurred Jul 15 05:16:54.832453 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jul 15 05:16:54.832676 kernel: sd 0:0:0:0: [sda] Write Protect is off Jul 15 05:16:54.832882 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 Jul 15 05:16:54.833083 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jul 15 05:16:54.834349 kernel: hub 2-0:1.0: USB hub found Jul 15 05:16:54.834564 kernel: hub 2-0:1.0: 4 ports detected Jul 15 05:16:54.834737 kernel: ahci 0000:00:1f.2: version 3.0 Jul 15 05:16:54.835310 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jul 15 05:16:54.837412 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jul 15 05:16:54.837709 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jul 15 05:16:54.837947 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jul 15 05:16:54.838819 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 15 05:16:54.838859 kernel: GPT:17805311 != 80003071 Jul 15 05:16:54.838874 kernel: GPT:Alternate GPT header not at the end of the disk. 
Jul 15 05:16:54.838888 kernel: GPT:17805311 != 80003071 Jul 15 05:16:54.838901 kernel: GPT: Use GNU Parted to correct GPT errors. Jul 15 05:16:54.838914 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 05:16:54.838928 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jul 15 05:16:54.844307 kernel: scsi host1: ahci Jul 15 05:16:54.850276 kernel: scsi host2: ahci Jul 15 05:16:54.852271 kernel: scsi host3: ahci Jul 15 05:16:54.853276 kernel: scsi host4: ahci Jul 15 05:16:54.854263 kernel: scsi host5: ahci Jul 15 05:16:54.854708 kernel: scsi host6: ahci Jul 15 05:16:54.854840 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 49 lpm-pol 0 Jul 15 05:16:54.854851 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 49 lpm-pol 0 Jul 15 05:16:54.854860 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 49 lpm-pol 0 Jul 15 05:16:54.854874 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 49 lpm-pol 0 Jul 15 05:16:54.854882 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 49 lpm-pol 0 Jul 15 05:16:54.854891 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 49 lpm-pol 0 Jul 15 05:16:54.919729 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jul 15 05:16:54.928937 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:16:54.948796 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jul 15 05:16:54.955731 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jul 15 05:16:54.956287 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Jul 15 05:16:54.965057 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
Jul 15 05:16:54.967089 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 15 05:16:55.001332 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 05:16:55.001382 disk-uuid[625]: Primary Header is updated. Jul 15 05:16:55.001382 disk-uuid[625]: Secondary Entries is updated. Jul 15 05:16:55.001382 disk-uuid[625]: Secondary Header is updated. Jul 15 05:16:55.067274 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jul 15 05:16:55.162774 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jul 15 05:16:55.162835 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jul 15 05:16:55.162846 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jul 15 05:16:55.162855 kernel: ata3: SATA link down (SStatus 0 SControl 300) Jul 15 05:16:55.165085 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jul 15 05:16:55.165115 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jul 15 05:16:55.166962 kernel: ata1.00: applying bridge limits Jul 15 05:16:55.167251 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jul 15 05:16:55.168258 kernel: ata1.00: configured for UDMA/100 Jul 15 05:16:55.169727 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jul 15 05:16:55.206692 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 15 05:16:55.206764 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Jul 15 05:16:55.207283 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 15 05:16:55.209310 kernel: usbcore: registered new interface driver usbhid Jul 15 05:16:55.209355 kernel: usbhid: USB HID core driver Jul 15 05:16:55.214675 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4 Jul 15 05:16:55.214704 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jul 15 05:16:55.219264 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 
Jul 15 05:16:55.517959 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 15 05:16:55.519199 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 15 05:16:55.520332 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 15 05:16:55.521742 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 15 05:16:55.524017 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 15 05:16:55.545323 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jul 15 05:16:56.024815 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 05:16:56.027658 disk-uuid[626]: The operation has completed successfully. Jul 15 05:16:56.113612 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 15 05:16:56.113734 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 15 05:16:56.137132 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 15 05:16:56.154876 sh[661]: Success Jul 15 05:16:56.174850 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 15 05:16:56.174890 kernel: device-mapper: uevent: version 1.0.3 Jul 15 05:16:56.175494 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 15 05:16:56.186269 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Jul 15 05:16:56.237043 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jul 15 05:16:56.241311 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 15 05:16:56.251926 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Jul 15 05:16:56.265746 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 15 05:16:56.265777 kernel: BTRFS: device fsid eb96c768-dac4-4ca9-ae1d-82815d4ce00b devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (673) Jul 15 05:16:56.268447 kernel: BTRFS info (device dm-0): first mount of filesystem eb96c768-dac4-4ca9-ae1d-82815d4ce00b Jul 15 05:16:56.268469 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:16:56.270516 kernel: BTRFS info (device dm-0): using free-space-tree Jul 15 05:16:56.279083 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 15 05:16:56.280093 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 15 05:16:56.281006 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 15 05:16:56.283357 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 15 05:16:56.284663 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 15 05:16:56.315265 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (707) Jul 15 05:16:56.315315 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:16:56.317540 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:16:56.319612 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 05:16:56.328339 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:16:56.329449 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 15 05:16:56.331000 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jul 15 05:16:56.400985 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jul 15 05:16:56.403519 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 15 05:16:56.441749 ignition[761]: Ignition 2.21.0 Jul 15 05:16:56.441764 ignition[761]: Stage: fetch-offline Jul 15 05:16:56.443552 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jul 15 05:16:56.441795 ignition[761]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:16:56.441804 ignition[761]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:16:56.441874 ignition[761]: parsed url from cmdline: "" Jul 15 05:16:56.441877 ignition[761]: no config URL provided Jul 15 05:16:56.441881 ignition[761]: reading system config file "/usr/lib/ignition/user.ign" Jul 15 05:16:56.441888 ignition[761]: no config at "/usr/lib/ignition/user.ign" Jul 15 05:16:56.441892 ignition[761]: failed to fetch config: resource requires networking Jul 15 05:16:56.442027 ignition[761]: Ignition finished successfully Jul 15 05:16:56.450053 systemd-networkd[842]: lo: Link UP Jul 15 05:16:56.450062 systemd-networkd[842]: lo: Gained carrier Jul 15 05:16:56.452458 systemd-networkd[842]: Enumeration completed Jul 15 05:16:56.452532 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 15 05:16:56.453171 systemd-networkd[842]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:16:56.453185 systemd-networkd[842]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 05:16:56.453529 systemd[1]: Reached target network.target - Network. Jul 15 05:16:56.454542 systemd-networkd[842]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 05:16:56.454547 systemd-networkd[842]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jul 15 05:16:56.455562 systemd-networkd[842]: eth0: Link UP
Jul 15 05:16:56.455565 systemd-networkd[842]: eth0: Gained carrier
Jul 15 05:16:56.455573 systemd-networkd[842]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:16:56.457344 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jul 15 05:16:56.459464 systemd-networkd[842]: eth1: Link UP
Jul 15 05:16:56.459467 systemd-networkd[842]: eth1: Gained carrier
Jul 15 05:16:56.459475 systemd-networkd[842]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:16:56.480293 ignition[851]: Ignition 2.21.0
Jul 15 05:16:56.480934 ignition[851]: Stage: fetch
Jul 15 05:16:56.481087 ignition[851]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:16:56.481097 ignition[851]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 05:16:56.481164 ignition[851]: parsed url from cmdline: ""
Jul 15 05:16:56.481168 ignition[851]: no config URL provided
Jul 15 05:16:56.481172 ignition[851]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 05:16:56.481192 ignition[851]: no config at "/usr/lib/ignition/user.ign"
Jul 15 05:16:56.487296 systemd-networkd[842]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Jul 15 05:16:56.481223 ignition[851]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Jul 15 05:16:56.482313 ignition[851]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Jul 15 05:16:56.525323 systemd-networkd[842]: eth0: DHCPv4 address 157.180.32.153/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jul 15 05:16:56.682404 ignition[851]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Jul 15 05:16:56.685042 ignition[851]: GET result: OK
Jul 15 05:16:56.685718 ignition[851]: parsing config with SHA512: 1d1199c85b763b6602d590bf47dbced269c3bfd448ff9684533da4f0d26d76455ef7297bb333a5766b67ca0c95c9a9480b125a387be2a6da7874fad7e0ef2d79
Jul 15 05:16:56.694653 unknown[851]: fetched base config from "system"
Jul 15 05:16:56.694677 unknown[851]: fetched base config from "system"
Jul 15 05:16:56.695147 ignition[851]: fetch: fetch complete
Jul 15 05:16:56.694687 unknown[851]: fetched user config from "hetzner"
Jul 15 05:16:56.695155 ignition[851]: fetch: fetch passed
Jul 15 05:16:56.695228 ignition[851]: Ignition finished successfully
Jul 15 05:16:56.700317 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jul 15 05:16:56.702665 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jul 15 05:16:56.734298 ignition[859]: Ignition 2.21.0
Jul 15 05:16:56.734312 ignition[859]: Stage: kargs
Jul 15 05:16:56.734468 ignition[859]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:16:56.734480 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 05:16:56.737721 ignition[859]: kargs: kargs passed
Jul 15 05:16:56.737795 ignition[859]: Ignition finished successfully
Jul 15 05:16:56.740665 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jul 15 05:16:56.743117 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jul 15 05:16:56.769145 ignition[866]: Ignition 2.21.0
Jul 15 05:16:56.769157 ignition[866]: Stage: disks
Jul 15 05:16:56.769311 ignition[866]: no configs at "/usr/lib/ignition/base.d"
Jul 15 05:16:56.769321 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 05:16:56.769957 ignition[866]: disks: disks passed
Jul 15 05:16:56.772693 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jul 15 05:16:56.770005 ignition[866]: Ignition finished successfully
Jul 15 05:16:56.774041 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jul 15 05:16:56.774782 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jul 15 05:16:56.776016 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 05:16:56.776962 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 05:16:56.778137 systemd[1]: Reached target basic.target - Basic System.
Jul 15 05:16:56.780155 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jul 15 05:16:56.800315 systemd-fsck[875]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Jul 15 05:16:56.802694 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jul 15 05:16:56.805319 systemd[1]: Mounting sysroot.mount - /sysroot...
Jul 15 05:16:56.917284 kernel: EXT4-fs (sda9): mounted filesystem 277c3938-5262-4ab1-8fa3-62fde82f8257 r/w with ordered data mode. Quota mode: none.
Jul 15 05:16:56.916885 systemd[1]: Mounted sysroot.mount - /sysroot.
Jul 15 05:16:56.917783 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jul 15 05:16:56.919852 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 05:16:56.922311 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jul 15 05:16:56.925384 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jul 15 05:16:56.926582 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jul 15 05:16:56.927534 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 05:16:56.937768 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jul 15 05:16:56.941347 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jul 15 05:16:56.955215 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (883)
Jul 15 05:16:56.955251 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:16:56.955263 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:16:56.955273 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 05:16:56.956162 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 05:16:56.982574 coreos-metadata[885]: Jul 15 05:16:56.982 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Jul 15 05:16:56.984945 coreos-metadata[885]: Jul 15 05:16:56.983 INFO Fetch successful
Jul 15 05:16:56.984945 coreos-metadata[885]: Jul 15 05:16:56.983 INFO wrote hostname ci-4396-0-0-n-153ccb2e88 to /sysroot/etc/hostname
Jul 15 05:16:56.986136 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 05:16:57.003415 initrd-setup-root[911]: cut: /sysroot/etc/passwd: No such file or directory
Jul 15 05:16:57.007690 initrd-setup-root[918]: cut: /sysroot/etc/group: No such file or directory
Jul 15 05:16:57.011680 initrd-setup-root[925]: cut: /sysroot/etc/shadow: No such file or directory
Jul 15 05:16:57.016046 initrd-setup-root[932]: cut: /sysroot/etc/gshadow: No such file or directory
Jul 15 05:16:57.107839 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jul 15 05:16:57.109642 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jul 15 05:16:57.111259 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jul 15 05:16:57.136278 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:16:57.149145 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jul 15 05:16:57.161474 ignition[1001]: INFO : Ignition 2.21.0
Jul 15 05:16:57.161474 ignition[1001]: INFO : Stage: mount
Jul 15 05:16:57.164281 ignition[1001]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:16:57.164281 ignition[1001]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 05:16:57.164281 ignition[1001]: INFO : mount: mount passed
Jul 15 05:16:57.164281 ignition[1001]: INFO : Ignition finished successfully
Jul 15 05:16:57.165120 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jul 15 05:16:57.167104 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jul 15 05:16:57.264926 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jul 15 05:16:57.267774 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 05:16:57.293268 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1012)
Jul 15 05:16:57.298532 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:16:57.298589 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:16:57.298601 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 05:16:57.306036 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 05:16:57.333314 ignition[1029]: INFO : Ignition 2.21.0
Jul 15 05:16:57.333314 ignition[1029]: INFO : Stage: files
Jul 15 05:16:57.334915 ignition[1029]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:16:57.334915 ignition[1029]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 05:16:57.334915 ignition[1029]: DEBUG : files: compiled without relabeling support, skipping
Jul 15 05:16:57.337426 ignition[1029]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jul 15 05:16:57.337426 ignition[1029]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jul 15 05:16:57.339228 ignition[1029]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jul 15 05:16:57.339228 ignition[1029]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jul 15 05:16:57.339228 ignition[1029]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jul 15 05:16:57.338138 unknown[1029]: wrote ssh authorized keys file for user: core
Jul 15 05:16:57.342518 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jul 15 05:16:57.342518 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jul 15 05:16:57.610609 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jul 15 05:16:58.089479 systemd-networkd[842]: eth1: Gained IPv6LL
Jul 15 05:16:58.473405 systemd-networkd[842]: eth0: Gained IPv6LL
Jul 15 05:16:59.537435 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 05:16:59.539096 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 05:16:59.546712 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 05:16:59.546712 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 05:16:59.546712 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:16:59.546712 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:16:59.546712 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:16:59.546712 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1
Jul 15 05:16:59.938608 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jul 15 05:17:00.114628 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw"
Jul 15 05:17:00.114628 ignition[1029]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jul 15 05:17:00.116877 ignition[1029]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Jul 15 05:17:00.118050 ignition[1029]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Jul 15 05:17:00.127825 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 05:17:00.127825 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 05:17:00.127825 ignition[1029]: INFO : files: files passed
Jul 15 05:17:00.127825 ignition[1029]: INFO : Ignition finished successfully
Jul 15 05:17:00.122097 systemd[1]: Finished ignition-files.service - Ignition (files).
Jul 15 05:17:00.126365 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jul 15 05:17:00.133807 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jul 15 05:17:00.136520 systemd[1]: ignition-quench.service: Deactivated successfully.
Jul 15 05:17:00.142347 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jul 15 05:17:00.150132 initrd-setup-root-after-ignition[1059]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:17:00.150132 initrd-setup-root-after-ignition[1059]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:17:00.152552 initrd-setup-root-after-ignition[1063]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 05:17:00.153137 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 05:17:00.154110 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jul 15 05:17:00.155721 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jul 15 05:17:00.198267 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jul 15 05:17:00.198398 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jul 15 05:17:00.199483 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jul 15 05:17:00.200371 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jul 15 05:17:00.201378 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jul 15 05:17:00.202043 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jul 15 05:17:00.240478 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 05:17:00.243108 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jul 15 05:17:00.261058 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:17:00.263124 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:17:00.264112 systemd[1]: Stopped target timers.target - Timer Units.
Jul 15 05:17:00.265424 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jul 15 05:17:00.265592 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 05:17:00.266986 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jul 15 05:17:00.267874 systemd[1]: Stopped target basic.target - Basic System.
Jul 15 05:17:00.269267 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jul 15 05:17:00.270391 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 05:17:00.271614 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jul 15 05:17:00.272854 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 05:17:00.274300 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jul 15 05:17:00.275459 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 05:17:00.276873 systemd[1]: Stopped target sysinit.target - System Initialization.
Jul 15 05:17:00.278205 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jul 15 05:17:00.279580 systemd[1]: Stopped target swap.target - Swaps.
Jul 15 05:17:00.280776 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jul 15 05:17:00.280942 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 05:17:00.282278 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:17:00.283074 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:17:00.284216 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jul 15 05:17:00.284415 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:17:00.285371 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jul 15 05:17:00.285459 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jul 15 05:17:00.287125 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jul 15 05:17:00.287297 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 05:17:00.288550 systemd[1]: ignition-files.service: Deactivated successfully.
Jul 15 05:17:00.288714 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jul 15 05:17:00.289763 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jul 15 05:17:00.289923 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 05:17:00.296329 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jul 15 05:17:00.299392 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jul 15 05:17:00.300305 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jul 15 05:17:00.300448 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:17:00.301348 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jul 15 05:17:00.301436 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 05:17:00.306204 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jul 15 05:17:00.308328 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jul 15 05:17:00.323270 ignition[1083]: INFO : Ignition 2.21.0
Jul 15 05:17:00.324262 ignition[1083]: INFO : Stage: umount
Jul 15 05:17:00.324262 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 05:17:00.324262 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 05:17:00.326523 ignition[1083]: INFO : umount: umount passed
Jul 15 05:17:00.326523 ignition[1083]: INFO : Ignition finished successfully
Jul 15 05:17:00.329153 systemd[1]: ignition-mount.service: Deactivated successfully.
Jul 15 05:17:00.329312 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jul 15 05:17:00.329911 systemd[1]: ignition-disks.service: Deactivated successfully.
Jul 15 05:17:00.329957 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jul 15 05:17:00.330700 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jul 15 05:17:00.330742 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jul 15 05:17:00.331165 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jul 15 05:17:00.331220 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jul 15 05:17:00.331719 systemd[1]: Stopped target network.target - Network.
Jul 15 05:17:00.334029 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jul 15 05:17:00.334075 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 05:17:00.334557 systemd[1]: Stopped target paths.target - Path Units.
Jul 15 05:17:00.339586 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jul 15 05:17:00.343596 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:17:00.352209 systemd[1]: Stopped target slices.target - Slice Units.
Jul 15 05:17:00.353202 systemd[1]: Stopped target sockets.target - Socket Units.
Jul 15 05:17:00.354338 systemd[1]: iscsid.socket: Deactivated successfully.
Jul 15 05:17:00.354376 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jul 15 05:17:00.355194 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jul 15 05:17:00.355253 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 15 05:17:00.356151 systemd[1]: ignition-setup.service: Deactivated successfully.
Jul 15 05:17:00.356220 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jul 15 05:17:00.357636 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jul 15 05:17:00.357678 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jul 15 05:17:00.361925 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jul 15 05:17:00.362439 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jul 15 05:17:00.364572 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jul 15 05:17:00.365091 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jul 15 05:17:00.365190 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jul 15 05:17:00.367719 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jul 15 05:17:00.367818 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jul 15 05:17:00.370957 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Jul 15 05:17:00.371689 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jul 15 05:17:00.371764 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jul 15 05:17:00.372878 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jul 15 05:17:00.372924 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:17:00.375119 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jul 15 05:17:00.375365 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jul 15 05:17:00.375471 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jul 15 05:17:00.377515 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Jul 15 05:17:00.377903 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jul 15 05:17:00.378924 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jul 15 05:17:00.378963 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:17:00.380663 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jul 15 05:17:00.382547 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jul 15 05:17:00.382605 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 05:17:00.385710 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jul 15 05:17:00.385768 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:17:00.387335 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jul 15 05:17:00.387399 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:17:00.388442 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:17:00.393423 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jul 15 05:17:00.403361 systemd[1]: network-cleanup.service: Deactivated successfully.
Jul 15 05:17:00.403473 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jul 15 05:17:00.404736 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jul 15 05:17:00.404889 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:17:00.405900 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jul 15 05:17:00.405959 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:17:00.406669 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jul 15 05:17:00.406704 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:17:00.407752 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jul 15 05:17:00.407797 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 05:17:00.409464 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jul 15 05:17:00.409507 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jul 15 05:17:00.410732 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jul 15 05:17:00.410778 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 05:17:00.412669 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jul 15 05:17:00.415756 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jul 15 05:17:00.415807 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:17:00.416987 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jul 15 05:17:00.417034 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:17:00.418549 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jul 15 05:17:00.418592 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 05:17:00.419734 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jul 15 05:17:00.419776 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:17:00.424372 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:17:00.424415 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:00.428512 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jul 15 05:17:00.428602 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jul 15 05:17:00.430095 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jul 15 05:17:00.432350 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jul 15 05:17:00.446895 systemd[1]: Switching root.
Jul 15 05:17:00.481109 systemd-journald[216]: Journal stopped
Jul 15 05:17:01.538848 systemd-journald[216]: Received SIGTERM from PID 1 (systemd).
Jul 15 05:17:01.538937 kernel: SELinux: policy capability network_peer_controls=1
Jul 15 05:17:01.538951 kernel: SELinux: policy capability open_perms=1
Jul 15 05:17:01.538964 kernel: SELinux: policy capability extended_socket_class=1
Jul 15 05:17:01.538974 kernel: SELinux: policy capability always_check_network=0
Jul 15 05:17:01.538984 kernel: SELinux: policy capability cgroup_seclabel=1
Jul 15 05:17:01.539001 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jul 15 05:17:01.539016 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jul 15 05:17:01.539041 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jul 15 05:17:01.539059 kernel: SELinux: policy capability userspace_initial_context=0
Jul 15 05:17:01.539069 kernel: audit: type=1403 audit(1752556620.615:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jul 15 05:17:01.539080 systemd[1]: Successfully loaded SELinux policy in 56.036ms.
Jul 15 05:17:01.539095 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.216ms.
Jul 15 05:17:01.539107 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 15 05:17:01.539117 systemd[1]: Detected virtualization kvm.
Jul 15 05:17:01.539128 systemd[1]: Detected architecture x86-64.
Jul 15 05:17:01.539138 systemd[1]: Detected first boot.
Jul 15 05:17:01.539148 systemd[1]: Hostname set to .
Jul 15 05:17:01.539159 systemd[1]: Initializing machine ID from VM UUID.
Jul 15 05:17:01.539169 kernel: Guest personality initialized and is inactive
Jul 15 05:17:01.539194 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jul 15 05:17:01.539204 kernel: Initialized host personality
Jul 15 05:17:01.539214 kernel: NET: Registered PF_VSOCK protocol family
Jul 15 05:17:01.539224 zram_generator::config[1129]: No configuration found.
Jul 15 05:17:01.541986 systemd[1]: Populated /etc with preset unit settings.
Jul 15 05:17:01.542019 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Jul 15 05:17:01.542033 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jul 15 05:17:01.542044 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jul 15 05:17:01.542059 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jul 15 05:17:01.542070 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jul 15 05:17:01.542081 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jul 15 05:17:01.542091 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jul 15 05:17:01.542101 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jul 15 05:17:01.542111 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jul 15 05:17:01.542121 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jul 15 05:17:01.542132 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jul 15 05:17:01.542142 systemd[1]: Created slice user.slice - User and Session Slice.
Jul 15 05:17:01.542155 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:17:01.542166 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:17:01.542196 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jul 15 05:17:01.542212 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jul 15 05:17:01.542224 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jul 15 05:17:01.542248 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 05:17:01.544882 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jul 15 05:17:01.544904 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:17:01.544920 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:17:01.544936 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jul 15 05:17:01.544952 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jul 15 05:17:01.544967 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jul 15 05:17:01.544989 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jul 15 05:17:01.545009 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:17:01.545028 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 05:17:01.545046 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 05:17:01.545064 systemd[1]: Reached target swap.target - Swaps.
Jul 15 05:17:01.545084 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jul 15 05:17:01.545102 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jul 15 05:17:01.545121 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jul 15 05:17:01.545140 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:17:01.545161 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:17:01.545189 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:17:01.545205 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jul 15 05:17:01.545221 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jul 15 05:17:01.545263 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jul 15 05:17:01.545278 systemd[1]: Mounting media.mount - External Media Directory...
Jul 15 05:17:01.545294 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:01.545310 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jul 15 05:17:01.545325 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jul 15 05:17:01.545344 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jul 15 05:17:01.545361 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jul 15 05:17:01.545377 systemd[1]: Reached target machines.target - Containers.
Jul 15 05:17:01.545392 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jul 15 05:17:01.545407 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:01.545423 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 05:17:01.545438 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jul 15 05:17:01.545453 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:17:01.545471 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 05:17:01.545487 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:17:01.545502 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jul 15 05:17:01.545518 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:17:01.545534 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jul 15 05:17:01.545550 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jul 15 05:17:01.545565 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jul 15 05:17:01.545580 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jul 15 05:17:01.545596 systemd[1]: Stopped systemd-fsck-usr.service.
Jul 15 05:17:01.545616 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:01.545631 kernel: fuse: init (API version 7.41)
Jul 15 05:17:01.545647 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 05:17:01.545663 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 05:17:01.545678 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 05:17:01.545694 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jul 15 05:17:01.545709 kernel: loop: module loaded
Jul 15 05:17:01.545726 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jul 15 05:17:01.545742 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 05:17:01.545760 systemd[1]: verity-setup.service: Deactivated successfully.
Jul 15 05:17:01.545779 systemd[1]: Stopped verity-setup.service.
Jul 15 05:17:01.545795 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:01.545810 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jul 15 05:17:01.545826 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jul 15 05:17:01.545841 systemd[1]: Mounted media.mount - External Media Directory.
Jul 15 05:17:01.545858 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jul 15 05:17:01.545875 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jul 15 05:17:01.545890 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jul 15 05:17:01.545906 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jul 15 05:17:01.545924 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:17:01.545943 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jul 15 05:17:01.545960 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jul 15 05:17:01.545976 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:17:01.545993 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:17:01.546009 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:17:01.546026 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:17:01.546045 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jul 15 05:17:01.546060 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jul 15 05:17:01.546074 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:17:01.546089 kernel: ACPI: bus type drm_connector registered
Jul 15 05:17:01.546104 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:17:01.546120 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 05:17:01.546136 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 05:17:01.546153 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:17:01.546189 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:17:01.546207 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jul 15 05:17:01.546227 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 05:17:01.547253 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jul 15 05:17:01.547332 systemd-journald[1205]: Collecting audit messages is disabled.
Jul 15 05:17:01.547367 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jul 15 05:17:01.547385 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jul 15 05:17:01.547403 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 05:17:01.547420 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jul 15 05:17:01.547443 systemd-journald[1205]: Journal started
Jul 15 05:17:01.547475 systemd-journald[1205]: Runtime Journal (/run/log/journal/e6c29a0c4d3341c486cd19696f964d79) is 4.8M, max 38.6M, 33.7M free.
Jul 15 05:17:01.190887 systemd[1]: Queued start job for default target multi-user.target.
Jul 15 05:17:01.211663 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jul 15 05:17:01.212161 systemd[1]: systemd-journald.service: Deactivated successfully.
Jul 15 05:17:01.555286 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jul 15 05:17:01.555345 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:01.564289 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jul 15 05:17:01.564455 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:17:01.573363 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jul 15 05:17:01.578268 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:17:01.587704 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 05:17:01.596274 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jul 15 05:17:01.600272 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jul 15 05:17:01.605263 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 05:17:01.609390 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jul 15 05:17:01.610024 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jul 15 05:17:01.610605 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jul 15 05:17:01.611615 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jul 15 05:17:01.644700 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jul 15 05:17:01.647904 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jul 15 05:17:01.651344 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jul 15 05:17:01.654318 kernel: loop0: detected capacity change from 0 to 221472
Jul 15 05:17:01.676717 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:17:01.683741 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:17:01.692875 systemd-tmpfiles[1235]: ACLs are not supported, ignoring.
Jul 15 05:17:01.692901 systemd-tmpfiles[1235]: ACLs are not supported, ignoring.
Jul 15 05:17:01.698191 systemd-journald[1205]: Time spent on flushing to /var/log/journal/e6c29a0c4d3341c486cd19696f964d79 is 23.769ms for 1168 entries.
Jul 15 05:17:01.698191 systemd-journald[1205]: System Journal (/var/log/journal/e6c29a0c4d3341c486cd19696f964d79) is 8M, max 584.8M, 576.8M free.
Jul 15 05:17:01.741288 systemd-journald[1205]: Received client request to flush runtime journal.
Jul 15 05:17:01.741324 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jul 15 05:17:01.741337 kernel: loop1: detected capacity change from 0 to 146488
Jul 15 05:17:01.702791 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jul 15 05:17:01.711270 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 05:17:01.716056 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jul 15 05:17:01.742654 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jul 15 05:17:01.774279 kernel: loop2: detected capacity change from 0 to 114000
Jul 15 05:17:01.781593 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jul 15 05:17:01.785396 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 05:17:01.816542 systemd-tmpfiles[1275]: ACLs are not supported, ignoring.
Jul 15 05:17:01.816889 systemd-tmpfiles[1275]: ACLs are not supported, ignoring.
Jul 15 05:17:01.819310 kernel: loop3: detected capacity change from 0 to 8
Jul 15 05:17:01.827814 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:17:01.843722 kernel: loop4: detected capacity change from 0 to 221472
Jul 15 05:17:01.865275 kernel: loop5: detected capacity change from 0 to 146488
Jul 15 05:17:01.896509 kernel: loop6: detected capacity change from 0 to 114000
Jul 15 05:17:01.918358 kernel: loop7: detected capacity change from 0 to 8
Jul 15 05:17:01.920347 (sd-merge)[1280]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Jul 15 05:17:01.921026 (sd-merge)[1280]: Merged extensions into '/usr'.
Jul 15 05:17:01.926057 systemd[1]: Reload requested from client PID 1234 ('systemd-sysext') (unit systemd-sysext.service)...
Jul 15 05:17:01.926133 systemd[1]: Reloading...
Jul 15 05:17:01.999272 zram_generator::config[1302]: No configuration found.
Jul 15 05:17:02.133305 ldconfig[1230]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jul 15 05:17:02.138901 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:17:02.212272 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jul 15 05:17:02.212866 systemd[1]: Reloading finished in 286 ms.
Jul 15 05:17:02.227324 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jul 15 05:17:02.228273 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jul 15 05:17:02.229093 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jul 15 05:17:02.240375 systemd[1]: Starting ensure-sysext.service...
Jul 15 05:17:02.243347 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 15 05:17:02.248635 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:17:02.259131 systemd[1]: Reload requested from client PID 1350 ('systemctl') (unit ensure-sysext.service)...
Jul 15 05:17:02.259147 systemd[1]: Reloading...
Jul 15 05:17:02.269309 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jul 15 05:17:02.269347 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jul 15 05:17:02.269607 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jul 15 05:17:02.269835 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jul 15 05:17:02.270646 systemd-tmpfiles[1351]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jul 15 05:17:02.270940 systemd-tmpfiles[1351]: ACLs are not supported, ignoring.
Jul 15 05:17:02.271068 systemd-tmpfiles[1351]: ACLs are not supported, ignoring.
Jul 15 05:17:02.275711 systemd-tmpfiles[1351]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 05:17:02.276095 systemd-tmpfiles[1351]: Skipping /boot
Jul 15 05:17:02.292268 systemd-tmpfiles[1351]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 05:17:02.292381 systemd-tmpfiles[1351]: Skipping /boot
Jul 15 05:17:02.312215 systemd-udevd[1352]: Using default interface naming scheme 'v255'.
Jul 15 05:17:02.335270 zram_generator::config[1387]: No configuration found.
Jul 15 05:17:02.475991 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:17:02.530276 kernel: mousedev: PS/2 mouse device common for all mice
Jul 15 05:17:02.564893 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jul 15 05:17:02.565556 systemd[1]: Reloading finished in 306 ms.
Jul 15 05:17:02.576268 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:17:02.578284 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:17:02.603087 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jul 15 05:17:02.607343 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5
Jul 15 05:17:02.605992 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jul 15 05:17:02.607549 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jul 15 05:17:02.611198 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 05:17:02.617489 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 05:17:02.621454 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jul 15 05:17:02.628535 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.629207 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:02.637409 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:17:02.640221 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:17:02.646605 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:17:02.647432 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:02.647516 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:02.652638 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jul 15 05:17:02.654343 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.659894 kernel: ACPI: button: Power Button [PWRF]
Jul 15 05:17:02.668031 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.668226 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:02.668390 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:02.668456 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:02.668521 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.672109 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.672334 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:02.680428 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 05:17:02.681596 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:02.681685 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:02.681783 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.684733 systemd[1]: Finished ensure-sysext.service.
Jul 15 05:17:02.697658 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jul 15 05:17:02.702032 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jul 15 05:17:02.712845 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jul 15 05:17:02.716384 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jul 15 05:17:02.740758 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:17:02.741312 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:17:02.742683 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:17:02.743924 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:17:02.744745 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:17:02.745724 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:17:02.747487 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jul 15 05:17:02.752197 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:17:02.752831 augenrules[1499]: No rules
Jul 15 05:17:02.753430 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:17:02.755961 systemd[1]: audit-rules.service: Deactivated successfully.
Jul 15 05:17:02.756597 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jul 15 05:17:02.758317 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jul 15 05:17:02.760409 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 05:17:02.760592 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 05:17:02.778162 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 15 05:17:02.780576 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Jul 15 05:17:02.780623 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.780717 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:17:02.786388 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:17:02.787767 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:17:02.789831 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:17:02.791356 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:17:02.791388 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:17:02.791409 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 15 05:17:02.791418 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:17:02.804910 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jul 15 05:17:02.822002 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:17:02.822685 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:17:02.823621 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:17:02.823835 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:17:02.824678 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:17:02.825292 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:17:02.827001 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:17:02.827079 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:17:02.857464 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Jul 15 05:17:02.860552 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jul 15 05:17:02.882521 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Jul 15 05:17:02.886276 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Jul 15 05:17:02.889634 kernel: Console: switching to colour dummy device 80x25
Jul 15 05:17:02.891032 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jul 15 05:17:02.891065 kernel: [drm] features: -context_init
Jul 15 05:17:02.893521 kernel: [drm] number of scanouts: 1
Jul 15 05:17:02.893547 kernel: [drm] number of cap sets: 0
Jul 15 05:17:02.896257 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Jul 15 05:17:02.916600 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Jul 15 05:17:02.916813 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jul 15 05:17:02.922201 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jul 15 05:17:02.933276 kernel: EDAC MC: Ver: 3.0.0
Jul 15 05:17:02.959479 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:17:03.032392 systemd-networkd[1464]: lo: Link UP
Jul 15 05:17:03.032405 systemd-networkd[1464]: lo: Gained carrier
Jul 15 05:17:03.044810 systemd-networkd[1464]: Enumeration completed
Jul 15 05:17:03.044905 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 05:17:03.047517 systemd-networkd[1464]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:03.047530 systemd-networkd[1464]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:17:03.048583 systemd-resolved[1465]: Positive Trust Anchors:
Jul 15 05:17:03.048719 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Jul 15 05:17:03.049271 systemd-resolved[1465]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 05:17:03.049354 systemd-resolved[1465]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 05:17:03.050468 systemd-networkd[1464]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:03.050481 systemd-networkd[1464]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:17:03.051445 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jul 15 05:17:03.052228 systemd-networkd[1464]: eth0: Link UP
Jul 15 05:17:03.053098 systemd-networkd[1464]: eth0: Gained carrier
Jul 15 05:17:03.053122 systemd-networkd[1464]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:03.062633 systemd-networkd[1464]: eth1: Link UP
Jul 15 05:17:03.066743 systemd-networkd[1464]: eth1: Gained carrier
Jul 15 05:17:03.066791 systemd-networkd[1464]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:17:03.068213 systemd-resolved[1465]: Using system hostname 'ci-4396-0-0-n-153ccb2e88'.
Jul 15 05:17:03.079187 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:17:03.079469 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:03.080860 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 05:17:03.081926 systemd[1]: Reached target network.target - Network.
Jul 15 05:17:03.082708 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:17:03.092319 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:17:03.095398 systemd-networkd[1464]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Jul 15 05:17:03.111403 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Jul 15 05:17:03.111626 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jul 15 05:17:03.112344 systemd[1]: Reached target time-set.target - System Time Set.
Jul 15 05:17:03.112837 systemd-networkd[1464]: eth0: DHCPv4 address 157.180.32.153/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jul 15 05:17:03.158393 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:17:03.158686 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 05:17:03.158855 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jul 15 05:17:03.158947 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jul 15 05:17:03.159017 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Jul 15 05:17:03.159281 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jul 15 05:17:03.159466 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jul 15 05:17:03.159541 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jul 15 05:17:03.159604 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jul 15 05:17:03.159631 systemd[1]: Reached target paths.target - Path Units.
Jul 15 05:17:03.159714 systemd[1]: Reached target timers.target - Timer Units.
Jul 15 05:17:03.161143 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jul 15 05:17:03.162901 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jul 15 05:17:03.165422 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jul 15 05:17:03.165686 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jul 15 05:17:03.165771 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jul 15 05:17:03.169338 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jul 15 05:17:03.169842 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jul 15 05:17:03.170614 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jul 15 05:17:03.171327 systemd[1]: Reached target sockets.target - Socket Units.
Jul 15 05:17:03.171398 systemd[1]: Reached target basic.target - Basic System.
Jul 15 05:17:03.171501 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jul 15 05:17:03.171531 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jul 15 05:17:03.172551 systemd[1]: Starting containerd.service - containerd container runtime...
Jul 15 05:17:03.174373 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jul 15 05:17:03.176498 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jul 15 05:17:03.177877 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jul 15 05:17:03.179444 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jul 15 05:17:03.181515 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jul 15 05:17:03.181591 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jul 15 05:17:03.186327 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
Jul 15 05:17:03.193714 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jul 15 05:17:03.196990 jq[1567]: false
Jul 15 05:17:03.197801 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jul 15 05:17:03.201589 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Jul 15 05:17:03.208149 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jul 15 05:17:03.208827 google_oslogin_nss_cache[1569]: oslogin_cache_refresh[1569]: Refreshing passwd entry cache
Jul 15 05:17:03.209036 oslogin_cache_refresh[1569]: Refreshing passwd entry cache
Jul 15 05:17:03.210306 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jul 15 05:17:03.211292 google_oslogin_nss_cache[1569]: oslogin_cache_refresh[1569]: Failure getting users, quitting
Jul 15 05:17:03.211284 oslogin_cache_refresh[1569]: Failure getting users, quitting
Jul 15 05:17:03.211354 google_oslogin_nss_cache[1569]: oslogin_cache_refresh[1569]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Jul 15 05:17:03.211305 oslogin_cache_refresh[1569]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
Jul 15 05:17:03.211401 google_oslogin_nss_cache[1569]: oslogin_cache_refresh[1569]: Refreshing group entry cache
Jul 15 05:17:03.211359 oslogin_cache_refresh[1569]: Refreshing group entry cache
Jul 15 05:17:03.216540 google_oslogin_nss_cache[1569]: oslogin_cache_refresh[1569]: Failure getting groups, quitting
Jul 15 05:17:03.216581 oslogin_cache_refresh[1569]: Failure getting groups, quitting
Jul 15 05:17:03.216631 google_oslogin_nss_cache[1569]: oslogin_cache_refresh[1569]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Jul 15 05:17:03.216657 oslogin_cache_refresh[1569]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
Jul 15 05:17:03.219533 systemd[1]: Starting systemd-logind.service - User Login Management...
Jul 15 05:17:03.220668 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jul 15 05:17:03.221925 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jul 15 05:17:03.223646 systemd[1]: Starting update-engine.service - Update Engine...
Jul 15 05:17:03.224974 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jul 15 05:17:03.243668 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jul 15 05:17:03.244618 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jul 15 05:17:03.245095 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jul 15 05:17:03.245430 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
Jul 15 05:17:03.246425 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
Jul 15 05:17:03.247613 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jul 15 05:17:03.248823 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jul 15 05:17:03.263712 extend-filesystems[1568]: Found /dev/sda6
Jul 15 05:17:03.274890 update_engine[1579]: I20250715 05:17:03.272934  1579 main.cc:92] Flatcar Update Engine starting
Jul 15 05:17:03.279264 extend-filesystems[1568]: Found /dev/sda9
Jul 15 05:17:03.281313 jq[1580]: true
Jul 15 05:17:03.295272 extend-filesystems[1568]: Checking size of /dev/sda9
Jul 15 05:17:03.292371 systemd-timesyncd[1482]: Contacted time server 128.127.67.142:123 (0.flatcar.pool.ntp.org).
Jul 15 05:17:03.292520 systemd-timesyncd[1482]: Initial clock synchronization to Tue 2025-07-15 05:17:03.515588 UTC.
Jul 15 05:17:03.297332 coreos-metadata[1564]: Jul 15 05:17:03.296 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Jul 15 05:17:03.297770 coreos-metadata[1564]: Jul 15 05:17:03.297 INFO Fetch successful
Jul 15 05:17:03.298258 coreos-metadata[1564]: Jul 15 05:17:03.297 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Jul 15 05:17:03.299516 coreos-metadata[1564]: Jul 15 05:17:03.298 INFO Fetch successful
Jul 15 05:17:03.309420 systemd[1]: motdgen.service: Deactivated successfully.
Jul 15 05:17:03.310486 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jul 15 05:17:03.317112 jq[1604]: true
Jul 15 05:17:03.322798 tar[1591]: linux-amd64/helm
Jul 15 05:17:03.325812 (ntainerd)[1608]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Jul 15 05:17:03.333031 extend-filesystems[1568]: Resized partition /dev/sda9
Jul 15 05:17:03.345936 extend-filesystems[1617]: resize2fs 1.47.2 (1-Jan-2025)
Jul 15 05:17:03.348557 dbus-daemon[1565]: [system] SELinux support is enabled
Jul 15 05:17:03.350413 systemd-logind[1577]: New seat seat0.
Jul 15 05:17:03.350429 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jul 15 05:17:03.357277 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Jul 15 05:17:03.365658 update_engine[1579]: I20250715 05:17:03.365597  1579 update_check_scheduler.cc:74] Next update check in 7m49s
Jul 15 05:17:03.373026 systemd-logind[1577]: Watching system buttons on /dev/input/event3 (Power Button)
Jul 15 05:17:03.373060 systemd-logind[1577]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Jul 15 05:17:03.374448 systemd[1]: Started systemd-logind.service - User Login Management.
Jul 15 05:17:03.374922 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jul 15 05:17:03.374957 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jul 15 05:17:03.375046 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jul 15 05:17:03.375063 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jul 15 05:17:03.378907 systemd[1]: Started update-engine.service - Update Engine.
Jul 15 05:17:03.393841 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jul 15 05:17:03.439760 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Jul 15 05:17:03.440161 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Jul 15 05:17:03.456458 bash[1640]: Updated "/home/core/.ssh/authorized_keys"
Jul 15 05:17:03.464574 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jul 15 05:17:03.467849 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Jul 15 05:17:03.471933 systemd[1]: Starting sshkeys.service...
Jul 15 05:17:03.480286 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Jul 15 05:17:03.494200 extend-filesystems[1617]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Jul 15 05:17:03.494200 extend-filesystems[1617]: old_desc_blocks = 1, new_desc_blocks = 5
Jul 15 05:17:03.494200 extend-filesystems[1617]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Jul 15 05:17:03.494602 extend-filesystems[1568]: Resized filesystem in /dev/sda9
Jul 15 05:17:03.494736 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jul 15 05:17:03.494962 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jul 15 05:17:03.516513 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Jul 15 05:17:03.519906 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Jul 15 05:17:03.592729 coreos-metadata[1651]: Jul 15 05:17:03.592 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Jul 15 05:17:03.598014 containerd[1608]: time="2025-07-15T05:17:03Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Jul 15 05:17:03.615811 coreos-metadata[1651]: Jul 15 05:17:03.615 INFO Fetch successful
Jul 15 05:17:03.618062 containerd[1608]: time="2025-07-15T05:17:03.618021611Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Jul 15 05:17:03.622821 unknown[1651]: wrote ssh authorized keys file for user: core
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668532684Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.615µs"
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668568471Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668587046Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668744832Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668758348Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668782383Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668851933Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.668863434Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.669103184Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.669115066Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.669124574Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Jul 15 05:17:03.669255 containerd[1608]: time="2025-07-15T05:17:03.669130936Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Jul 15 05:17:03.674849 containerd[1608]: time="2025-07-15T05:17:03.674803803Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Jul 15 05:17:03.675300 containerd[1608]: time="2025-07-15T05:17:03.675273624Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jul 15 05:17:03.675342 containerd[1608]: time="2025-07-15T05:17:03.675314821Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Jul 15 05:17:03.675342 containerd[1608]: time="2025-07-15T05:17:03.675326203Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Jul 15 05:17:03.675758 containerd[1608]: time="2025-07-15T05:17:03.675729879Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Jul 15 05:17:03.677916 containerd[1608]: time="2025-07-15T05:17:03.677885321Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Jul 15 05:17:03.677978 containerd[1608]: time="2025-07-15T05:17:03.677964991Z" level=info msg="metadata content store policy set" policy=shared
Jul 15 05:17:03.678939 update-ssh-keys[1657]: Updated "/home/core/.ssh/authorized_keys"
Jul 15 05:17:03.680317 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Jul 15 05:17:03.683568 systemd[1]: Finished sshkeys.service.
Jul 15 05:17:03.696733 containerd[1608]: time="2025-07-15T05:17:03.696685163Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Jul 15 05:17:03.696799 containerd[1608]: time="2025-07-15T05:17:03.696763440Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Jul 15 05:17:03.696799 containerd[1608]: time="2025-07-15T05:17:03.696780292Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Jul 15 05:17:03.696799 containerd[1608]: time="2025-07-15T05:17:03.696791152Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Jul 15 05:17:03.696852 containerd[1608]: time="2025-07-15T05:17:03.696802433Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Jul 15 05:17:03.696869 containerd[1608]: time="2025-07-15T05:17:03.696855633Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Jul 15 05:17:03.696869 containerd[1608]: time="2025-07-15T05:17:03.696866383Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Jul 15 05:17:03.696899 containerd[1608]: time="2025-07-15T05:17:03.696876963Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Jul 15 05:17:03.696928 containerd[1608]: time="2025-07-15T05:17:03.696898593Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Jul 15 05:17:03.696928 containerd[1608]: time="2025-07-15T05:17:03.696909163Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Jul 15 05:17:03.696928 containerd[1608]: time="2025-07-15T05:17:03.696917749Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Jul 15 05:17:03.696928 containerd[1608]: time="2025-07-15T05:17:03.696928990Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Jul 15 05:17:03.697079 containerd[1608]: time="2025-07-15T05:17:03.697052312Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Jul 15 05:17:03.697118 containerd[1608]: time="2025-07-15T05:17:03.697079232Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Jul 15 05:17:03.697118 containerd[1608]: time="2025-07-15T05:17:03.697092717Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Jul 15 05:17:03.697118 containerd[1608]: time="2025-07-15T05:17:03.697101914Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Jul 15 05:17:03.697118 containerd[1608]: time="2025-07-15T05:17:03.697110521Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Jul 15 05:17:03.697118 containerd[1608]: time="2025-07-15T05:17:03.697120098Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697130438Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697140938Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697151668Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697160595Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697180963Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697265812Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697280088Z" level=info msg="Start snapshots syncer"
Jul 15 05:17:03.697344 containerd[1608]: time="2025-07-15T05:17:03.697309864Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Jul 15 05:17:03.701265 containerd[1608]: time="2025-07-15T05:17:03.699353286Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Jul 15 05:17:03.701265 containerd[1608]: time="2025-07-15T05:17:03.699403851Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699462792Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699560304Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699578598Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699588126Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699596292Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699606040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699614816Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699628793Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699647928Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699657276Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699665521Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699693314Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699704344Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Jul 15 05:17:03.701397 containerd[1608]: time="2025-07-15T05:17:03.699711538Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699720154Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699727107Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699736294Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699747696Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699761722Z" level=info msg="runtime interface created"
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699766611Z" level=info msg="created NRI interface"
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699773464Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699782941Z" level=info msg="Connect containerd service"
Jul 15 05:17:03.701702 containerd[1608]: time="2025-07-15T05:17:03.699802559Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Jul 15 05:17:03.704620 containerd[1608]: time="2025-07-15T05:17:03.704585787Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jul 15 05:17:03.775566 locksmithd[1624]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jul 15 05:17:03.862859 containerd[1608]: time="2025-07-15T05:17:03.862757966Z" level=info msg="Start subscribing containerd event"
Jul 15 05:17:03.863141 containerd[1608]: time="2025-07-15T05:17:03.863104867Z" level=info msg="Start recovering state"
Jul 15 05:17:03.864057 containerd[1608]: time="2025-07-15T05:17:03.864037846Z" level=info msg="Start event monitor"
Jul 15 05:17:03.864507 containerd[1608]: time="2025-07-15T05:17:03.864490315Z" level=info msg="Start cni network conf syncer for default"
Jul 15 05:17:03.864643 containerd[1608]: time="2025-07-15T05:17:03.864627041Z" level=info msg="Start streaming server"
Jul 15 05:17:03.864873 containerd[1608]: time="2025-07-15T05:17:03.864857062Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Jul 15 05:17:03.865498 containerd[1608]: time="2025-07-15T05:17:03.865475803Z" level=info msg="runtime interface starting up..."
Jul 15 05:17:03.865750 containerd[1608]: time="2025-07-15T05:17:03.865735109Z" level=info msg="starting plugins..."
Jul 15 05:17:03.865890 containerd[1608]: time="2025-07-15T05:17:03.865462177Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Jul 15 05:17:03.866023 containerd[1608]: time="2025-07-15T05:17:03.865997280Z" level=info msg=serving... address=/run/containerd/containerd.sock
Jul 15 05:17:03.866324 containerd[1608]: time="2025-07-15T05:17:03.865879500Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Jul 15 05:17:03.866810 systemd[1]: Started containerd.service - containerd container runtime.
Jul 15 05:17:03.868329 containerd[1608]: time="2025-07-15T05:17:03.867996068Z" level=info msg="containerd successfully booted in 0.270603s"
Jul 15 05:17:03.953577 tar[1591]: linux-amd64/LICENSE
Jul 15 05:17:03.953810 tar[1591]: linux-amd64/README.md
Jul 15 05:17:03.983494 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jul 15 05:17:04.153547 sshd_keygen[1595]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Jul 15 05:17:04.186913 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Jul 15 05:17:04.190671 systemd[1]: Starting issuegen.service - Generate /run/issue...
Jul 15 05:17:04.195662 systemd[1]: Started sshd@0-157.180.32.153:22-139.178.89.65:53370.service - OpenSSH per-connection server daemon (139.178.89.65:53370).
Jul 15 05:17:04.210495 systemd[1]: issuegen.service: Deactivated successfully.
Jul 15 05:17:04.210887 systemd[1]: Finished issuegen.service - Generate /run/issue.
Jul 15 05:17:04.213497 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Jul 15 05:17:04.231107 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jul 15 05:17:04.234492 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jul 15 05:17:04.236958 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Jul 15 05:17:04.237548 systemd[1]: Reached target getty.target - Login Prompts.
Jul 15 05:17:04.297555 systemd-networkd[1464]: eth1: Gained IPv6LL
Jul 15 05:17:04.299145 systemd-networkd[1464]: eth0: Gained IPv6LL
Jul 15 05:17:04.301638 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Jul 15 05:17:04.302254 systemd[1]: Reached target network-online.target - Network is Online.
Jul 15 05:17:04.305212 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:17:04.308485 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Jul 15 05:17:04.337237 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Jul 15 05:17:05.213453 sshd[1692]: Accepted publickey for core from 139.178.89.65 port 53370 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:05.215555 sshd-session[1692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:05.232917 systemd-logind[1577]: New session 1 of user core.
Jul 15 05:17:05.234896 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Jul 15 05:17:05.237894 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Jul 15 05:17:05.260816 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Jul 15 05:17:05.270485 systemd[1]: Starting user@500.service - User Manager for UID 500...
Jul 15 05:17:05.273371 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:17:05.274870 systemd[1]: Reached target multi-user.target - Multi-User System.
Jul 15 05:17:05.279043 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jul 15 05:17:05.280792 (systemd)[1722]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Jul 15 05:17:05.284034 systemd-logind[1577]: New session c1 of user core.
Jul 15 05:17:05.422291 systemd[1722]: Queued start job for default target default.target.
Jul 15 05:17:05.429997 systemd[1722]: Created slice app.slice - User Application Slice.
Jul 15 05:17:05.430027 systemd[1722]: Reached target paths.target - Paths.
Jul 15 05:17:05.430074 systemd[1722]: Reached target timers.target - Timers.
Jul 15 05:17:05.432114 systemd[1722]: Starting dbus.socket - D-Bus User Message Bus Socket...
Jul 15 05:17:05.451381 systemd[1722]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Jul 15 05:17:05.451461 systemd[1722]: Reached target sockets.target - Sockets.
Jul 15 05:17:05.451513 systemd[1722]: Reached target basic.target - Basic System.
Jul 15 05:17:05.451567 systemd[1722]: Reached target default.target - Main User Target.
Jul 15 05:17:05.451603 systemd[1722]: Startup finished in 160ms.
Jul 15 05:17:05.451888 systemd[1]: Started user@500.service - User Manager for UID 500.
Jul 15 05:17:05.457419 systemd[1]: Started session-1.scope - Session 1 of User core.
Jul 15 05:17:05.457966 systemd[1]: Startup finished in 3.178s (kernel) + 6.970s (initrd) + 4.896s (userspace) = 15.045s.
Jul 15 05:17:05.827113 kubelet[1723]: E0715 05:17:05.827017 1723 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jul 15 05:17:05.831098 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jul 15 05:17:05.831418 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jul 15 05:17:05.831943 systemd[1]: kubelet.service: Consumed 876ms CPU time, 263.7M memory peak.
Jul 15 05:17:06.176745 systemd[1]: Started sshd@1-157.180.32.153:22-139.178.89.65:53372.service - OpenSSH per-connection server daemon (139.178.89.65:53372).
Jul 15 05:17:07.195396 sshd[1745]: Accepted publickey for core from 139.178.89.65 port 53372 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:07.197344 sshd-session[1745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:07.203418 systemd-logind[1577]: New session 2 of user core.
Jul 15 05:17:07.212458 systemd[1]: Started session-2.scope - Session 2 of User core.
Jul 15 05:17:07.888712 sshd[1748]: Connection closed by 139.178.89.65 port 53372
Jul 15 05:17:07.889437 sshd-session[1745]: pam_unix(sshd:session): session closed for user core
Jul 15 05:17:07.893841 systemd[1]: sshd@1-157.180.32.153:22-139.178.89.65:53372.service: Deactivated successfully.
Jul 15 05:17:07.895879 systemd[1]: session-2.scope: Deactivated successfully.
Jul 15 05:17:07.896752 systemd-logind[1577]: Session 2 logged out. Waiting for processes to exit.
Jul 15 05:17:07.898490 systemd-logind[1577]: Removed session 2.
Jul 15 05:17:08.059846 systemd[1]: Started sshd@2-157.180.32.153:22-139.178.89.65:53376.service - OpenSSH per-connection server daemon (139.178.89.65:53376).
Jul 15 05:17:09.055899 sshd[1754]: Accepted publickey for core from 139.178.89.65 port 53376 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:09.057805 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:09.062893 systemd-logind[1577]: New session 3 of user core.
Jul 15 05:17:09.069433 systemd[1]: Started session-3.scope - Session 3 of User core.
Jul 15 05:17:09.737721 sshd[1757]: Connection closed by 139.178.89.65 port 53376
Jul 15 05:17:09.738794 sshd-session[1754]: pam_unix(sshd:session): session closed for user core
Jul 15 05:17:09.745846 systemd-logind[1577]: Session 3 logged out. Waiting for processes to exit.
Jul 15 05:17:09.747464 systemd[1]: sshd@2-157.180.32.153:22-139.178.89.65:53376.service: Deactivated successfully.
Jul 15 05:17:09.751084 systemd[1]: session-3.scope: Deactivated successfully.
Jul 15 05:17:09.753760 systemd-logind[1577]: Removed session 3.
Jul 15 05:17:09.916607 systemd[1]: Started sshd@3-157.180.32.153:22-139.178.89.65:40928.service - OpenSSH per-connection server daemon (139.178.89.65:40928).
Jul 15 05:17:10.942209 sshd[1763]: Accepted publickey for core from 139.178.89.65 port 40928 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:10.944011 sshd-session[1763]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:10.950311 systemd-logind[1577]: New session 4 of user core.
Jul 15 05:17:10.957395 systemd[1]: Started session-4.scope - Session 4 of User core.
Jul 15 05:17:11.631532 sshd[1766]: Connection closed by 139.178.89.65 port 40928
Jul 15 05:17:11.632601 sshd-session[1763]: pam_unix(sshd:session): session closed for user core
Jul 15 05:17:11.639838 systemd-logind[1577]: Session 4 logged out. Waiting for processes to exit.
Jul 15 05:17:11.641613 systemd[1]: sshd@3-157.180.32.153:22-139.178.89.65:40928.service: Deactivated successfully.
Jul 15 05:17:11.646174 systemd[1]: session-4.scope: Deactivated successfully.
Jul 15 05:17:11.649569 systemd-logind[1577]: Removed session 4.
Jul 15 05:17:11.801960 systemd[1]: Started sshd@4-157.180.32.153:22-139.178.89.65:40934.service - OpenSSH per-connection server daemon (139.178.89.65:40934).
Jul 15 05:17:12.806815 sshd[1772]: Accepted publickey for core from 139.178.89.65 port 40934 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:12.808390 sshd-session[1772]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:12.814050 systemd-logind[1577]: New session 5 of user core.
Jul 15 05:17:12.820375 systemd[1]: Started session-5.scope - Session 5 of User core.
Jul 15 05:17:13.346145 sudo[1776]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Jul 15 05:17:13.346707 sudo[1776]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jul 15 05:17:13.365166 sudo[1776]: pam_unix(sudo:session): session closed for user root
Jul 15 05:17:13.524918 sshd[1775]: Connection closed by 139.178.89.65 port 40934
Jul 15 05:17:13.525818 sshd-session[1772]: pam_unix(sshd:session): session closed for user core
Jul 15 05:17:13.530532 systemd-logind[1577]: Session 5 logged out. Waiting for processes to exit.
Jul 15 05:17:13.531337 systemd[1]: sshd@4-157.180.32.153:22-139.178.89.65:40934.service: Deactivated successfully.
Jul 15 05:17:13.533325 systemd[1]: session-5.scope: Deactivated successfully.
Jul 15 05:17:13.534914 systemd-logind[1577]: Removed session 5.
Jul 15 05:17:13.695010 systemd[1]: Started sshd@5-157.180.32.153:22-139.178.89.65:40950.service - OpenSSH per-connection server daemon (139.178.89.65:40950).
Jul 15 05:17:14.696058 sshd[1782]: Accepted publickey for core from 139.178.89.65 port 40950 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:14.698719 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:14.706549 systemd-logind[1577]: New session 6 of user core.
Jul 15 05:17:14.715455 systemd[1]: Started session-6.scope - Session 6 of User core.
Jul 15 05:17:15.224923 sudo[1787]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Jul 15 05:17:15.225530 sudo[1787]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jul 15 05:17:15.234209 sudo[1787]: pam_unix(sudo:session): session closed for user root
Jul 15 05:17:15.245188 sudo[1786]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Jul 15 05:17:15.245845 sudo[1786]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jul 15 05:17:15.263423 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jul 15 05:17:15.325132 augenrules[1809]: No rules
Jul 15 05:17:15.327104 systemd[1]: audit-rules.service: Deactivated successfully.
Jul 15 05:17:15.327495 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jul 15 05:17:15.329318 sudo[1786]: pam_unix(sudo:session): session closed for user root
Jul 15 05:17:15.488965 sshd[1785]: Connection closed by 139.178.89.65 port 40950
Jul 15 05:17:15.490112 sshd-session[1782]: pam_unix(sshd:session): session closed for user core
Jul 15 05:17:15.494219 systemd[1]: sshd@5-157.180.32.153:22-139.178.89.65:40950.service: Deactivated successfully.
Jul 15 05:17:15.496148 systemd[1]: session-6.scope: Deactivated successfully.
Jul 15 05:17:15.497346 systemd-logind[1577]: Session 6 logged out. Waiting for processes to exit.
Jul 15 05:17:15.499068 systemd-logind[1577]: Removed session 6.
Jul 15 05:17:15.666785 systemd[1]: Started sshd@6-157.180.32.153:22-139.178.89.65:40964.service - OpenSSH per-connection server daemon (139.178.89.65:40964).
Jul 15 05:17:15.950651 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Jul 15 05:17:15.952661 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:17:16.120561 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:17:16.126510 (kubelet)[1829]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jul 15 05:17:16.161383 kubelet[1829]: E0715 05:17:16.161323 1829 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jul 15 05:17:16.166692 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jul 15 05:17:16.166862 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jul 15 05:17:16.167385 systemd[1]: kubelet.service: Consumed 179ms CPU time, 110.3M memory peak.
Jul 15 05:17:16.683764 sshd[1818]: Accepted publickey for core from 139.178.89.65 port 40964 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:17:16.685372 sshd-session[1818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:17:16.690957 systemd-logind[1577]: New session 7 of user core.
Jul 15 05:17:16.701412 systemd[1]: Started session-7.scope - Session 7 of User core.
Jul 15 05:17:17.206717 sudo[1837]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Jul 15 05:17:17.207056 sudo[1837]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Jul 15 05:17:17.495237 systemd[1]: Starting docker.service - Docker Application Container Engine...
Jul 15 05:17:17.504547 (dockerd)[1856]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Jul 15 05:17:17.692349 dockerd[1856]: time="2025-07-15T05:17:17.692289108Z" level=info msg="Starting up"
Jul 15 05:17:17.693672 dockerd[1856]: time="2025-07-15T05:17:17.693539042Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Jul 15 05:17:17.705589 dockerd[1856]: time="2025-07-15T05:17:17.705538213Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Jul 15 05:17:17.744753 dockerd[1856]: time="2025-07-15T05:17:17.744553247Z" level=info msg="Loading containers: start."
Jul 15 05:17:17.754274 kernel: Initializing XFRM netlink socket
Jul 15 05:17:17.995625 systemd-networkd[1464]: docker0: Link UP
Jul 15 05:17:18.000132 dockerd[1856]: time="2025-07-15T05:17:18.000078860Z" level=info msg="Loading containers: done."
Jul 15 05:17:18.015890 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2798564291-merged.mount: Deactivated successfully.
Jul 15 05:17:18.017327 dockerd[1856]: time="2025-07-15T05:17:18.017226545Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Jul 15 05:17:18.017412 dockerd[1856]: time="2025-07-15T05:17:18.017343665Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Jul 15 05:17:18.017450 dockerd[1856]: time="2025-07-15T05:17:18.017440883Z" level=info msg="Initializing buildkit"
Jul 15 05:17:18.047568 dockerd[1856]: time="2025-07-15T05:17:18.047495653Z" level=info msg="Completed buildkit initialization"
Jul 15 05:17:18.054848 dockerd[1856]: time="2025-07-15T05:17:18.054809155Z" level=info msg="Daemon has completed initialization"
Jul 15 05:17:18.055017 systemd[1]: Started docker.service - Docker Application Container Engine.
Jul 15 05:17:18.055466 dockerd[1856]: time="2025-07-15T05:17:18.055414572Z" level=info msg="API listen on /run/docker.sock"
Jul 15 05:17:19.180500 containerd[1608]: time="2025-07-15T05:17:19.180438741Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\""
Jul 15 05:17:19.859895 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2036396653.mount: Deactivated successfully.
Jul 15 05:17:20.928729 containerd[1608]: time="2025-07-15T05:17:20.928666388Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:20.930192 containerd[1608]: time="2025-07-15T05:17:20.929872727Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.10: active requests=0, bytes read=28077838"
Jul 15 05:17:20.930948 containerd[1608]: time="2025-07-15T05:17:20.930919965Z" level=info msg="ImageCreate event name:\"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:20.933272 containerd[1608]: time="2025-07-15T05:17:20.933207557Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:20.933901 containerd[1608]: time="2025-07-15T05:17:20.933869661Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.10\" with image id \"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\", size \"28074544\" in 1.753389942s"
Jul 15 05:17:20.933946 containerd[1608]: time="2025-07-15T05:17:20.933903918Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\" returns image reference \"sha256:74c5154ea84d9a53c406e6c00e53cf66145cce821fd80e3c74e2e1bf312f3977\""
Jul 15 05:17:20.934461 containerd[1608]: time="2025-07-15T05:17:20.934419767Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\""
Jul 15 05:17:22.269351 containerd[1608]: time="2025-07-15T05:17:22.269291036Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:22.270334 containerd[1608]: time="2025-07-15T05:17:22.270065584Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.10: active requests=0, bytes read=24713316"
Jul 15 05:17:22.271020 containerd[1608]: time="2025-07-15T05:17:22.270979857Z" level=info msg="ImageCreate event name:\"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:22.273106 containerd[1608]: time="2025-07-15T05:17:22.273070873Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:22.273943 containerd[1608]: time="2025-07-15T05:17:22.273912396Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.10\" with image id \"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\", size \"26315128\" in 1.33946649s"
Jul 15 05:17:22.274036 containerd[1608]: time="2025-07-15T05:17:22.274019446Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\" returns image reference \"sha256:c285c4e62c91c434e9928bee7063b361509f43f43faa31641b626d6eff97616d\""
Jul 15 05:17:22.274827 containerd[1608]: time="2025-07-15T05:17:22.274768613Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\""
Jul 15 05:17:23.317720 containerd[1608]: time="2025-07-15T05:17:23.317645391Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:23.318919 containerd[1608]: time="2025-07-15T05:17:23.318867341Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.10: active requests=0, bytes read=18783693"
Jul 15 05:17:23.319749 containerd[1608]: time="2025-07-15T05:17:23.319695285Z" level=info msg="ImageCreate event name:\"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:23.322663 containerd[1608]: time="2025-07-15T05:17:23.322637785Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:23.323501 containerd[1608]: time="2025-07-15T05:17:23.323347737Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.10\" with image id \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\", size \"20385523\" in 1.048544956s"
Jul 15 05:17:23.323501 containerd[1608]: time="2025-07-15T05:17:23.323386325Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\" returns image reference \"sha256:61daeb7d112d9547792027cb16242b1d131f357f511545477381457fff5a69e2\""
Jul 15 05:17:23.323893 containerd[1608]: time="2025-07-15T05:17:23.323841048Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\""
Jul 15 05:17:24.397199 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2925930168.mount: Deactivated successfully.
Jul 15 05:17:24.735580 containerd[1608]: time="2025-07-15T05:17:24.735509119Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:24.736543 containerd[1608]: time="2025-07-15T05:17:24.736480521Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.10: active requests=0, bytes read=30383971"
Jul 15 05:17:24.737323 containerd[1608]: time="2025-07-15T05:17:24.737273514Z" level=info msg="ImageCreate event name:\"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:24.738747 containerd[1608]: time="2025-07-15T05:17:24.738703931Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:24.739189 containerd[1608]: time="2025-07-15T05:17:24.739037428Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.10\" with image id \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\", repo tag \"registry.k8s.io/kube-proxy:v1.31.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\", size \"30382962\" in 1.415170537s"
Jul 15 05:17:24.739189 containerd[1608]: time="2025-07-15T05:17:24.739062955Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\" returns image reference \"sha256:3ed600862d3e69931e0f9f4dbf5c2b46343af40aa079772434f13de771bdc30c\""
Jul 15 05:17:24.739851 containerd[1608]: time="2025-07-15T05:17:24.739806170Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Jul 15 05:17:25.240109 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3147502570.mount: Deactivated successfully.
Jul 15 05:17:25.948589 containerd[1608]: time="2025-07-15T05:17:25.948501659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:25.949610 containerd[1608]: time="2025-07-15T05:17:25.949555772Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335"
Jul 15 05:17:25.950798 containerd[1608]: time="2025-07-15T05:17:25.950758343Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:25.953639 containerd[1608]: time="2025-07-15T05:17:25.953600661Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:25.955059 containerd[1608]: time="2025-07-15T05:17:25.954700508Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.214757058s"
Jul 15 05:17:25.955059 containerd[1608]: time="2025-07-15T05:17:25.954729651Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
Jul 15 05:17:25.955287 containerd[1608]: time="2025-07-15T05:17:25.955266552Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Jul 15 05:17:26.200470 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Jul 15 05:17:26.202183 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:17:26.403446 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:17:26.421774 (kubelet)[2195]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Jul 15 05:17:26.448357 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1774256727.mount: Deactivated successfully.
Jul 15 05:17:26.454293 containerd[1608]: time="2025-07-15T05:17:26.454028738Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jul 15 05:17:26.455776 containerd[1608]: time="2025-07-15T05:17:26.455738374Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Jul 15 05:17:26.456438 containerd[1608]: time="2025-07-15T05:17:26.456400834Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jul 15 05:17:26.459952 containerd[1608]: time="2025-07-15T05:17:26.459284533Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jul 15 05:17:26.459952 containerd[1608]: time="2025-07-15T05:17:26.459590317Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 504.223449ms"
Jul 15 05:17:26.459952 containerd[1608]: time="2025-07-15T05:17:26.459611980Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
Jul 15 05:17:26.460511 containerd[1608]: time="2025-07-15T05:17:26.460442362Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Jul 15 05:17:26.469841 kubelet[2195]: E0715 05:17:26.469781 2195 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Jul 15 05:17:26.474530 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Jul 15 05:17:26.474765 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Jul 15 05:17:26.475421 systemd[1]: kubelet.service: Consumed 188ms CPU time, 108.9M memory peak.
Jul 15 05:17:26.967591 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1208749174.mount: Deactivated successfully.
Jul 15 05:17:28.362884 containerd[1608]: time="2025-07-15T05:17:28.362811620Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:28.363856 containerd[1608]: time="2025-07-15T05:17:28.363811795Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780083"
Jul 15 05:17:28.364625 containerd[1608]: time="2025-07-15T05:17:28.364562000Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:28.367667 containerd[1608]: time="2025-07-15T05:17:28.366876453Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:28.367782 containerd[1608]: time="2025-07-15T05:17:28.367633387Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 1.906967008s"
Jul 15 05:17:28.367870 containerd[1608]: time="2025-07-15T05:17:28.367850860Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Jul 15 05:17:30.956991 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:17:30.957146 systemd[1]: kubelet.service: Consumed 188ms CPU time, 108.9M memory peak.
Jul 15 05:17:30.959346 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:17:30.986305 systemd[1]: Reload requested from client PID 2287 ('systemctl') (unit session-7.scope)...
Jul 15 05:17:30.986318 systemd[1]: Reloading...
Jul 15 05:17:31.108303 zram_generator::config[2331]: No configuration found.
Jul 15 05:17:31.207092 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:17:31.326116 systemd[1]: Reloading finished in 339 ms.
Jul 15 05:17:31.391982 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Jul 15 05:17:31.392333 systemd[1]: kubelet.service: Failed with result 'signal'.
Jul 15 05:17:31.392922 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:17:31.392981 systemd[1]: kubelet.service: Consumed 121ms CPU time, 98.2M memory peak.
Jul 15 05:17:31.395503 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 05:17:31.564344 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 05:17:31.571647 (kubelet)[2385]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jul 15 05:17:31.620400 kubelet[2385]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 05:17:31.620400 kubelet[2385]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Jul 15 05:17:31.620400 kubelet[2385]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 05:17:31.620400 kubelet[2385]: I0715 05:17:31.620176 2385 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jul 15 05:17:31.952455 kubelet[2385]: I0715 05:17:31.952399 2385 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Jul 15 05:17:31.952455 kubelet[2385]: I0715 05:17:31.952426 2385 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jul 15 05:17:31.952717 kubelet[2385]: I0715 05:17:31.952630 2385 server.go:934] "Client rotation is on, will bootstrap in background"
Jul 15 05:17:31.974874 kubelet[2385]: I0715 05:17:31.974314 2385 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 15 05:17:31.983212 kubelet[2385]: E0715 05:17:31.981222 2385 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://157.180.32.153:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.32.153:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:17:31.994232 kubelet[2385]: I0715 05:17:31.994133 2385 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jul 15 05:17:32.001190 kubelet[2385]: I0715 05:17:32.001116 2385 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jul 15 05:17:32.002774 kubelet[2385]: I0715 05:17:32.002706 2385 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jul 15 05:17:32.002931 kubelet[2385]: I0715 05:17:32.002878 2385 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jul 15 05:17:32.003114 kubelet[2385]: I0715 05:17:32.002902 2385 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396-0-0-n-153ccb2e88","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jul 15 05:17:32.003114 kubelet[2385]: I0715 05:17:32.003089 2385 topology_manager.go:138] "Creating topology manager with none policy"
Jul 15 05:17:32.003114 kubelet[2385]: I0715 05:17:32.003098 2385 container_manager_linux.go:300] "Creating device plugin manager"
Jul 15 05:17:32.003480 kubelet[2385]: I0715 05:17:32.003204 2385 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 05:17:32.005571 kubelet[2385]: I0715 05:17:32.005525 2385 kubelet.go:408] "Attempting to sync node with API server"
Jul 15 05:17:32.005571 kubelet[2385]: I0715 05:17:32.005545 2385 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jul 15 05:17:32.005571 kubelet[2385]: I0715 05:17:32.005577 2385 kubelet.go:314] "Adding apiserver pod source"
Jul 15 05:17:32.006775 kubelet[2385]: I0715 05:17:32.005600 2385 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jul 15 05:17:32.012764 kubelet[2385]: W0715 05:17:32.012695 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://157.180.32.153:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-153ccb2e88&limit=500&resourceVersion=0": dial tcp 157.180.32.153:6443: connect: connection refused
Jul 15 05:17:32.012936 kubelet[2385]: E0715 05:17:32.012903 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://157.180.32.153:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-153ccb2e88&limit=500&resourceVersion=0\": dial tcp 157.180.32.153:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:17:32.013685 kubelet[2385]: W0715 05:17:32.013622 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://157.180.32.153:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 157.180.32.153:6443: connect: connection refused
Jul 15 05:17:32.013839 kubelet[2385]: E0715 05:17:32.013808 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://157.180.32.153:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.32.153:6443: connect: connection refused" logger="UnhandledError"
Jul 15 05:17:32.014062 kubelet[2385]: I0715 05:17:32.014036 2385 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Jul 15 05:17:32.018617 kubelet[2385]: I0715 05:17:32.018569 2385 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jul 15 05:17:32.019552 kubelet[2385]: W0715 05:17:32.019500 2385 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Jul 15 05:17:32.020530 kubelet[2385]: I0715 05:17:32.020474 2385 server.go:1274] "Started kubelet"
Jul 15 05:17:32.022260 kubelet[2385]: I0715 05:17:32.021377 2385 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jul 15 05:17:32.022669 kubelet[2385]: I0715 05:17:32.022651 2385 server.go:449] "Adding debug handlers to kubelet server"
Jul 15 05:17:32.025872 kubelet[2385]: I0715 05:17:32.025439 2385 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 15 05:17:32.025931 kubelet[2385]: I0715 05:17:32.025916 2385 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 15 05:17:32.027811 kubelet[2385]: I0715 05:17:32.027791 2385 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 15 05:17:32.028221 kubelet[2385]: E0715 05:17:32.026286 2385 event.go:368] "Unable to write event (may retry after sleeping)" err="Post
\"https://157.180.32.153:6443/api/v1/namespaces/default/events\": dial tcp 157.180.32.153:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4396-0-0-n-153ccb2e88.185254fd546d2850 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4396-0-0-n-153ccb2e88,UID:ci-4396-0-0-n-153ccb2e88,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4396-0-0-n-153ccb2e88,},FirstTimestamp:2025-07-15 05:17:32.020443216 +0000 UTC m=+0.444210861,LastTimestamp:2025-07-15 05:17:32.020443216 +0000 UTC m=+0.444210861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396-0-0-n-153ccb2e88,}" Jul 15 05:17:32.029044 kubelet[2385]: I0715 05:17:32.028985 2385 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 05:17:32.036538 kubelet[2385]: E0715 05:17:32.035935 2385 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-153ccb2e88\" not found" Jul 15 05:17:32.036538 kubelet[2385]: I0715 05:17:32.035995 2385 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 15 05:17:32.039670 kubelet[2385]: I0715 05:17:32.038924 2385 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 15 05:17:32.039670 kubelet[2385]: I0715 05:17:32.039001 2385 reconciler.go:26] "Reconciler: start to sync state" Jul 15 05:17:32.039819 kubelet[2385]: I0715 05:17:32.039786 2385 factory.go:221] Registration of the systemd container factory successfully Jul 15 05:17:32.040564 kubelet[2385]: I0715 05:17:32.039869 2385 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or 
directory Jul 15 05:17:32.043786 kubelet[2385]: E0715 05:17:32.043152 2385 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.32.153:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-153ccb2e88?timeout=10s\": dial tcp 157.180.32.153:6443: connect: connection refused" interval="200ms" Jul 15 05:17:32.043786 kubelet[2385]: W0715 05:17:32.043349 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://157.180.32.153:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 157.180.32.153:6443: connect: connection refused Jul 15 05:17:32.043786 kubelet[2385]: E0715 05:17:32.043393 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://157.180.32.153:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.32.153:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:17:32.044756 kubelet[2385]: I0715 05:17:32.044718 2385 factory.go:221] Registration of the containerd container factory successfully Jul 15 05:17:32.065758 kubelet[2385]: I0715 05:17:32.065704 2385 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 15 05:17:32.066512 kubelet[2385]: I0715 05:17:32.066495 2385 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 15 05:17:32.066588 kubelet[2385]: I0715 05:17:32.066578 2385 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 15 05:17:32.066637 kubelet[2385]: I0715 05:17:32.066629 2385 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:17:32.066894 kubelet[2385]: I0715 05:17:32.066858 2385 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jul 15 05:17:32.066894 kubelet[2385]: I0715 05:17:32.066884 2385 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 15 05:17:32.066956 kubelet[2385]: I0715 05:17:32.066901 2385 kubelet.go:2321] "Starting kubelet main sync loop" Jul 15 05:17:32.066956 kubelet[2385]: E0715 05:17:32.066933 2385 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 05:17:32.069481 kubelet[2385]: I0715 05:17:32.069457 2385 policy_none.go:49] "None policy: Start" Jul 15 05:17:32.071289 kubelet[2385]: W0715 05:17:32.071265 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://157.180.32.153:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 157.180.32.153:6443: connect: connection refused Jul 15 05:17:32.071397 kubelet[2385]: E0715 05:17:32.071377 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://157.180.32.153:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.32.153:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:17:32.072129 kubelet[2385]: I0715 05:17:32.071914 2385 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 15 05:17:32.072129 kubelet[2385]: I0715 05:17:32.071938 2385 state_mem.go:35] "Initializing new in-memory state store" Jul 15 05:17:32.077441 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 15 05:17:32.089357 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 15 05:17:32.094962 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jul 15 05:17:32.104728 kubelet[2385]: I0715 05:17:32.103613 2385 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 15 05:17:32.104728 kubelet[2385]: I0715 05:17:32.103794 2385 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 05:17:32.104728 kubelet[2385]: I0715 05:17:32.103805 2385 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 05:17:32.104728 kubelet[2385]: I0715 05:17:32.104315 2385 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 05:17:32.106950 kubelet[2385]: E0715 05:17:32.106916 2385 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4396-0-0-n-153ccb2e88\" not found" Jul 15 05:17:32.180491 systemd[1]: Created slice kubepods-burstable-pod4438e3662bee56f52349e1daec9c5cd7.slice - libcontainer container kubepods-burstable-pod4438e3662bee56f52349e1daec9c5cd7.slice. Jul 15 05:17:32.196645 systemd[1]: Created slice kubepods-burstable-podddfb02216aa4036044d1e1f4c586e97e.slice - libcontainer container kubepods-burstable-podddfb02216aa4036044d1e1f4c586e97e.slice. Jul 15 05:17:32.207832 kubelet[2385]: I0715 05:17:32.206639 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.207832 kubelet[2385]: E0715 05:17:32.206948 2385 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://157.180.32.153:6443/api/v1/nodes\": dial tcp 157.180.32.153:6443: connect: connection refused" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.214673 systemd[1]: Created slice kubepods-burstable-pod66d69814c7e4b38ec2fc1ce48979fde3.slice - libcontainer container kubepods-burstable-pod66d69814c7e4b38ec2fc1ce48979fde3.slice. 
Jul 15 05:17:32.244686 kubelet[2385]: E0715 05:17:32.244608 2385 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.32.153:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-153ccb2e88?timeout=10s\": dial tcp 157.180.32.153:6443: connect: connection refused" interval="400ms" Jul 15 05:17:32.341465 kubelet[2385]: I0715 05:17:32.341356 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-ca-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341465 kubelet[2385]: I0715 05:17:32.341441 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-kubeconfig\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341465 kubelet[2385]: I0715 05:17:32.341468 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341758 kubelet[2385]: I0715 05:17:32.341494 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/66d69814c7e4b38ec2fc1ce48979fde3-kubeconfig\") pod \"kube-scheduler-ci-4396-0-0-n-153ccb2e88\" (UID: 
\"66d69814c7e4b38ec2fc1ce48979fde3\") " pod="kube-system/kube-scheduler-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341758 kubelet[2385]: I0715 05:17:32.341515 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4438e3662bee56f52349e1daec9c5cd7-ca-certs\") pod \"kube-apiserver-ci-4396-0-0-n-153ccb2e88\" (UID: \"4438e3662bee56f52349e1daec9c5cd7\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341758 kubelet[2385]: I0715 05:17:32.341534 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4438e3662bee56f52349e1daec9c5cd7-k8s-certs\") pod \"kube-apiserver-ci-4396-0-0-n-153ccb2e88\" (UID: \"4438e3662bee56f52349e1daec9c5cd7\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341758 kubelet[2385]: I0715 05:17:32.341557 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4438e3662bee56f52349e1daec9c5cd7-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396-0-0-n-153ccb2e88\" (UID: \"4438e3662bee56f52349e1daec9c5cd7\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341758 kubelet[2385]: I0715 05:17:32.341581 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-flexvolume-dir\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.341963 kubelet[2385]: I0715 05:17:32.341603 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-k8s-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.410101 kubelet[2385]: I0715 05:17:32.410037 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.410641 kubelet[2385]: E0715 05:17:32.410604 2385 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://157.180.32.153:6443/api/v1/nodes\": dial tcp 157.180.32.153:6443: connect: connection refused" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.496065 containerd[1608]: time="2025-07-15T05:17:32.495292431Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396-0-0-n-153ccb2e88,Uid:4438e3662bee56f52349e1daec9c5cd7,Namespace:kube-system,Attempt:0,}" Jul 15 05:17:32.513073 containerd[1608]: time="2025-07-15T05:17:32.512740215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396-0-0-n-153ccb2e88,Uid:ddfb02216aa4036044d1e1f4c586e97e,Namespace:kube-system,Attempt:0,}" Jul 15 05:17:32.518716 containerd[1608]: time="2025-07-15T05:17:32.518634621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396-0-0-n-153ccb2e88,Uid:66d69814c7e4b38ec2fc1ce48979fde3,Namespace:kube-system,Attempt:0,}" Jul 15 05:17:32.625541 containerd[1608]: time="2025-07-15T05:17:32.625488800Z" level=info msg="connecting to shim 70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030" address="unix:///run/containerd/s/b6fcea72191e540e2fd32b6df2b0d1d98ceb474a8d960fae0c4601afa26a24ca" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:17:32.626567 containerd[1608]: time="2025-07-15T05:17:32.626047036Z" level=info msg="connecting to shim be5c7bc1074ae21919189114ce42fc3cddfa7c05710e062cc047d7934ceedc2c" 
address="unix:///run/containerd/s/45e472d832597b6af86e17a64f98c0d0238478d0bd8b2fdb75c4a2e0e4dde5eb" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:17:32.628381 containerd[1608]: time="2025-07-15T05:17:32.628341429Z" level=info msg="connecting to shim d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18" address="unix:///run/containerd/s/6a7a2c535117b684db63279c3de52cd2f7c3fb20cc7d49822837980c435f27ff" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:17:32.645744 kubelet[2385]: E0715 05:17:32.645525 2385 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.32.153:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-153ccb2e88?timeout=10s\": dial tcp 157.180.32.153:6443: connect: connection refused" interval="800ms" Jul 15 05:17:32.704402 systemd[1]: Started cri-containerd-70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030.scope - libcontainer container 70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030. Jul 15 05:17:32.709617 systemd[1]: Started cri-containerd-be5c7bc1074ae21919189114ce42fc3cddfa7c05710e062cc047d7934ceedc2c.scope - libcontainer container be5c7bc1074ae21919189114ce42fc3cddfa7c05710e062cc047d7934ceedc2c. Jul 15 05:17:32.711707 systemd[1]: Started cri-containerd-d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18.scope - libcontainer container d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18. 
Jul 15 05:17:32.777607 containerd[1608]: time="2025-07-15T05:17:32.777499903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396-0-0-n-153ccb2e88,Uid:ddfb02216aa4036044d1e1f4c586e97e,Namespace:kube-system,Attempt:0,} returns sandbox id \"70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030\"" Jul 15 05:17:32.782727 containerd[1608]: time="2025-07-15T05:17:32.782699327Z" level=info msg="CreateContainer within sandbox \"70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 15 05:17:32.799079 containerd[1608]: time="2025-07-15T05:17:32.799045895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396-0-0-n-153ccb2e88,Uid:4438e3662bee56f52349e1daec9c5cd7,Namespace:kube-system,Attempt:0,} returns sandbox id \"be5c7bc1074ae21919189114ce42fc3cddfa7c05710e062cc047d7934ceedc2c\"" Jul 15 05:17:32.802468 containerd[1608]: time="2025-07-15T05:17:32.802444821Z" level=info msg="CreateContainer within sandbox \"be5c7bc1074ae21919189114ce42fc3cddfa7c05710e062cc047d7934ceedc2c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 15 05:17:32.807018 containerd[1608]: time="2025-07-15T05:17:32.806981412Z" level=info msg="Container 59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:17:32.810574 containerd[1608]: time="2025-07-15T05:17:32.810497645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396-0-0-n-153ccb2e88,Uid:66d69814c7e4b38ec2fc1ce48979fde3,Namespace:kube-system,Attempt:0,} returns sandbox id \"d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18\"" Jul 15 05:17:32.814768 kubelet[2385]: I0715 05:17:32.814154 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.815257 kubelet[2385]: E0715 05:17:32.815213 2385 
kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://157.180.32.153:6443/api/v1/nodes\": dial tcp 157.180.32.153:6443: connect: connection refused" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:32.817357 containerd[1608]: time="2025-07-15T05:17:32.816767564Z" level=info msg="CreateContainer within sandbox \"d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 15 05:17:32.817357 containerd[1608]: time="2025-07-15T05:17:32.816837561Z" level=info msg="Container ae07f8c5848926174eea72e6fc969ea9c45d799873af4611025f6174f0b1a201: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:17:32.820820 containerd[1608]: time="2025-07-15T05:17:32.820780527Z" level=info msg="CreateContainer within sandbox \"70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc\"" Jul 15 05:17:32.821306 containerd[1608]: time="2025-07-15T05:17:32.821229576Z" level=info msg="StartContainer for \"59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc\"" Jul 15 05:17:32.822366 containerd[1608]: time="2025-07-15T05:17:32.822323632Z" level=info msg="connecting to shim 59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc" address="unix:///run/containerd/s/b6fcea72191e540e2fd32b6df2b0d1d98ceb474a8d960fae0c4601afa26a24ca" protocol=ttrpc version=3 Jul 15 05:17:32.830369 containerd[1608]: time="2025-07-15T05:17:32.830314218Z" level=info msg="CreateContainer within sandbox \"be5c7bc1074ae21919189114ce42fc3cddfa7c05710e062cc047d7934ceedc2c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ae07f8c5848926174eea72e6fc969ea9c45d799873af4611025f6174f0b1a201\"" Jul 15 05:17:32.831386 containerd[1608]: time="2025-07-15T05:17:32.831347961Z" level=info msg="StartContainer for 
\"ae07f8c5848926174eea72e6fc969ea9c45d799873af4611025f6174f0b1a201\"" Jul 15 05:17:32.832739 containerd[1608]: time="2025-07-15T05:17:32.832701195Z" level=info msg="connecting to shim ae07f8c5848926174eea72e6fc969ea9c45d799873af4611025f6174f0b1a201" address="unix:///run/containerd/s/45e472d832597b6af86e17a64f98c0d0238478d0bd8b2fdb75c4a2e0e4dde5eb" protocol=ttrpc version=3 Jul 15 05:17:32.835941 containerd[1608]: time="2025-07-15T05:17:32.835448295Z" level=info msg="Container cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:17:32.850487 containerd[1608]: time="2025-07-15T05:17:32.850450063Z" level=info msg="CreateContainer within sandbox \"d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366\"" Jul 15 05:17:32.852085 containerd[1608]: time="2025-07-15T05:17:32.852030954Z" level=info msg="StartContainer for \"cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366\"" Jul 15 05:17:32.853000 containerd[1608]: time="2025-07-15T05:17:32.852965245Z" level=info msg="connecting to shim cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366" address="unix:///run/containerd/s/6a7a2c535117b684db63279c3de52cd2f7c3fb20cc7d49822837980c435f27ff" protocol=ttrpc version=3 Jul 15 05:17:32.854545 systemd[1]: Started cri-containerd-59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc.scope - libcontainer container 59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc. Jul 15 05:17:32.872376 systemd[1]: Started cri-containerd-ae07f8c5848926174eea72e6fc969ea9c45d799873af4611025f6174f0b1a201.scope - libcontainer container ae07f8c5848926174eea72e6fc969ea9c45d799873af4611025f6174f0b1a201. 
Jul 15 05:17:32.880551 systemd[1]: Started cri-containerd-cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366.scope - libcontainer container cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366. Jul 15 05:17:32.938905 containerd[1608]: time="2025-07-15T05:17:32.938828921Z" level=info msg="StartContainer for \"59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc\" returns successfully" Jul 15 05:17:32.947963 containerd[1608]: time="2025-07-15T05:17:32.947900970Z" level=info msg="StartContainer for \"ae07f8c5848926174eea72e6fc969ea9c45d799873af4611025f6174f0b1a201\" returns successfully" Jul 15 05:17:32.988267 containerd[1608]: time="2025-07-15T05:17:32.986939750Z" level=info msg="StartContainer for \"cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366\" returns successfully" Jul 15 05:17:32.993925 kubelet[2385]: W0715 05:17:32.993856 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://157.180.32.153:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 157.180.32.153:6443: connect: connection refused Jul 15 05:17:32.993925 kubelet[2385]: E0715 05:17:32.993924 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://157.180.32.153:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.32.153:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:17:33.076827 kubelet[2385]: W0715 05:17:33.076709 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://157.180.32.153:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 157.180.32.153:6443: connect: connection refused Jul 15 05:17:33.076827 kubelet[2385]: E0715 05:17:33.076752 2385 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://157.180.32.153:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.32.153:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:17:33.617647 kubelet[2385]: I0715 05:17:33.617607 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:34.740926 kubelet[2385]: E0715 05:17:34.740830 2385 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4396-0-0-n-153ccb2e88\" not found" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:34.801421 kubelet[2385]: I0715 05:17:34.801370 2385 kubelet_node_status.go:75] "Successfully registered node" node="ci-4396-0-0-n-153ccb2e88" Jul 15 05:17:35.015916 kubelet[2385]: I0715 05:17:35.015600 2385 apiserver.go:52] "Watching apiserver" Jul 15 05:17:35.040056 kubelet[2385]: I0715 05:17:35.040000 2385 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 15 05:17:37.028181 systemd[1]: Reload requested from client PID 2657 ('systemctl') (unit session-7.scope)... Jul 15 05:17:37.028316 systemd[1]: Reloading... Jul 15 05:17:37.134267 zram_generator::config[2704]: No configuration found. Jul 15 05:17:37.215425 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:17:37.330728 systemd[1]: Reloading finished in 302 ms. Jul 15 05:17:37.363028 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:17:37.383732 systemd[1]: kubelet.service: Deactivated successfully. Jul 15 05:17:37.383963 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 05:17:37.384031 systemd[1]: kubelet.service: Consumed 840ms CPU time, 126.2M memory peak. Jul 15 05:17:37.386635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:17:37.562475 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:17:37.571464 (kubelet)[2752]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 05:17:37.618208 kubelet[2752]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:17:37.618208 kubelet[2752]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jul 15 05:17:37.618208 kubelet[2752]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:17:37.619969 kubelet[2752]: I0715 05:17:37.618484 2752 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 05:17:37.625980 kubelet[2752]: I0715 05:17:37.625653 2752 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 15 05:17:37.625980 kubelet[2752]: I0715 05:17:37.625671 2752 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 05:17:37.625980 kubelet[2752]: I0715 05:17:37.625900 2752 server.go:934] "Client rotation is on, will bootstrap in background" Jul 15 05:17:37.627318 kubelet[2752]: I0715 05:17:37.627297 2752 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jul 15 05:17:37.629340 kubelet[2752]: I0715 05:17:37.628839 2752 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 15 05:17:37.633540 kubelet[2752]: I0715 05:17:37.633513 2752 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jul 15 05:17:37.637962 kubelet[2752]: I0715 05:17:37.637938 2752 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jul 15 05:17:37.638087 kubelet[2752]: I0715 05:17:37.638072 2752 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jul 15 05:17:37.638261 kubelet[2752]: I0715 05:17:37.638211 2752 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jul 15 05:17:37.638536 kubelet[2752]: I0715 05:17:37.638262 2752 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396-0-0-n-153ccb2e88","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jul 15 05:17:37.638621 kubelet[2752]: I0715 05:17:37.638547 2752 topology_manager.go:138] "Creating topology manager with none policy"
Jul 15 05:17:37.638621 kubelet[2752]: I0715 05:17:37.638556 2752 container_manager_linux.go:300] "Creating device plugin manager"
Jul 15 05:17:37.638621 kubelet[2752]: I0715 05:17:37.638603 2752 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 05:17:37.638712 kubelet[2752]: I0715 05:17:37.638697 2752 kubelet.go:408] "Attempting to sync node with API server"
Jul 15 05:17:37.638738 kubelet[2752]: I0715 05:17:37.638732 2752 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jul 15 05:17:37.639229 kubelet[2752]: I0715 05:17:37.638760 2752 kubelet.go:314] "Adding apiserver pod source"
Jul 15 05:17:37.639229 kubelet[2752]: I0715 05:17:37.638772 2752 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jul 15 05:17:37.642268 kubelet[2752]: I0715 05:17:37.641719 2752 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Jul 15 05:17:37.642268 kubelet[2752]: I0715 05:17:37.642002 2752 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jul 15 05:17:37.643413 kubelet[2752]: I0715 05:17:37.643394 2752 server.go:1274] "Started kubelet"
Jul 15 05:17:37.647145 kubelet[2752]: I0715 05:17:37.647001 2752 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 15 05:17:37.654528 kubelet[2752]: I0715 05:17:37.654392 2752 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jul 15 05:17:37.655521 kubelet[2752]: I0715 05:17:37.655504 2752 server.go:449] "Adding debug handlers to kubelet server"
Jul 15 05:17:37.656987 kubelet[2752]: I0715 05:17:37.656894 2752 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 15 05:17:37.657189 kubelet[2752]: I0715 05:17:37.657175 2752 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 15 05:17:37.657462 kubelet[2752]: I0715 05:17:37.657445 2752 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jul 15 05:17:37.659668 kubelet[2752]: I0715 05:17:37.659612 2752 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jul 15 05:17:37.659929 kubelet[2752]: E0715 05:17:37.659895 2752 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-153ccb2e88\" not found"
Jul 15 05:17:37.662424 kubelet[2752]: I0715 05:17:37.662370 2752 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Jul 15 05:17:37.662719 kubelet[2752]: I0715 05:17:37.662586 2752 reconciler.go:26] "Reconciler: start to sync state"
Jul 15 05:17:37.664521 kubelet[2752]: I0715 05:17:37.664498 2752 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Jul 15 05:17:37.665348 kubelet[2752]: I0715 05:17:37.665326 2752 factory.go:221] Registration of the systemd container factory successfully
Jul 15 05:17:37.665432 kubelet[2752]: I0715 05:17:37.665411 2752 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jul 15 05:17:37.666030 kubelet[2752]: I0715 05:17:37.665787 2752 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Jul 15 05:17:37.666030 kubelet[2752]: I0715 05:17:37.665810 2752 status_manager.go:217] "Starting to sync pod status with apiserver"
Jul 15 05:17:37.666030 kubelet[2752]: I0715 05:17:37.665823 2752 kubelet.go:2321] "Starting kubelet main sync loop"
Jul 15 05:17:37.666030 kubelet[2752]: E0715 05:17:37.665853 2752 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jul 15 05:17:37.669380 kubelet[2752]: I0715 05:17:37.669360 2752 factory.go:221] Registration of the containerd container factory successfully
Jul 15 05:17:37.719831 kubelet[2752]: I0715 05:17:37.719792 2752 cpu_manager.go:214] "Starting CPU manager" policy="none"
Jul 15 05:17:37.719831 kubelet[2752]: I0715 05:17:37.719810 2752 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Jul 15 05:17:37.719831 kubelet[2752]: I0715 05:17:37.719827 2752 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 05:17:37.719990 kubelet[2752]: I0715 05:17:37.719956 2752 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Jul 15 05:17:37.719990 kubelet[2752]: I0715 05:17:37.719966 2752 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Jul 15 05:17:37.719990 kubelet[2752]: I0715 05:17:37.719981 2752 policy_none.go:49] "None policy: Start"
Jul 15 05:17:37.720750 kubelet[2752]: I0715 05:17:37.720727 2752 memory_manager.go:170] "Starting memorymanager" policy="None"
Jul 15 05:17:37.720750 kubelet[2752]: I0715 05:17:37.720748 2752 state_mem.go:35] "Initializing new in-memory state store"
Jul 15 05:17:37.720879 kubelet[2752]: I0715 05:17:37.720859 2752 state_mem.go:75] "Updated machine memory state"
Jul 15 05:17:37.727038 kubelet[2752]: I0715 05:17:37.727021 2752 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Jul 15 05:17:37.727663 kubelet[2752]: I0715 05:17:37.727255 2752 eviction_manager.go:189] "Eviction manager: starting control loop"
Jul 15 05:17:37.727663 kubelet[2752]: I0715 05:17:37.727279 2752 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jul 15 05:17:37.727663 kubelet[2752]: I0715 05:17:37.727436 2752 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jul 15 05:17:37.831288 kubelet[2752]: I0715 05:17:37.831223 2752 kubelet_node_status.go:72] "Attempting to register node" node="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.839567 kubelet[2752]: I0715 05:17:37.839518 2752 kubelet_node_status.go:111] "Node was previously registered" node="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.839824 kubelet[2752]: I0715 05:17:37.839808 2752 kubelet_node_status.go:75] "Successfully registered node" node="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964592 kubelet[2752]: I0715 05:17:37.964552 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4438e3662bee56f52349e1daec9c5cd7-ca-certs\") pod \"kube-apiserver-ci-4396-0-0-n-153ccb2e88\" (UID: \"4438e3662bee56f52349e1daec9c5cd7\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964592 kubelet[2752]: I0715 05:17:37.964587 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4438e3662bee56f52349e1daec9c5cd7-k8s-certs\") pod \"kube-apiserver-ci-4396-0-0-n-153ccb2e88\" (UID: \"4438e3662bee56f52349e1daec9c5cd7\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964751 kubelet[2752]: I0715 05:17:37.964608 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-ca-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964751 kubelet[2752]: I0715 05:17:37.964628 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964751 kubelet[2752]: I0715 05:17:37.964646 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/66d69814c7e4b38ec2fc1ce48979fde3-kubeconfig\") pod \"kube-scheduler-ci-4396-0-0-n-153ccb2e88\" (UID: \"66d69814c7e4b38ec2fc1ce48979fde3\") " pod="kube-system/kube-scheduler-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964751 kubelet[2752]: I0715 05:17:37.964662 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4438e3662bee56f52349e1daec9c5cd7-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396-0-0-n-153ccb2e88\" (UID: \"4438e3662bee56f52349e1daec9c5cd7\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964751 kubelet[2752]: I0715 05:17:37.964676 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-flexvolume-dir\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964876 kubelet[2752]: I0715 05:17:37.964689 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-k8s-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:37.964876 kubelet[2752]: I0715 05:17:37.964703 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ddfb02216aa4036044d1e1f4c586e97e-kubeconfig\") pod \"kube-controller-manager-ci-4396-0-0-n-153ccb2e88\" (UID: \"ddfb02216aa4036044d1e1f4c586e97e\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88"
Jul 15 05:17:38.640169 kubelet[2752]: I0715 05:17:38.639831 2752 apiserver.go:52] "Watching apiserver"
Jul 15 05:17:38.663679 kubelet[2752]: I0715 05:17:38.663546 2752 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Jul 15 05:17:38.725668 kubelet[2752]: I0715 05:17:38.725603 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4396-0-0-n-153ccb2e88" podStartSLOduration=1.725483186 podStartE2EDuration="1.725483186s" podCreationTimestamp="2025-07-15 05:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:17:38.724566633 +0000 UTC m=+1.148566011" watchObservedRunningTime="2025-07-15 05:17:38.725483186 +0000 UTC m=+1.149482554"
Jul 15 05:17:38.751430 kubelet[2752]: I0715 05:17:38.751301 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4396-0-0-n-153ccb2e88" podStartSLOduration=1.751282402 podStartE2EDuration="1.751282402s" podCreationTimestamp="2025-07-15 05:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:17:38.741074541 +0000 UTC m=+1.165073919" watchObservedRunningTime="2025-07-15 05:17:38.751282402 +0000 UTC m=+1.175281769"
Jul 15 05:17:38.760535 kubelet[2752]: I0715 05:17:38.760460 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-153ccb2e88" podStartSLOduration=1.7604443239999998 podStartE2EDuration="1.760444324s" podCreationTimestamp="2025-07-15 05:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:17:38.752227801 +0000 UTC m=+1.176227179" watchObservedRunningTime="2025-07-15 05:17:38.760444324 +0000 UTC m=+1.184443692"
Jul 15 05:17:43.435368 kubelet[2752]: I0715 05:17:43.435314 2752 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Jul 15 05:17:43.435778 containerd[1608]: time="2025-07-15T05:17:43.435677651Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Jul 15 05:17:43.436040 kubelet[2752]: I0715 05:17:43.435831 2752 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Jul 15 05:17:44.191969 systemd[1]: Created slice kubepods-besteffort-podda3b5c99_cbf2_408d_a963_03600e4aa9bd.slice - libcontainer container kubepods-besteffort-podda3b5c99_cbf2_408d_a963_03600e4aa9bd.slice.
Jul 15 05:17:44.205022 kubelet[2752]: I0715 05:17:44.204920 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/da3b5c99-cbf2-408d-a963-03600e4aa9bd-xtables-lock\") pod \"kube-proxy-v4w86\" (UID: \"da3b5c99-cbf2-408d-a963-03600e4aa9bd\") " pod="kube-system/kube-proxy-v4w86"
Jul 15 05:17:44.205022 kubelet[2752]: I0715 05:17:44.204964 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/da3b5c99-cbf2-408d-a963-03600e4aa9bd-lib-modules\") pod \"kube-proxy-v4w86\" (UID: \"da3b5c99-cbf2-408d-a963-03600e4aa9bd\") " pod="kube-system/kube-proxy-v4w86"
Jul 15 05:17:44.205022 kubelet[2752]: I0715 05:17:44.204990 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/da3b5c99-cbf2-408d-a963-03600e4aa9bd-kube-proxy\") pod \"kube-proxy-v4w86\" (UID: \"da3b5c99-cbf2-408d-a963-03600e4aa9bd\") " pod="kube-system/kube-proxy-v4w86"
Jul 15 05:17:44.205022 kubelet[2752]: I0715 05:17:44.205011 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zsb2\" (UniqueName: \"kubernetes.io/projected/da3b5c99-cbf2-408d-a963-03600e4aa9bd-kube-api-access-9zsb2\") pod \"kube-proxy-v4w86\" (UID: \"da3b5c99-cbf2-408d-a963-03600e4aa9bd\") " pod="kube-system/kube-proxy-v4w86"
Jul 15 05:17:44.351457 systemd[1]: Created slice kubepods-besteffort-pode121f7c4_f2a1_41ed_a5af_8433221395e2.slice - libcontainer container kubepods-besteffort-pode121f7c4_f2a1_41ed_a5af_8433221395e2.slice.
Jul 15 05:17:44.406689 kubelet[2752]: I0715 05:17:44.406645 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e121f7c4-f2a1-41ed-a5af-8433221395e2-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-lf5f4\" (UID: \"e121f7c4-f2a1-41ed-a5af-8433221395e2\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-lf5f4"
Jul 15 05:17:44.406863 kubelet[2752]: I0715 05:17:44.406692 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvbzn\" (UniqueName: \"kubernetes.io/projected/e121f7c4-f2a1-41ed-a5af-8433221395e2-kube-api-access-pvbzn\") pod \"tigera-operator-5bf8dfcb4-lf5f4\" (UID: \"e121f7c4-f2a1-41ed-a5af-8433221395e2\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-lf5f4"
Jul 15 05:17:44.503981 containerd[1608]: time="2025-07-15T05:17:44.503799864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-v4w86,Uid:da3b5c99-cbf2-408d-a963-03600e4aa9bd,Namespace:kube-system,Attempt:0,}"
Jul 15 05:17:44.544367 containerd[1608]: time="2025-07-15T05:17:44.544229758Z" level=info msg="connecting to shim a0d5d308e1bde6275905c3aad1360718b3b68ff1e390815f3afe6d25087eea51" address="unix:///run/containerd/s/f01919aa0af7bf15d00297f3af3a89d470d8ef1f19f52e6fb7ee648e009ba7a8" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:17:44.575391 systemd[1]: Started cri-containerd-a0d5d308e1bde6275905c3aad1360718b3b68ff1e390815f3afe6d25087eea51.scope - libcontainer container a0d5d308e1bde6275905c3aad1360718b3b68ff1e390815f3afe6d25087eea51.
Jul 15 05:17:44.601220 containerd[1608]: time="2025-07-15T05:17:44.601147394Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-v4w86,Uid:da3b5c99-cbf2-408d-a963-03600e4aa9bd,Namespace:kube-system,Attempt:0,} returns sandbox id \"a0d5d308e1bde6275905c3aad1360718b3b68ff1e390815f3afe6d25087eea51\""
Jul 15 05:17:44.605358 containerd[1608]: time="2025-07-15T05:17:44.605299318Z" level=info msg="CreateContainer within sandbox \"a0d5d308e1bde6275905c3aad1360718b3b68ff1e390815f3afe6d25087eea51\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Jul 15 05:17:44.625367 containerd[1608]: time="2025-07-15T05:17:44.625177049Z" level=info msg="Container e15a58316da5b0ff1bf16437aff52c6fcfd311743e5b49d70754ed5bd8a68862: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:17:44.631762 containerd[1608]: time="2025-07-15T05:17:44.631707453Z" level=info msg="CreateContainer within sandbox \"a0d5d308e1bde6275905c3aad1360718b3b68ff1e390815f3afe6d25087eea51\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e15a58316da5b0ff1bf16437aff52c6fcfd311743e5b49d70754ed5bd8a68862\""
Jul 15 05:17:44.632534 containerd[1608]: time="2025-07-15T05:17:44.632509793Z" level=info msg="StartContainer for \"e15a58316da5b0ff1bf16437aff52c6fcfd311743e5b49d70754ed5bd8a68862\""
Jul 15 05:17:44.635077 containerd[1608]: time="2025-07-15T05:17:44.635050853Z" level=info msg="connecting to shim e15a58316da5b0ff1bf16437aff52c6fcfd311743e5b49d70754ed5bd8a68862" address="unix:///run/containerd/s/f01919aa0af7bf15d00297f3af3a89d470d8ef1f19f52e6fb7ee648e009ba7a8" protocol=ttrpc version=3
Jul 15 05:17:44.657198 containerd[1608]: time="2025-07-15T05:17:44.657158151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-lf5f4,Uid:e121f7c4-f2a1-41ed-a5af-8433221395e2,Namespace:tigera-operator,Attempt:0,}"
Jul 15 05:17:44.659512 systemd[1]: Started cri-containerd-e15a58316da5b0ff1bf16437aff52c6fcfd311743e5b49d70754ed5bd8a68862.scope - libcontainer container e15a58316da5b0ff1bf16437aff52c6fcfd311743e5b49d70754ed5bd8a68862.
Jul 15 05:17:44.680618 containerd[1608]: time="2025-07-15T05:17:44.680580961Z" level=info msg="connecting to shim b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae" address="unix:///run/containerd/s/90980f45d16bec553981b7c1ae8255dd251034dc91fa42846f7639597d3a7925" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:17:44.712469 systemd[1]: Started cri-containerd-b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae.scope - libcontainer container b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae.
Jul 15 05:17:44.724468 containerd[1608]: time="2025-07-15T05:17:44.724318721Z" level=info msg="StartContainer for \"e15a58316da5b0ff1bf16437aff52c6fcfd311743e5b49d70754ed5bd8a68862\" returns successfully"
Jul 15 05:17:44.773553 containerd[1608]: time="2025-07-15T05:17:44.773424121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-lf5f4,Uid:e121f7c4-f2a1-41ed-a5af-8433221395e2,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae\""
Jul 15 05:17:44.777381 containerd[1608]: time="2025-07-15T05:17:44.776981947Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\""
Jul 15 05:17:45.327351 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3171663128.mount: Deactivated successfully.
Jul 15 05:17:45.755434 kubelet[2752]: I0715 05:17:45.755328 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-v4w86" podStartSLOduration=1.754981589 podStartE2EDuration="1.754981589s" podCreationTimestamp="2025-07-15 05:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:17:45.753199812 +0000 UTC m=+8.177199190" watchObservedRunningTime="2025-07-15 05:17:45.754981589 +0000 UTC m=+8.178980987"
Jul 15 05:17:46.346454 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount592733200.mount: Deactivated successfully.
Jul 15 05:17:46.831398 containerd[1608]: time="2025-07-15T05:17:46.831343632Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:46.832174 containerd[1608]: time="2025-07-15T05:17:46.832077803Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543"
Jul 15 05:17:46.832922 containerd[1608]: time="2025-07-15T05:17:46.832895311Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:46.834736 containerd[1608]: time="2025-07-15T05:17:46.834686634Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:17:46.835427 containerd[1608]: time="2025-07-15T05:17:46.835151419Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.058139185s"
Jul 15 05:17:46.835427 containerd[1608]: time="2025-07-15T05:17:46.835188036Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\""
Jul 15 05:17:46.838202 containerd[1608]: time="2025-07-15T05:17:46.838145144Z" level=info msg="CreateContainer within sandbox \"b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Jul 15 05:17:46.851080 containerd[1608]: time="2025-07-15T05:17:46.849288413Z" level=info msg="Container e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:17:46.851454 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4026672945.mount: Deactivated successfully.
Jul 15 05:17:46.856185 containerd[1608]: time="2025-07-15T05:17:46.856115940Z" level=info msg="CreateContainer within sandbox \"b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6\""
Jul 15 05:17:46.857131 containerd[1608]: time="2025-07-15T05:17:46.856947950Z" level=info msg="StartContainer for \"e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6\""
Jul 15 05:17:46.858475 containerd[1608]: time="2025-07-15T05:17:46.858450365Z" level=info msg="connecting to shim e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6" address="unix:///run/containerd/s/90980f45d16bec553981b7c1ae8255dd251034dc91fa42846f7639597d3a7925" protocol=ttrpc version=3
Jul 15 05:17:46.883371 systemd[1]: Started cri-containerd-e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6.scope - libcontainer container e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6.
Jul 15 05:17:46.920728 containerd[1608]: time="2025-07-15T05:17:46.920687165Z" level=info msg="StartContainer for \"e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6\" returns successfully"
Jul 15 05:17:47.751428 kubelet[2752]: I0715 05:17:47.751364 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-lf5f4" podStartSLOduration=1.6906512870000001 podStartE2EDuration="3.751346581s" podCreationTimestamp="2025-07-15 05:17:44 +0000 UTC" firstStartedPulling="2025-07-15 05:17:44.775671194 +0000 UTC m=+7.199670562" lastFinishedPulling="2025-07-15 05:17:46.836366488 +0000 UTC m=+9.260365856" observedRunningTime="2025-07-15 05:17:47.750633094 +0000 UTC m=+10.174632472" watchObservedRunningTime="2025-07-15 05:17:47.751346581 +0000 UTC m=+10.175345949"
Jul 15 05:17:48.413932 update_engine[1579]: I20250715 05:17:48.413283 1579 update_attempter.cc:509] Updating boot flags...
Jul 15 05:17:53.026092 sudo[1837]: pam_unix(sudo:session): session closed for user root
Jul 15 05:17:53.185877 sshd[1836]: Connection closed by 139.178.89.65 port 40964
Jul 15 05:17:53.187368 sshd-session[1818]: pam_unix(sshd:session): session closed for user core
Jul 15 05:17:53.192460 systemd-logind[1577]: Session 7 logged out. Waiting for processes to exit.
Jul 15 05:17:53.194844 systemd[1]: sshd@6-157.180.32.153:22-139.178.89.65:40964.service: Deactivated successfully.
Jul 15 05:17:53.198810 systemd[1]: session-7.scope: Deactivated successfully.
Jul 15 05:17:53.199364 systemd[1]: session-7.scope: Consumed 4.159s CPU time, 158.4M memory peak.
Jul 15 05:17:53.205402 systemd-logind[1577]: Removed session 7.
Jul 15 05:17:56.528084 systemd[1]: Created slice kubepods-besteffort-pod1064777e_398d_40dd_b264_af2b777be354.slice - libcontainer container kubepods-besteffort-pod1064777e_398d_40dd_b264_af2b777be354.slice.
Jul 15 05:17:56.586445 kubelet[2752]: I0715 05:17:56.586264 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxxmd\" (UniqueName: \"kubernetes.io/projected/1064777e-398d-40dd-b264-af2b777be354-kube-api-access-pxxmd\") pod \"calico-typha-74948b5f57-cdwfv\" (UID: \"1064777e-398d-40dd-b264-af2b777be354\") " pod="calico-system/calico-typha-74948b5f57-cdwfv"
Jul 15 05:17:56.586445 kubelet[2752]: I0715 05:17:56.586307 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1064777e-398d-40dd-b264-af2b777be354-tigera-ca-bundle\") pod \"calico-typha-74948b5f57-cdwfv\" (UID: \"1064777e-398d-40dd-b264-af2b777be354\") " pod="calico-system/calico-typha-74948b5f57-cdwfv"
Jul 15 05:17:56.586445 kubelet[2752]: I0715 05:17:56.586324 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/1064777e-398d-40dd-b264-af2b777be354-typha-certs\") pod \"calico-typha-74948b5f57-cdwfv\" (UID: \"1064777e-398d-40dd-b264-af2b777be354\") " pod="calico-system/calico-typha-74948b5f57-cdwfv"
Jul 15 05:17:56.837552 containerd[1608]: time="2025-07-15T05:17:56.837108960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-74948b5f57-cdwfv,Uid:1064777e-398d-40dd-b264-af2b777be354,Namespace:calico-system,Attempt:0,}"
Jul 15 05:17:56.855862 containerd[1608]: time="2025-07-15T05:17:56.855792075Z" level=info msg="connecting to shim 5df563bbf604e5ce285d58f343c28b3847b0facfeecd0ac5edd873331e3030bd" address="unix:///run/containerd/s/9b27620b2ce282d83437b799d7bdb0e6c090ec15ac4a0de5d5a60d33eee52280" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:17:56.890511 systemd[1]: Started cri-containerd-5df563bbf604e5ce285d58f343c28b3847b0facfeecd0ac5edd873331e3030bd.scope - libcontainer container 5df563bbf604e5ce285d58f343c28b3847b0facfeecd0ac5edd873331e3030bd.
Jul 15 05:17:56.929680 systemd[1]: Created slice kubepods-besteffort-pod10a13b2a_a6ac_468c_a2a1_577b50007889.slice - libcontainer container kubepods-besteffort-pod10a13b2a_a6ac_468c_a2a1_577b50007889.slice.
Jul 15 05:17:56.987248 containerd[1608]: time="2025-07-15T05:17:56.987082877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-74948b5f57-cdwfv,Uid:1064777e-398d-40dd-b264-af2b777be354,Namespace:calico-system,Attempt:0,} returns sandbox id \"5df563bbf604e5ce285d58f343c28b3847b0facfeecd0ac5edd873331e3030bd\""
Jul 15 05:17:56.988423 kubelet[2752]: I0715 05:17:56.988373 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-policysync\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988423 kubelet[2752]: I0715 05:17:56.988411 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-var-run-calico\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988513 kubelet[2752]: I0715 05:17:56.988428 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-cni-log-dir\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988513 kubelet[2752]: I0715 05:17:56.988440 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-var-lib-calico\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988513 kubelet[2752]: I0715 05:17:56.988465 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/10a13b2a-a6ac-468c-a2a1-577b50007889-node-certs\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988513 kubelet[2752]: I0715 05:17:56.988480 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-lib-modules\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988513 kubelet[2752]: I0715 05:17:56.988494 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-cni-net-dir\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988615 kubelet[2752]: I0715 05:17:56.988510 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10a13b2a-a6ac-468c-a2a1-577b50007889-tigera-ca-bundle\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988615 kubelet[2752]: I0715 05:17:56.988523 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-xtables-lock\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988615 kubelet[2752]: I0715 05:17:56.988538 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-cni-bin-dir\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988615 kubelet[2752]: I0715 05:17:56.988553 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/10a13b2a-a6ac-468c-a2a1-577b50007889-flexvol-driver-host\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.988615 kubelet[2752]: I0715 05:17:56.988567 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbq7w\" (UniqueName: \"kubernetes.io/projected/10a13b2a-a6ac-468c-a2a1-577b50007889-kube-api-access-jbq7w\") pod \"calico-node-vf58q\" (UID: \"10a13b2a-a6ac-468c-a2a1-577b50007889\") " pod="calico-system/calico-node-vf58q"
Jul 15 05:17:56.993186 containerd[1608]: time="2025-07-15T05:17:56.992538145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\""
Jul 15 05:17:57.105738 kubelet[2752]: E0715 05:17:57.104272 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 15 05:17:57.105738 kubelet[2752]: W0715 05:17:57.104525 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 15 05:17:57.105738 kubelet[2752]: E0715 05:17:57.104556 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 15 05:17:57.111429 kubelet[2752]: E0715 05:17:57.111386 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 15 05:17:57.111429 kubelet[2752]: W0715 05:17:57.111415 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 15 05:17:57.111552 kubelet[2752]: E0715 05:17:57.111470 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 15 05:17:57.155742 kubelet[2752]: E0715 05:17:57.155454 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kn29v" podUID="904a4f13-bfb8-412a-a126-6662c25983b9"
Jul 15 05:17:57.176649 kubelet[2752]: E0715 05:17:57.176586 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 15 05:17:57.176649 kubelet[2752]: W0715 05:17:57.176611 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 15 05:17:57.176961 kubelet[2752]: E0715 05:17:57.176837 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 15 05:17:57.177296 kubelet[2752]: E0715 05:17:57.177082 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 15 05:17:57.177296 kubelet[2752]: W0715 05:17:57.177092 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 15 05:17:57.177296 kubelet[2752]: E0715 05:17:57.177101 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jul 15 05:17:57.177923 kubelet[2752]: E0715 05:17:57.177693 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 15 05:17:57.177923 kubelet[2752]: W0715 05:17:57.177703 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 15 05:17:57.177923 kubelet[2752]: E0715 05:17:57.177711 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jul 15 05:17:57.178606 kubelet[2752]: E0715 05:17:57.178512 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.178606 kubelet[2752]: W0715 05:17:57.178524 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.178606 kubelet[2752]: E0715 05:17:57.178533 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.178806 kubelet[2752]: E0715 05:17:57.178795 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.178904 kubelet[2752]: W0715 05:17:57.178854 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.178904 kubelet[2752]: E0715 05:17:57.178868 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.179257 kubelet[2752]: E0715 05:17:57.179114 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.179257 kubelet[2752]: W0715 05:17:57.179126 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.179257 kubelet[2752]: E0715 05:17:57.179135 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.179655 kubelet[2752]: E0715 05:17:57.179579 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.179655 kubelet[2752]: W0715 05:17:57.179592 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.179655 kubelet[2752]: E0715 05:17:57.179600 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.180100 kubelet[2752]: E0715 05:17:57.180026 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.180100 kubelet[2752]: W0715 05:17:57.180037 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.180100 kubelet[2752]: E0715 05:17:57.180046 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.180515 kubelet[2752]: E0715 05:17:57.180441 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.180515 kubelet[2752]: W0715 05:17:57.180452 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.180515 kubelet[2752]: E0715 05:17:57.180460 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.180873 kubelet[2752]: E0715 05:17:57.180804 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.180873 kubelet[2752]: W0715 05:17:57.180815 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.180873 kubelet[2752]: E0715 05:17:57.180823 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.181180 kubelet[2752]: E0715 05:17:57.181118 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.181415 kubelet[2752]: W0715 05:17:57.181329 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.181415 kubelet[2752]: E0715 05:17:57.181345 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.181754 kubelet[2752]: E0715 05:17:57.181689 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.181754 kubelet[2752]: W0715 05:17:57.181700 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.181754 kubelet[2752]: E0715 05:17:57.181708 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.182063 kubelet[2752]: E0715 05:17:57.182008 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.182063 kubelet[2752]: W0715 05:17:57.182018 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.182063 kubelet[2752]: E0715 05:17:57.182028 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.182371 kubelet[2752]: E0715 05:17:57.182292 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.182371 kubelet[2752]: W0715 05:17:57.182306 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.182371 kubelet[2752]: E0715 05:17:57.182327 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.182648 kubelet[2752]: E0715 05:17:57.182588 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.182648 kubelet[2752]: W0715 05:17:57.182603 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.182648 kubelet[2752]: E0715 05:17:57.182612 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.182951 kubelet[2752]: E0715 05:17:57.182893 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.182951 kubelet[2752]: W0715 05:17:57.182903 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.182951 kubelet[2752]: E0715 05:17:57.182911 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.183214 kubelet[2752]: E0715 05:17:57.183139 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.183214 kubelet[2752]: W0715 05:17:57.183148 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.183214 kubelet[2752]: E0715 05:17:57.183156 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.183484 kubelet[2752]: E0715 05:17:57.183432 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.183484 kubelet[2752]: W0715 05:17:57.183442 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.183484 kubelet[2752]: E0715 05:17:57.183449 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.183729 kubelet[2752]: E0715 05:17:57.183677 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.183729 kubelet[2752]: W0715 05:17:57.183688 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.183729 kubelet[2752]: E0715 05:17:57.183695 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.183989 kubelet[2752]: E0715 05:17:57.183926 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.183989 kubelet[2752]: W0715 05:17:57.183935 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.183989 kubelet[2752]: E0715 05:17:57.183943 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.191471 kubelet[2752]: E0715 05:17:57.191443 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.191582 kubelet[2752]: W0715 05:17:57.191549 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.191582 kubelet[2752]: E0715 05:17:57.191564 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.191700 kubelet[2752]: I0715 05:17:57.191686 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcvnh\" (UniqueName: \"kubernetes.io/projected/904a4f13-bfb8-412a-a126-6662c25983b9-kube-api-access-gcvnh\") pod \"csi-node-driver-kn29v\" (UID: \"904a4f13-bfb8-412a-a126-6662c25983b9\") " pod="calico-system/csi-node-driver-kn29v" Jul 15 05:17:57.192014 kubelet[2752]: E0715 05:17:57.191994 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.192072 kubelet[2752]: W0715 05:17:57.192061 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.192151 kubelet[2752]: E0715 05:17:57.192137 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.193223 kubelet[2752]: E0715 05:17:57.193185 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.193223 kubelet[2752]: W0715 05:17:57.193205 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.193377 kubelet[2752]: E0715 05:17:57.193339 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.193377 kubelet[2752]: I0715 05:17:57.193362 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/904a4f13-bfb8-412a-a126-6662c25983b9-registration-dir\") pod \"csi-node-driver-kn29v\" (UID: \"904a4f13-bfb8-412a-a126-6662c25983b9\") " pod="calico-system/csi-node-driver-kn29v" Jul 15 05:17:57.193814 kubelet[2752]: E0715 05:17:57.193778 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.193814 kubelet[2752]: W0715 05:17:57.193790 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.193814 kubelet[2752]: E0715 05:17:57.193799 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.194171 kubelet[2752]: E0715 05:17:57.194147 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.194171 kubelet[2752]: W0715 05:17:57.194158 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.194432 kubelet[2752]: E0715 05:17:57.194407 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.194570 kubelet[2752]: E0715 05:17:57.194559 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.194673 kubelet[2752]: W0715 05:17:57.194608 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.194673 kubelet[2752]: E0715 05:17:57.194625 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.194673 kubelet[2752]: I0715 05:17:57.194640 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/904a4f13-bfb8-412a-a126-6662c25983b9-socket-dir\") pod \"csi-node-driver-kn29v\" (UID: \"904a4f13-bfb8-412a-a126-6662c25983b9\") " pod="calico-system/csi-node-driver-kn29v" Jul 15 05:17:57.194956 kubelet[2752]: E0715 05:17:57.194924 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.194956 kubelet[2752]: W0715 05:17:57.194935 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.194956 kubelet[2752]: E0715 05:17:57.194944 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.195335 kubelet[2752]: E0715 05:17:57.195299 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.195459 kubelet[2752]: W0715 05:17:57.195390 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.195459 kubelet[2752]: E0715 05:17:57.195425 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.195735 kubelet[2752]: E0715 05:17:57.195713 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.195735 kubelet[2752]: W0715 05:17:57.195723 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.195905 kubelet[2752]: E0715 05:17:57.195809 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.196058 kubelet[2752]: E0715 05:17:57.196048 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.196145 kubelet[2752]: W0715 05:17:57.196119 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.196145 kubelet[2752]: E0715 05:17:57.196129 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.196391 kubelet[2752]: I0715 05:17:57.196287 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/904a4f13-bfb8-412a-a126-6662c25983b9-kubelet-dir\") pod \"csi-node-driver-kn29v\" (UID: \"904a4f13-bfb8-412a-a126-6662c25983b9\") " pod="calico-system/csi-node-driver-kn29v" Jul 15 05:17:57.197081 kubelet[2752]: E0715 05:17:57.196921 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.197081 kubelet[2752]: W0715 05:17:57.196932 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.197081 kubelet[2752]: E0715 05:17:57.196942 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.197081 kubelet[2752]: I0715 05:17:57.196954 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/904a4f13-bfb8-412a-a126-6662c25983b9-varrun\") pod \"csi-node-driver-kn29v\" (UID: \"904a4f13-bfb8-412a-a126-6662c25983b9\") " pod="calico-system/csi-node-driver-kn29v" Jul 15 05:17:57.197989 kubelet[2752]: E0715 05:17:57.197977 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.198061 kubelet[2752]: W0715 05:17:57.198050 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.198189 kubelet[2752]: E0715 05:17:57.198120 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.198478 kubelet[2752]: E0715 05:17:57.198457 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.198658 kubelet[2752]: W0715 05:17:57.198527 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.198708 kubelet[2752]: E0715 05:17:57.198539 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.199505 kubelet[2752]: E0715 05:17:57.199421 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.199505 kubelet[2752]: W0715 05:17:57.199432 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.199718 kubelet[2752]: E0715 05:17:57.199613 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.199868 kubelet[2752]: E0715 05:17:57.199858 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.199917 kubelet[2752]: W0715 05:17:57.199907 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.199995 kubelet[2752]: E0715 05:17:57.199976 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.234180 containerd[1608]: time="2025-07-15T05:17:57.234128604Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vf58q,Uid:10a13b2a-a6ac-468c-a2a1-577b50007889,Namespace:calico-system,Attempt:0,}" Jul 15 05:17:57.253711 containerd[1608]: time="2025-07-15T05:17:57.253560480Z" level=info msg="connecting to shim 3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649" address="unix:///run/containerd/s/e15d57789337a53e915876b325a5242894d11bc26ee178d60f1e4aaa4149189c" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:17:57.281559 systemd[1]: Started cri-containerd-3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649.scope - libcontainer container 3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649. Jul 15 05:17:57.298994 kubelet[2752]: E0715 05:17:57.298649 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.298994 kubelet[2752]: W0715 05:17:57.298716 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.298994 kubelet[2752]: E0715 05:17:57.298746 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.299975 kubelet[2752]: E0715 05:17:57.299938 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.300016 kubelet[2752]: W0715 05:17:57.299976 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.300016 kubelet[2752]: E0715 05:17:57.299997 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.300281 kubelet[2752]: E0715 05:17:57.300205 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.300324 kubelet[2752]: W0715 05:17:57.300293 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.300409 kubelet[2752]: E0715 05:17:57.300383 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.301827 kubelet[2752]: E0715 05:17:57.301797 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.301996 kubelet[2752]: W0715 05:17:57.301960 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.302345 kubelet[2752]: E0715 05:17:57.302256 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.303191 kubelet[2752]: E0715 05:17:57.303137 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.303191 kubelet[2752]: W0715 05:17:57.303154 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.304332 kubelet[2752]: E0715 05:17:57.304303 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.304609 kubelet[2752]: E0715 05:17:57.304568 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.304609 kubelet[2752]: W0715 05:17:57.304584 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.304704 kubelet[2752]: E0715 05:17:57.304678 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.306724 kubelet[2752]: E0715 05:17:57.306508 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.306724 kubelet[2752]: W0715 05:17:57.306561 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.306724 kubelet[2752]: E0715 05:17:57.306686 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.307663 kubelet[2752]: E0715 05:17:57.307601 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.307860 kubelet[2752]: W0715 05:17:57.307615 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.308828 kubelet[2752]: E0715 05:17:57.308795 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.309082 kubelet[2752]: W0715 05:17:57.308998 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.310294 kubelet[2752]: E0715 05:17:57.310210 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.310294 kubelet[2752]: W0715 05:17:57.310226 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.310619 kubelet[2752]: E0715 05:17:57.308904 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.311369 kubelet[2752]: E0715 05:17:57.311130 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.311369 kubelet[2752]: W0715 05:17:57.311318 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.311577 kubelet[2752]: E0715 05:17:57.311148 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.311577 kubelet[2752]: E0715 05:17:57.311138 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.311577 kubelet[2752]: E0715 05:17:57.311554 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.312636 kubelet[2752]: E0715 05:17:57.312597 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.312905 kubelet[2752]: W0715 05:17:57.312875 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.313925 kubelet[2752]: E0715 05:17:57.313894 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.314138 kubelet[2752]: E0715 05:17:57.314105 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.314138 kubelet[2752]: W0715 05:17:57.314119 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.314493 kubelet[2752]: E0715 05:17:57.314398 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.314629 kubelet[2752]: E0715 05:17:57.314614 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.315261 kubelet[2752]: W0715 05:17:57.314705 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.315498 kubelet[2752]: E0715 05:17:57.315473 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.315783 kubelet[2752]: E0715 05:17:57.315770 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.315878 kubelet[2752]: W0715 05:17:57.315864 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.316068 kubelet[2752]: E0715 05:17:57.316051 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.317452 kubelet[2752]: E0715 05:17:57.317436 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.317665 kubelet[2752]: W0715 05:17:57.317497 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.317842 kubelet[2752]: E0715 05:17:57.317748 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.318072 kubelet[2752]: E0715 05:17:57.318043 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.318072 kubelet[2752]: W0715 05:17:57.318055 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.318320 kubelet[2752]: E0715 05:17:57.318303 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.318661 kubelet[2752]: E0715 05:17:57.318626 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.318661 kubelet[2752]: W0715 05:17:57.318636 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.318865 kubelet[2752]: E0715 05:17:57.318835 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.318980 kubelet[2752]: E0715 05:17:57.318958 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.318980 kubelet[2752]: W0715 05:17:57.318968 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.319151 kubelet[2752]: E0715 05:17:57.319127 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.319370 kubelet[2752]: E0715 05:17:57.319326 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.319370 kubelet[2752]: W0715 05:17:57.319349 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.319569 kubelet[2752]: E0715 05:17:57.319545 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.320456 kubelet[2752]: E0715 05:17:57.320430 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.320456 kubelet[2752]: W0715 05:17:57.320442 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.320678 kubelet[2752]: E0715 05:17:57.320665 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.320840 kubelet[2752]: E0715 05:17:57.320815 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.320840 kubelet[2752]: W0715 05:17:57.320827 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.321068 kubelet[2752]: E0715 05:17:57.321056 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.321253 kubelet[2752]: E0715 05:17:57.321210 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.321253 kubelet[2752]: W0715 05:17:57.321219 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.321432 kubelet[2752]: E0715 05:17:57.321398 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.321626 kubelet[2752]: E0715 05:17:57.321615 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.321698 kubelet[2752]: W0715 05:17:57.321665 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.321904 kubelet[2752]: E0715 05:17:57.321869 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:57.322533 kubelet[2752]: E0715 05:17:57.322442 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.322533 kubelet[2752]: W0715 05:17:57.322453 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.322533 kubelet[2752]: E0715 05:17:57.322462 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.338208 kubelet[2752]: E0715 05:17:57.338164 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:57.338402 kubelet[2752]: W0715 05:17:57.338183 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:57.338402 kubelet[2752]: E0715 05:17:57.338370 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:57.371397 containerd[1608]: time="2025-07-15T05:17:57.371177984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-vf58q,Uid:10a13b2a-a6ac-468c-a2a1-577b50007889,Namespace:calico-system,Attempt:0,} returns sandbox id \"3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649\"" Jul 15 05:17:58.488827 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1051333874.mount: Deactivated successfully. 
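The repeated kubelet triplet above (driver-call.go:262, driver-call.go:149, plugins.go:691) all stems from one condition: kubelet's FlexVolume prober executes the driver binary at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument `init` and parses its stdout as JSON. Because the `uds` executable is missing, the call produces empty output, and unmarshalling "" fails with "unexpected end of JSON input". As an illustration only (this is a hypothetical stub following the FlexVolume call convention, not the missing nodeagent~uds driver), a driver that satisfies the `init` handshake looks like:

```python
import json
import sys

def driver_call(op):
    """Return the JSON status object kubelet expects for one FlexVolume call."""
    if op == "init":
        # Advertise no attach support so kubelet skips attach/detach calls.
        return {"status": "Success", "capabilities": {"attach": False}}
    # Unimplemented operations must still answer with valid JSON;
    # printing nothing at all is exactly what produces the
    # "unexpected end of JSON input" unmarshal errors in the log above.
    return {"status": "Not supported"}

# kubelet invokes the executable as `<driver> <op> [args...]`:
op = sys.argv[1] if len(sys.argv) > 1 else "init"
print(json.dumps(driver_call(op)))
```

Installing such an executable under the probed plugin directory would silence the probe errors, but the real remedy is deploying the actual node-agent driver that the nodeagent~uds directory expects.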
Jul 15 05:17:58.666434 kubelet[2752]: E0715 05:17:58.666369 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kn29v" podUID="904a4f13-bfb8-412a-a126-6662c25983b9" Jul 15 05:17:59.147232 containerd[1608]: time="2025-07-15T05:17:59.147162415Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:17:59.148230 containerd[1608]: time="2025-07-15T05:17:59.148184809Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 15 05:17:59.149126 containerd[1608]: time="2025-07-15T05:17:59.149067881Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:17:59.150780 containerd[1608]: time="2025-07-15T05:17:59.150738924Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:17:59.151166 containerd[1608]: time="2025-07-15T05:17:59.151127628Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.158562007s" Jul 15 05:17:59.151213 containerd[1608]: time="2025-07-15T05:17:59.151168771Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference 
\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 15 05:17:59.153108 containerd[1608]: time="2025-07-15T05:17:59.152808039Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 15 05:17:59.167863 containerd[1608]: time="2025-07-15T05:17:59.167820588Z" level=info msg="CreateContainer within sandbox \"5df563bbf604e5ce285d58f343c28b3847b0facfeecd0ac5edd873331e3030bd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 15 05:17:59.183285 containerd[1608]: time="2025-07-15T05:17:59.182408801Z" level=info msg="Container 576b412b69b6f3bb3b6883a540518f1ef6c96f00e9bcea12389810eb8a2de7c8: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:17:59.194543 containerd[1608]: time="2025-07-15T05:17:59.194469482Z" level=info msg="CreateContainer within sandbox \"5df563bbf604e5ce285d58f343c28b3847b0facfeecd0ac5edd873331e3030bd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"576b412b69b6f3bb3b6883a540518f1ef6c96f00e9bcea12389810eb8a2de7c8\"" Jul 15 05:17:59.195377 containerd[1608]: time="2025-07-15T05:17:59.195120115Z" level=info msg="StartContainer for \"576b412b69b6f3bb3b6883a540518f1ef6c96f00e9bcea12389810eb8a2de7c8\"" Jul 15 05:17:59.196898 containerd[1608]: time="2025-07-15T05:17:59.196858343Z" level=info msg="connecting to shim 576b412b69b6f3bb3b6883a540518f1ef6c96f00e9bcea12389810eb8a2de7c8" address="unix:///run/containerd/s/9b27620b2ce282d83437b799d7bdb0e6c090ec15ac4a0de5d5a60d33eee52280" protocol=ttrpc version=3 Jul 15 05:17:59.231441 systemd[1]: Started cri-containerd-576b412b69b6f3bb3b6883a540518f1ef6c96f00e9bcea12389810eb8a2de7c8.scope - libcontainer container 576b412b69b6f3bb3b6883a540518f1ef6c96f00e9bcea12389810eb8a2de7c8. 
Jul 15 05:17:59.295476 containerd[1608]: time="2025-07-15T05:17:59.295418741Z" level=info msg="StartContainer for \"576b412b69b6f3bb3b6883a540518f1ef6c96f00e9bcea12389810eb8a2de7c8\" returns successfully" Jul 15 05:17:59.805499 kubelet[2752]: E0715 05:17:59.805401 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.805499 kubelet[2752]: W0715 05:17:59.805423 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.805499 kubelet[2752]: E0715 05:17:59.805442 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.806417 kubelet[2752]: E0715 05:17:59.806349 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.806417 kubelet[2752]: W0715 05:17:59.806363 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.806417 kubelet[2752]: E0715 05:17:59.806375 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.806829 kubelet[2752]: E0715 05:17:59.806787 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.806829 kubelet[2752]: W0715 05:17:59.806799 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.806829 kubelet[2752]: E0715 05:17:59.806809 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.807258 kubelet[2752]: E0715 05:17:59.807164 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.807258 kubelet[2752]: W0715 05:17:59.807176 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.807258 kubelet[2752]: E0715 05:17:59.807187 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.807698 kubelet[2752]: E0715 05:17:59.807628 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.807698 kubelet[2752]: W0715 05:17:59.807639 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.807698 kubelet[2752]: E0715 05:17:59.807648 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.808054 kubelet[2752]: E0715 05:17:59.807990 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.808054 kubelet[2752]: W0715 05:17:59.808000 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.808054 kubelet[2752]: E0715 05:17:59.808008 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.808382 kubelet[2752]: E0715 05:17:59.808325 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.808382 kubelet[2752]: W0715 05:17:59.808337 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.808382 kubelet[2752]: E0715 05:17:59.808345 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.808990 kubelet[2752]: E0715 05:17:59.808651 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.808990 kubelet[2752]: W0715 05:17:59.808658 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.808990 kubelet[2752]: E0715 05:17:59.808667 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.809175 kubelet[2752]: E0715 05:17:59.809069 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.809175 kubelet[2752]: W0715 05:17:59.809076 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.809175 kubelet[2752]: E0715 05:17:59.809085 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.809471 kubelet[2752]: E0715 05:17:59.809423 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.809471 kubelet[2752]: W0715 05:17:59.809433 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.809471 kubelet[2752]: E0715 05:17:59.809441 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.809747 kubelet[2752]: E0715 05:17:59.809732 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.809851 kubelet[2752]: W0715 05:17:59.809781 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.809851 kubelet[2752]: E0715 05:17:59.809790 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.810145 kubelet[2752]: E0715 05:17:59.810085 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.810145 kubelet[2752]: W0715 05:17:59.810095 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.810145 kubelet[2752]: E0715 05:17:59.810103 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.810561 kubelet[2752]: E0715 05:17:59.810403 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.810561 kubelet[2752]: W0715 05:17:59.810411 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.810561 kubelet[2752]: E0715 05:17:59.810419 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.810781 kubelet[2752]: E0715 05:17:59.810766 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.810869 kubelet[2752]: W0715 05:17:59.810822 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.810869 kubelet[2752]: E0715 05:17:59.810833 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.811059 kubelet[2752]: E0715 05:17:59.811045 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.811171 kubelet[2752]: W0715 05:17:59.811107 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.811171 kubelet[2752]: E0715 05:17:59.811119 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.830887 kubelet[2752]: E0715 05:17:59.830856 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.831155 kubelet[2752]: W0715 05:17:59.831093 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.831155 kubelet[2752]: E0715 05:17:59.831153 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.831620 kubelet[2752]: E0715 05:17:59.831589 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.831620 kubelet[2752]: W0715 05:17:59.831612 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.831691 kubelet[2752]: E0715 05:17:59.831632 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.832461 kubelet[2752]: E0715 05:17:59.832187 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.832461 kubelet[2752]: W0715 05:17:59.832209 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.832461 kubelet[2752]: E0715 05:17:59.832224 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.833352 kubelet[2752]: E0715 05:17:59.833327 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.833397 kubelet[2752]: W0715 05:17:59.833378 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.833611 kubelet[2752]: E0715 05:17:59.833578 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.833759 kubelet[2752]: E0715 05:17:59.833730 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.833759 kubelet[2752]: W0715 05:17:59.833748 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.833985 kubelet[2752]: E0715 05:17:59.833956 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.833985 kubelet[2752]: W0715 05:17:59.833976 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.834046 kubelet[2752]: E0715 05:17:59.833991 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.834216 kubelet[2752]: E0715 05:17:59.834190 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.834216 kubelet[2752]: W0715 05:17:59.834208 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.834341 kubelet[2752]: E0715 05:17:59.834222 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.834576 kubelet[2752]: E0715 05:17:59.834525 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.834703 kubelet[2752]: E0715 05:17:59.834624 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.834703 kubelet[2752]: W0715 05:17:59.834695 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.834769 kubelet[2752]: E0715 05:17:59.834722 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.834998 kubelet[2752]: E0715 05:17:59.834974 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.834998 kubelet[2752]: W0715 05:17:59.834991 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.835058 kubelet[2752]: E0715 05:17:59.835013 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.835463 kubelet[2752]: E0715 05:17:59.835438 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.835463 kubelet[2752]: W0715 05:17:59.835455 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.835526 kubelet[2752]: E0715 05:17:59.835492 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.835784 kubelet[2752]: E0715 05:17:59.835759 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.835784 kubelet[2752]: W0715 05:17:59.835776 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.835842 kubelet[2752]: E0715 05:17:59.835795 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.836090 kubelet[2752]: E0715 05:17:59.836065 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.836090 kubelet[2752]: W0715 05:17:59.836083 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.836208 kubelet[2752]: E0715 05:17:59.836181 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.836900 kubelet[2752]: E0715 05:17:59.836856 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.836900 kubelet[2752]: W0715 05:17:59.836873 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.837013 kubelet[2752]: E0715 05:17:59.836987 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.837213 kubelet[2752]: E0715 05:17:59.837185 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.837213 kubelet[2752]: W0715 05:17:59.837204 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.837299 kubelet[2752]: E0715 05:17:59.837273 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.837519 kubelet[2752]: E0715 05:17:59.837495 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.837519 kubelet[2752]: W0715 05:17:59.837511 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.837584 kubelet[2752]: E0715 05:17:59.837529 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.837807 kubelet[2752]: E0715 05:17:59.837783 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.837807 kubelet[2752]: W0715 05:17:59.837798 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.837807 kubelet[2752]: E0715 05:17:59.837808 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:17:59.838066 kubelet[2752]: E0715 05:17:59.838042 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.838066 kubelet[2752]: W0715 05:17:59.838058 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.838127 kubelet[2752]: E0715 05:17:59.838069 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:17:59.838469 kubelet[2752]: E0715 05:17:59.838446 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:17:59.838469 kubelet[2752]: W0715 05:17:59.838461 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:17:59.838530 kubelet[2752]: E0715 05:17:59.838472 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.666620 kubelet[2752]: E0715 05:18:00.666555 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kn29v" podUID="904a4f13-bfb8-412a-a126-6662c25983b9" Jul 15 05:18:00.783096 kubelet[2752]: I0715 05:18:00.783071 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:00.818650 kubelet[2752]: E0715 05:18:00.818463 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.818650 kubelet[2752]: W0715 05:18:00.818633 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.818650 kubelet[2752]: E0715 05:18:00.818660 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.820101 kubelet[2752]: E0715 05:18:00.819045 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.820101 kubelet[2752]: W0715 05:18:00.819056 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.820101 kubelet[2752]: E0715 05:18:00.819119 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.820101 kubelet[2752]: E0715 05:18:00.819399 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.820101 kubelet[2752]: W0715 05:18:00.819408 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.820101 kubelet[2752]: E0715 05:18:00.819418 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.820101 kubelet[2752]: E0715 05:18:00.819798 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.820101 kubelet[2752]: W0715 05:18:00.819830 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.820101 kubelet[2752]: E0715 05:18:00.819840 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.820101 kubelet[2752]: E0715 05:18:00.820178 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.821075 kubelet[2752]: W0715 05:18:00.820187 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.821075 kubelet[2752]: E0715 05:18:00.820196 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.821075 kubelet[2752]: E0715 05:18:00.820568 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.821075 kubelet[2752]: W0715 05:18:00.820576 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.821075 kubelet[2752]: E0715 05:18:00.820587 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.821075 kubelet[2752]: E0715 05:18:00.820816 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.821075 kubelet[2752]: W0715 05:18:00.820824 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.821075 kubelet[2752]: E0715 05:18:00.820832 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.821603 kubelet[2752]: E0715 05:18:00.821099 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.821603 kubelet[2752]: W0715 05:18:00.821107 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.821603 kubelet[2752]: E0715 05:18:00.821190 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.821603 kubelet[2752]: E0715 05:18:00.821543 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.821603 kubelet[2752]: W0715 05:18:00.821552 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.821603 kubelet[2752]: E0715 05:18:00.821561 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.822967 kubelet[2752]: E0715 05:18:00.821785 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.822967 kubelet[2752]: W0715 05:18:00.821796 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.822967 kubelet[2752]: E0715 05:18:00.821884 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.822967 kubelet[2752]: E0715 05:18:00.822120 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.822967 kubelet[2752]: W0715 05:18:00.822127 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.822967 kubelet[2752]: E0715 05:18:00.822135 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.822967 kubelet[2752]: E0715 05:18:00.822411 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.822967 kubelet[2752]: W0715 05:18:00.822533 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.822967 kubelet[2752]: E0715 05:18:00.822542 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.822967 kubelet[2752]: E0715 05:18:00.822876 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.823601 containerd[1608]: time="2025-07-15T05:18:00.821976446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:00.824484 kubelet[2752]: W0715 05:18:00.822884 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.824484 kubelet[2752]: E0715 05:18:00.822893 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.824484 kubelet[2752]: E0715 05:18:00.823353 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.824484 kubelet[2752]: W0715 05:18:00.823416 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.824484 kubelet[2752]: E0715 05:18:00.823429 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.824484 kubelet[2752]: E0715 05:18:00.823686 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.824484 kubelet[2752]: W0715 05:18:00.823696 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.824484 kubelet[2752]: E0715 05:18:00.823708 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.824638 containerd[1608]: time="2025-07-15T05:18:00.824018235Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 15 05:18:00.825977 containerd[1608]: time="2025-07-15T05:18:00.825918840Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:00.829267 containerd[1608]: time="2025-07-15T05:18:00.828540347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:00.829267 containerd[1608]: time="2025-07-15T05:18:00.829090383Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.67625551s" Jul 15 05:18:00.829267 containerd[1608]: 
time="2025-07-15T05:18:00.829114491Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 15 05:18:00.832972 containerd[1608]: time="2025-07-15T05:18:00.832935071Z" level=info msg="CreateContainer within sandbox \"3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 15 05:18:00.840161 kubelet[2752]: E0715 05:18:00.840120 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.840161 kubelet[2752]: W0715 05:18:00.840156 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.840429 kubelet[2752]: E0715 05:18:00.840185 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.840817 kubelet[2752]: E0715 05:18:00.840468 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.840817 kubelet[2752]: W0715 05:18:00.840484 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.840817 kubelet[2752]: E0715 05:18:00.840503 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.841613 kubelet[2752]: E0715 05:18:00.841550 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.841971 kubelet[2752]: W0715 05:18:00.841715 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.841971 kubelet[2752]: E0715 05:18:00.841812 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.844263 kubelet[2752]: E0715 05:18:00.843994 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.844263 kubelet[2752]: W0715 05:18:00.844047 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.844263 kubelet[2752]: E0715 05:18:00.844213 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.844642 kubelet[2752]: E0715 05:18:00.844618 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.845053 kubelet[2752]: W0715 05:18:00.844742 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.845053 kubelet[2752]: E0715 05:18:00.844943 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.845630 kubelet[2752]: E0715 05:18:00.845615 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.846078 kubelet[2752]: W0715 05:18:00.845782 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.846078 kubelet[2752]: E0715 05:18:00.846054 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.846394 kubelet[2752]: E0715 05:18:00.846366 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.846394 kubelet[2752]: W0715 05:18:00.846379 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.846692 kubelet[2752]: E0715 05:18:00.846545 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.846972 kubelet[2752]: E0715 05:18:00.846938 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.846972 kubelet[2752]: W0715 05:18:00.846954 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.847508 kubelet[2752]: E0715 05:18:00.847298 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.847868 kubelet[2752]: E0715 05:18:00.847837 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.848071 kubelet[2752]: W0715 05:18:00.848044 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.848823 kubelet[2752]: E0715 05:18:00.848294 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.849326 kubelet[2752]: E0715 05:18:00.849298 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.849326 kubelet[2752]: W0715 05:18:00.849311 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.850635 kubelet[2752]: E0715 05:18:00.849654 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.850965 kubelet[2752]: E0715 05:18:00.850929 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.850965 kubelet[2752]: W0715 05:18:00.850945 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.851399 kubelet[2752]: E0715 05:18:00.851191 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.852389 kubelet[2752]: E0715 05:18:00.852373 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.852873 kubelet[2752]: W0715 05:18:00.852564 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.852919 containerd[1608]: time="2025-07-15T05:18:00.851982483Z" level=info msg="Container 818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:00.855549 kubelet[2752]: E0715 05:18:00.853303 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.863209 kubelet[2752]: E0715 05:18:00.860948 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.865936 kubelet[2752]: W0715 05:18:00.865602 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.866465 kubelet[2752]: E0715 05:18:00.866312 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.866465 kubelet[2752]: W0715 05:18:00.866345 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.866841 kubelet[2752]: E0715 05:18:00.866758 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.866841 kubelet[2752]: E0715 05:18:00.866800 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.867394 kubelet[2752]: E0715 05:18:00.867344 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.867666 kubelet[2752]: W0715 05:18:00.867534 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.867787 kubelet[2752]: E0715 05:18:00.867770 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.874942 kubelet[2752]: E0715 05:18:00.874466 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.874942 kubelet[2752]: W0715 05:18:00.874497 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.874942 kubelet[2752]: E0715 05:18:00.874520 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.876635 kubelet[2752]: E0715 05:18:00.875413 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.876635 kubelet[2752]: W0715 05:18:00.875426 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.876635 kubelet[2752]: E0715 05:18:00.875439 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:18:00.876635 kubelet[2752]: E0715 05:18:00.875870 2752 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:18:00.876635 kubelet[2752]: W0715 05:18:00.875881 2752 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:18:00.876635 kubelet[2752]: E0715 05:18:00.875892 2752 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:18:00.884223 containerd[1608]: time="2025-07-15T05:18:00.884171041Z" level=info msg="CreateContainer within sandbox \"3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25\"" Jul 15 05:18:00.885535 containerd[1608]: time="2025-07-15T05:18:00.885497880Z" level=info msg="StartContainer for \"818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25\"" Jul 15 05:18:00.901613 containerd[1608]: time="2025-07-15T05:18:00.888405674Z" level=info msg="connecting to shim 818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25" address="unix:///run/containerd/s/e15d57789337a53e915876b325a5242894d11bc26ee178d60f1e4aaa4149189c" protocol=ttrpc version=3 Jul 15 05:18:00.939583 systemd[1]: Started cri-containerd-818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25.scope - libcontainer container 818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25. Jul 15 05:18:01.006617 containerd[1608]: time="2025-07-15T05:18:01.006571930Z" level=info msg="StartContainer for \"818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25\" returns successfully" Jul 15 05:18:01.031464 systemd[1]: cri-containerd-818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25.scope: Deactivated successfully. 
Jul 15 05:18:01.062163 containerd[1608]: time="2025-07-15T05:18:01.062106302Z" level=info msg="TaskExit event in podsandbox handler container_id:\"818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25\" id:\"818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25\" pid:3470 exited_at:{seconds:1752556681 nanos:38846507}"
Jul 15 05:18:01.062386 containerd[1608]: time="2025-07-15T05:18:01.062163997Z" level=info msg="received exit event container_id:\"818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25\" id:\"818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25\" pid:3470 exited_at:{seconds:1752556681 nanos:38846507}"
Jul 15 05:18:01.099641 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-818e043b03abe4be36734cda752b3ea2965c3c565a09979ae3c052028aa9fe25-rootfs.mount: Deactivated successfully.
Jul 15 05:18:01.788645 containerd[1608]: time="2025-07-15T05:18:01.788551061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\""
Jul 15 05:18:01.804777 kubelet[2752]: I0715 05:18:01.804695 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-74948b5f57-cdwfv" podStartSLOduration=3.644786872 podStartE2EDuration="5.804679904s" podCreationTimestamp="2025-07-15 05:17:56 +0000 UTC" firstStartedPulling="2025-07-15 05:17:56.99214835 +0000 UTC m=+19.416147718" lastFinishedPulling="2025-07-15 05:17:59.152041382 +0000 UTC m=+21.576040750" observedRunningTime="2025-07-15 05:17:59.797889464 +0000 UTC m=+22.221888832" watchObservedRunningTime="2025-07-15 05:18:01.804679904 +0000 UTC m=+24.228679272"
Jul 15 05:18:02.666505 kubelet[2752]: E0715 05:18:02.666426 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kn29v" podUID="904a4f13-bfb8-412a-a126-6662c25983b9"
Jul 15 05:18:04.430486 containerd[1608]: time="2025-07-15T05:18:04.430394378Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:04.431741 containerd[1608]: time="2025-07-15T05:18:04.431700250Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221"
Jul 15 05:18:04.432860 containerd[1608]: time="2025-07-15T05:18:04.432803861Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:04.435387 containerd[1608]: time="2025-07-15T05:18:04.435329864Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:04.436271 containerd[1608]: time="2025-07-15T05:18:04.436162033Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 2.646705203s"
Jul 15 05:18:04.436271 containerd[1608]: time="2025-07-15T05:18:04.436202332Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\""
Jul 15 05:18:04.439508 containerd[1608]: time="2025-07-15T05:18:04.439424173Z" level=info msg="CreateContainer within sandbox \"3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jul 15 05:18:04.457066 containerd[1608]: time="2025-07-15T05:18:04.455564357Z" level=info msg="Container bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:18:04.463940 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1942959157.mount: Deactivated successfully.
Jul 15 05:18:04.474979 containerd[1608]: time="2025-07-15T05:18:04.474921272Z" level=info msg="CreateContainer within sandbox \"3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b\""
Jul 15 05:18:04.477078 containerd[1608]: time="2025-07-15T05:18:04.477034513Z" level=info msg="StartContainer for \"bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b\""
Jul 15 05:18:04.479107 containerd[1608]: time="2025-07-15T05:18:04.479066942Z" level=info msg="connecting to shim bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b" address="unix:///run/containerd/s/e15d57789337a53e915876b325a5242894d11bc26ee178d60f1e4aaa4149189c" protocol=ttrpc version=3
Jul 15 05:18:04.510470 systemd[1]: Started cri-containerd-bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b.scope - libcontainer container bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b.
Jul 15 05:18:04.551638 containerd[1608]: time="2025-07-15T05:18:04.551590337Z" level=info msg="StartContainer for \"bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b\" returns successfully"
Jul 15 05:18:04.666714 kubelet[2752]: E0715 05:18:04.666664 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kn29v" podUID="904a4f13-bfb8-412a-a126-6662c25983b9"
Jul 15 05:18:04.976645 systemd[1]: cri-containerd-bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b.scope: Deactivated successfully.
Jul 15 05:18:04.976920 systemd[1]: cri-containerd-bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b.scope: Consumed 397ms CPU time, 167M memory peak, 8.8M read from disk, 171.2M written to disk.
Jul 15 05:18:04.992512 containerd[1608]: time="2025-07-15T05:18:04.991769733Z" level=info msg="received exit event container_id:\"bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b\" id:\"bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b\" pid:3529 exited_at:{seconds:1752556684 nanos:991458282}"
Jul 15 05:18:04.994416 containerd[1608]: time="2025-07-15T05:18:04.993557296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b\" id:\"bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b\" pid:3529 exited_at:{seconds:1752556684 nanos:991458282}"
Jul 15 05:18:05.031011 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bc1f93139e8dbdd1b20fabfd287961187985037213c8404a2594141fed0abe9b-rootfs.mount: Deactivated successfully.
Jul 15 05:18:05.065544 kubelet[2752]: I0715 05:18:05.065516 2752 kubelet_node_status.go:488] "Fast updating node status as it just became ready"
Jul 15 05:18:05.106911 kubelet[2752]: W0715 05:18:05.105877 2752 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4396-0-0-n-153ccb2e88" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4396-0-0-n-153ccb2e88' and this object
Jul 15 05:18:05.106911 kubelet[2752]: E0715 05:18:05.105908 2752 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4396-0-0-n-153ccb2e88\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4396-0-0-n-153ccb2e88' and this object" logger="UnhandledError"
Jul 15 05:18:05.109554 systemd[1]: Created slice kubepods-burstable-pod823c6c6f_6a29_45b4_966c_d6647f656a61.slice - libcontainer container kubepods-burstable-pod823c6c6f_6a29_45b4_966c_d6647f656a61.slice.
Jul 15 05:18:05.124047 systemd[1]: Created slice kubepods-burstable-pod28d073b9_6796_49c5_aff0_a2c431b5cd15.slice - libcontainer container kubepods-burstable-pod28d073b9_6796_49c5_aff0_a2c431b5cd15.slice.
Jul 15 05:18:05.132820 systemd[1]: Created slice kubepods-besteffort-podd418068e_2d12_4fbc_8f63_99524e4b1520.slice - libcontainer container kubepods-besteffort-podd418068e_2d12_4fbc_8f63_99524e4b1520.slice.
Jul 15 05:18:05.139184 systemd[1]: Created slice kubepods-besteffort-pode527a52e_9424_4162_bc99_6dc8fba65534.slice - libcontainer container kubepods-besteffort-pode527a52e_9424_4162_bc99_6dc8fba65534.slice.
Jul 15 05:18:05.147271 systemd[1]: Created slice kubepods-besteffort-pod8b951d9e_0c0e_4876_b472_9d67da122d9d.slice - libcontainer container kubepods-besteffort-pod8b951d9e_0c0e_4876_b472_9d67da122d9d.slice.
Jul 15 05:18:05.158972 systemd[1]: Created slice kubepods-besteffort-pod4b41a19a_f623_4f8d_9193_0b05cb36d5c4.slice - libcontainer container kubepods-besteffort-pod4b41a19a_f623_4f8d_9193_0b05cb36d5c4.slice.
Jul 15 05:18:05.166822 systemd[1]: Created slice kubepods-besteffort-poddc60a327_6a37_45b6_9859_bf18d43b6044.slice - libcontainer container kubepods-besteffort-poddc60a327_6a37_45b6_9859_bf18d43b6044.slice.
Jul 15 05:18:05.174283 systemd[1]: Created slice kubepods-besteffort-pod2fabffa1_a566_4c6e_b25c_23763cd522c0.slice - libcontainer container kubepods-besteffort-pod2fabffa1_a566_4c6e_b25c_23763cd522c0.slice.
Jul 15 05:18:05.175970 kubelet[2752]: I0715 05:18:05.175529 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d418068e-2d12-4fbc-8f63-99524e4b1520-calico-apiserver-certs\") pod \"calico-apiserver-59d64f457d-lv6nr\" (UID: \"d418068e-2d12-4fbc-8f63-99524e4b1520\") " pod="calico-apiserver/calico-apiserver-59d64f457d-lv6nr"
Jul 15 05:18:05.175970 kubelet[2752]: I0715 05:18:05.175563 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-ca-bundle\") pod \"whisker-7bcb96c5cf-b6djj\" (UID: \"2fabffa1-a566-4c6e-b25c-23763cd522c0\") " pod="calico-system/whisker-7bcb96c5cf-b6djj"
Jul 15 05:18:05.175970 kubelet[2752]: I0715 05:18:05.175579 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkknr\" (UniqueName: \"kubernetes.io/projected/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-kube-api-access-gkknr\") pod \"calico-apiserver-59d64f457d-4h9wp\" (UID: \"4b41a19a-f623-4f8d-9193-0b05cb36d5c4\") " pod="calico-apiserver/calico-apiserver-59d64f457d-4h9wp"
Jul 15 05:18:05.175970 kubelet[2752]: I0715 05:18:05.175594 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtn5f\" (UniqueName: \"kubernetes.io/projected/dc60a327-6a37-45b6-9859-bf18d43b6044-kube-api-access-mtn5f\") pod \"calico-apiserver-6c6968cd6-klc5z\" (UID: \"dc60a327-6a37-45b6-9859-bf18d43b6044\") " pod="calico-apiserver/calico-apiserver-6c6968cd6-klc5z"
Jul 15 05:18:05.175970 kubelet[2752]: I0715 05:18:05.175609 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np7jf\" (UniqueName: \"kubernetes.io/projected/d418068e-2d12-4fbc-8f63-99524e4b1520-kube-api-access-np7jf\") pod \"calico-apiserver-59d64f457d-lv6nr\" (UID: \"d418068e-2d12-4fbc-8f63-99524e4b1520\") " pod="calico-apiserver/calico-apiserver-59d64f457d-lv6nr"
Jul 15 05:18:05.176205 kubelet[2752]: I0715 05:18:05.175621 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49kjf\" (UniqueName: \"kubernetes.io/projected/8b951d9e-0c0e-4876-b472-9d67da122d9d-kube-api-access-49kjf\") pod \"goldmane-58fd7646b9-bl72v\" (UID: \"8b951d9e-0c0e-4876-b472-9d67da122d9d\") " pod="calico-system/goldmane-58fd7646b9-bl72v"
Jul 15 05:18:05.176205 kubelet[2752]: I0715 05:18:05.175634 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-backend-key-pair\") pod \"whisker-7bcb96c5cf-b6djj\" (UID: \"2fabffa1-a566-4c6e-b25c-23763cd522c0\") " pod="calico-system/whisker-7bcb96c5cf-b6djj"
Jul 15 05:18:05.176205 kubelet[2752]: I0715 05:18:05.175646 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\"
(UniqueName: \"kubernetes.io/secret/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-calico-apiserver-certs\") pod \"calico-apiserver-59d64f457d-4h9wp\" (UID: \"4b41a19a-f623-4f8d-9193-0b05cb36d5c4\") " pod="calico-apiserver/calico-apiserver-59d64f457d-4h9wp"
Jul 15 05:18:05.176205 kubelet[2752]: I0715 05:18:05.175659 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/823c6c6f-6a29-45b4-966c-d6647f656a61-config-volume\") pod \"coredns-7c65d6cfc9-xhrjr\" (UID: \"823c6c6f-6a29-45b4-966c-d6647f656a61\") " pod="kube-system/coredns-7c65d6cfc9-xhrjr"
Jul 15 05:18:05.176205 kubelet[2752]: I0715 05:18:05.175672 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b9n7\" (UniqueName: \"kubernetes.io/projected/823c6c6f-6a29-45b4-966c-d6647f656a61-kube-api-access-7b9n7\") pod \"coredns-7c65d6cfc9-xhrjr\" (UID: \"823c6c6f-6a29-45b4-966c-d6647f656a61\") " pod="kube-system/coredns-7c65d6cfc9-xhrjr"
Jul 15 05:18:05.177758 kubelet[2752]: I0715 05:18:05.175687 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/dc60a327-6a37-45b6-9859-bf18d43b6044-calico-apiserver-certs\") pod \"calico-apiserver-6c6968cd6-klc5z\" (UID: \"dc60a327-6a37-45b6-9859-bf18d43b6044\") " pod="calico-apiserver/calico-apiserver-6c6968cd6-klc5z"
Jul 15 05:18:05.177758 kubelet[2752]: I0715 05:18:05.175701 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e527a52e-9424-4162-bc99-6dc8fba65534-tigera-ca-bundle\") pod \"calico-kube-controllers-fd76f846d-dzp4s\" (UID: \"e527a52e-9424-4162-bc99-6dc8fba65534\") " pod="calico-system/calico-kube-controllers-fd76f846d-dzp4s"
Jul 15 05:18:05.177758 kubelet[2752]: I0715 05:18:05.175716 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t285g\" (UniqueName: \"kubernetes.io/projected/28d073b9-6796-49c5-aff0-a2c431b5cd15-kube-api-access-t285g\") pod \"coredns-7c65d6cfc9-cpxtb\" (UID: \"28d073b9-6796-49c5-aff0-a2c431b5cd15\") " pod="kube-system/coredns-7c65d6cfc9-cpxtb"
Jul 15 05:18:05.177758 kubelet[2752]: I0715 05:18:05.175728 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/28d073b9-6796-49c5-aff0-a2c431b5cd15-config-volume\") pod \"coredns-7c65d6cfc9-cpxtb\" (UID: \"28d073b9-6796-49c5-aff0-a2c431b5cd15\") " pod="kube-system/coredns-7c65d6cfc9-cpxtb"
Jul 15 05:18:05.177758 kubelet[2752]: I0715 05:18:05.175739 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msj8j\" (UniqueName: \"kubernetes.io/projected/2fabffa1-a566-4c6e-b25c-23763cd522c0-kube-api-access-msj8j\") pod \"whisker-7bcb96c5cf-b6djj\" (UID: \"2fabffa1-a566-4c6e-b25c-23763cd522c0\") " pod="calico-system/whisker-7bcb96c5cf-b6djj"
Jul 15 05:18:05.178175 kubelet[2752]: I0715 05:18:05.175751 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8b951d9e-0c0e-4876-b472-9d67da122d9d-goldmane-key-pair\") pod \"goldmane-58fd7646b9-bl72v\" (UID: \"8b951d9e-0c0e-4876-b472-9d67da122d9d\") " pod="calico-system/goldmane-58fd7646b9-bl72v"
Jul 15 05:18:05.178175 kubelet[2752]: I0715 05:18:05.175765 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b951d9e-0c0e-4876-b472-9d67da122d9d-config\") pod \"goldmane-58fd7646b9-bl72v\" (UID: \"8b951d9e-0c0e-4876-b472-9d67da122d9d\") " pod="calico-system/goldmane-58fd7646b9-bl72v"
Jul 15 05:18:05.178175 kubelet[2752]: I0715
05:18:05.175778 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b951d9e-0c0e-4876-b472-9d67da122d9d-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-bl72v\" (UID: \"8b951d9e-0c0e-4876-b472-9d67da122d9d\") " pod="calico-system/goldmane-58fd7646b9-bl72v"
Jul 15 05:18:05.178175 kubelet[2752]: I0715 05:18:05.175791 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp5mx\" (UniqueName: \"kubernetes.io/projected/e527a52e-9424-4162-bc99-6dc8fba65534-kube-api-access-pp5mx\") pod \"calico-kube-controllers-fd76f846d-dzp4s\" (UID: \"e527a52e-9424-4162-bc99-6dc8fba65534\") " pod="calico-system/calico-kube-controllers-fd76f846d-dzp4s"
Jul 15 05:18:05.422348 containerd[1608]: time="2025-07-15T05:18:05.421796577Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhrjr,Uid:823c6c6f-6a29-45b4-966c-d6647f656a61,Namespace:kube-system,Attempt:0,}"
Jul 15 05:18:05.428878 containerd[1608]: time="2025-07-15T05:18:05.428542092Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cpxtb,Uid:28d073b9-6796-49c5-aff0-a2c431b5cd15,Namespace:kube-system,Attempt:0,}"
Jul 15 05:18:05.458183 containerd[1608]: time="2025-07-15T05:18:05.458114680Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-bl72v,Uid:8b951d9e-0c0e-4876-b472-9d67da122d9d,Namespace:calico-system,Attempt:0,}"
Jul 15 05:18:05.467298 containerd[1608]: time="2025-07-15T05:18:05.466698430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fd76f846d-dzp4s,Uid:e527a52e-9424-4162-bc99-6dc8fba65534,Namespace:calico-system,Attempt:0,}"
Jul 15 05:18:05.483468 containerd[1608]: time="2025-07-15T05:18:05.483415554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bcb96c5cf-b6djj,Uid:2fabffa1-a566-4c6e-b25c-23763cd522c0,Namespace:calico-system,Attempt:0,}"
Jul 15 05:18:05.682874 containerd[1608]: time="2025-07-15T05:18:05.681090784Z" level=error msg="Failed to destroy network for sandbox \"ad4df78608304176a9c99f96c61c4b16eed1330b44b5f626fa97739e6cc62b13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.684180 systemd[1]: run-netns-cni\x2d79a3d33a\x2d302a\x2da55b\x2d4316\x2d13a0a18c10a9.mount: Deactivated successfully.
Jul 15 05:18:05.684586 containerd[1608]: time="2025-07-15T05:18:05.684514022Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-bl72v,Uid:8b951d9e-0c0e-4876-b472-9d67da122d9d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad4df78608304176a9c99f96c61c4b16eed1330b44b5f626fa97739e6cc62b13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.687304 kubelet[2752]: E0715 05:18:05.687267 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad4df78608304176a9c99f96c61c4b16eed1330b44b5f626fa97739e6cc62b13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.687600 kubelet[2752]: E0715 05:18:05.687331 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad4df78608304176a9c99f96c61c4b16eed1330b44b5f626fa97739e6cc62b13\": plugin type=\"calico\" failed (add): stat
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-bl72v"
Jul 15 05:18:05.687600 kubelet[2752]: E0715 05:18:05.687349 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad4df78608304176a9c99f96c61c4b16eed1330b44b5f626fa97739e6cc62b13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-bl72v"
Jul 15 05:18:05.687600 kubelet[2752]: E0715 05:18:05.687389 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-bl72v_calico-system(8b951d9e-0c0e-4876-b472-9d67da122d9d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-bl72v_calico-system(8b951d9e-0c0e-4876-b472-9d67da122d9d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad4df78608304176a9c99f96c61c4b16eed1330b44b5f626fa97739e6cc62b13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-bl72v" podUID="8b951d9e-0c0e-4876-b472-9d67da122d9d"
Jul 15 05:18:05.700416 containerd[1608]: time="2025-07-15T05:18:05.700331166Z" level=error msg="Failed to destroy network for sandbox \"5cd2b6fa6f2ec9fd772fa3044f4adb7c3d652185c54c726bebc7afc56f618b46\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.703256 containerd[1608]: time="2025-07-15T05:18:05.702712391Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhrjr,Uid:823c6c6f-6a29-45b4-966c-d6647f656a61,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd2b6fa6f2ec9fd772fa3044f4adb7c3d652185c54c726bebc7afc56f618b46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.703340 kubelet[2752]: E0715 05:18:05.702925 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd2b6fa6f2ec9fd772fa3044f4adb7c3d652185c54c726bebc7afc56f618b46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.703340 kubelet[2752]: E0715 05:18:05.703042 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd2b6fa6f2ec9fd772fa3044f4adb7c3d652185c54c726bebc7afc56f618b46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xhrjr"
Jul 15 05:18:05.703340 kubelet[2752]: E0715 05:18:05.703066 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5cd2b6fa6f2ec9fd772fa3044f4adb7c3d652185c54c726bebc7afc56f618b46\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xhrjr"
Jul 15 05:18:05.703414 kubelet[2752]: E0715 05:18:05.703108 2752 pod_workers.go:1301] "Error syncing pod,
skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-xhrjr_kube-system(823c6c6f-6a29-45b4-966c-d6647f656a61)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-xhrjr_kube-system(823c6c6f-6a29-45b4-966c-d6647f656a61)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5cd2b6fa6f2ec9fd772fa3044f4adb7c3d652185c54c726bebc7afc56f618b46\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-xhrjr" podUID="823c6c6f-6a29-45b4-966c-d6647f656a61"
Jul 15 05:18:05.706554 containerd[1608]: time="2025-07-15T05:18:05.706517519Z" level=error msg="Failed to destroy network for sandbox \"81e8b1135eeb4445d36cae424d25fea8f64377a968a5df0d0b619f4ef51dde66\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.706744 containerd[1608]: time="2025-07-15T05:18:05.706718328Z" level=error msg="Failed to destroy network for sandbox \"fb32113bf61ad9148516926fd79ee783fc6b346d4bacd170f09507d16be5af50\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.708436 containerd[1608]: time="2025-07-15T05:18:05.708358208Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7bcb96c5cf-b6djj,Uid:2fabffa1-a566-4c6e-b25c-23763cd522c0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"81e8b1135eeb4445d36cae424d25fea8f64377a968a5df0d0b619f4ef51dde66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.708881 kubelet[2752]: E0715 05:18:05.708639 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81e8b1135eeb4445d36cae424d25fea8f64377a968a5df0d0b619f4ef51dde66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.708881 kubelet[2752]: E0715 05:18:05.708677 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81e8b1135eeb4445d36cae424d25fea8f64377a968a5df0d0b619f4ef51dde66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bcb96c5cf-b6djj"
Jul 15 05:18:05.708881 kubelet[2752]: E0715 05:18:05.708691 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81e8b1135eeb4445d36cae424d25fea8f64377a968a5df0d0b619f4ef51dde66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7bcb96c5cf-b6djj"
Jul 15 05:18:05.708985 kubelet[2752]: E0715 05:18:05.708726 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7bcb96c5cf-b6djj_calico-system(2fabffa1-a566-4c6e-b25c-23763cd522c0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7bcb96c5cf-b6djj_calico-system(2fabffa1-a566-4c6e-b25c-23763cd522c0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox
\\\"81e8b1135eeb4445d36cae424d25fea8f64377a968a5df0d0b619f4ef51dde66\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7bcb96c5cf-b6djj" podUID="2fabffa1-a566-4c6e-b25c-23763cd522c0"
Jul 15 05:18:05.709734 containerd[1608]: time="2025-07-15T05:18:05.709155935Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cpxtb,Uid:28d073b9-6796-49c5-aff0-a2c431b5cd15,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb32113bf61ad9148516926fd79ee783fc6b346d4bacd170f09507d16be5af50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.710027 kubelet[2752]: E0715 05:18:05.709937 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb32113bf61ad9148516926fd79ee783fc6b346d4bacd170f09507d16be5af50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.710168 kubelet[2752]: E0715 05:18:05.710105 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb32113bf61ad9148516926fd79ee783fc6b346d4bacd170f09507d16be5af50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cpxtb"
Jul 15 05:18:05.710817 kubelet[2752]: E0715 05:18:05.710726 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb32113bf61ad9148516926fd79ee783fc6b346d4bacd170f09507d16be5af50\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cpxtb"
Jul 15 05:18:05.710817 kubelet[2752]: E0715 05:18:05.710767 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-cpxtb_kube-system(28d073b9-6796-49c5-aff0-a2c431b5cd15)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-cpxtb_kube-system(28d073b9-6796-49c5-aff0-a2c431b5cd15)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fb32113bf61ad9148516926fd79ee783fc6b346d4bacd170f09507d16be5af50\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-cpxtb" podUID="28d073b9-6796-49c5-aff0-a2c431b5cd15"
Jul 15 05:18:05.714328 containerd[1608]: time="2025-07-15T05:18:05.714287680Z" level=error msg="Failed to destroy network for sandbox \"3f06a51b2299a9a7e1f546bdad5e08a2396160bc71fddbe2ee2d250f48df1023\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 05:18:05.715525 containerd[1608]: time="2025-07-15T05:18:05.715448629Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fd76f846d-dzp4s,Uid:e527a52e-9424-4162-bc99-6dc8fba65534,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f06a51b2299a9a7e1f546bdad5e08a2396160bc71fddbe2ee2d250f48df1023\": plugin type=\"calico\" failed (add):
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:05.715647 kubelet[2752]: E0715 05:18:05.715591 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f06a51b2299a9a7e1f546bdad5e08a2396160bc71fddbe2ee2d250f48df1023\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:05.715647 kubelet[2752]: E0715 05:18:05.715616 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f06a51b2299a9a7e1f546bdad5e08a2396160bc71fddbe2ee2d250f48df1023\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-fd76f846d-dzp4s" Jul 15 05:18:05.715647 kubelet[2752]: E0715 05:18:05.715628 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f06a51b2299a9a7e1f546bdad5e08a2396160bc71fddbe2ee2d250f48df1023\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-fd76f846d-dzp4s" Jul 15 05:18:05.715746 kubelet[2752]: E0715 05:18:05.715706 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-fd76f846d-dzp4s_calico-system(e527a52e-9424-4162-bc99-6dc8fba65534)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-fd76f846d-dzp4s_calico-system(e527a52e-9424-4162-bc99-6dc8fba65534)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f06a51b2299a9a7e1f546bdad5e08a2396160bc71fddbe2ee2d250f48df1023\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-fd76f846d-dzp4s" podUID="e527a52e-9424-4162-bc99-6dc8fba65534" Jul 15 05:18:05.839007 containerd[1608]: time="2025-07-15T05:18:05.838944645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 15 05:18:06.304831 kubelet[2752]: E0715 05:18:06.304762 2752 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.304831 kubelet[2752]: E0715 05:18:06.304810 2752 projected.go:194] Error preparing data for projected volume kube-api-access-np7jf for pod calico-apiserver/calico-apiserver-59d64f457d-lv6nr: failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.305010 kubelet[2752]: E0715 05:18:06.304886 2752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d418068e-2d12-4fbc-8f63-99524e4b1520-kube-api-access-np7jf podName:d418068e-2d12-4fbc-8f63-99524e4b1520 nodeName:}" failed. No retries permitted until 2025-07-15 05:18:06.804864116 +0000 UTC m=+29.228863484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-np7jf" (UniqueName: "kubernetes.io/projected/d418068e-2d12-4fbc-8f63-99524e4b1520-kube-api-access-np7jf") pod "calico-apiserver-59d64f457d-lv6nr" (UID: "d418068e-2d12-4fbc-8f63-99524e4b1520") : failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.305010 kubelet[2752]: E0715 05:18:06.304907 2752 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.305010 kubelet[2752]: E0715 05:18:06.304916 2752 projected.go:194] Error preparing data for projected volume kube-api-access-gkknr for pod calico-apiserver/calico-apiserver-59d64f457d-4h9wp: failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.305010 kubelet[2752]: E0715 05:18:06.304939 2752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-kube-api-access-gkknr podName:4b41a19a-f623-4f8d-9193-0b05cb36d5c4 nodeName:}" failed. No retries permitted until 2025-07-15 05:18:06.804931489 +0000 UTC m=+29.228930867 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-gkknr" (UniqueName: "kubernetes.io/projected/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-kube-api-access-gkknr") pod "calico-apiserver-59d64f457d-4h9wp" (UID: "4b41a19a-f623-4f8d-9193-0b05cb36d5c4") : failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.308277 kubelet[2752]: E0715 05:18:06.308171 2752 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.308277 kubelet[2752]: E0715 05:18:06.308198 2752 projected.go:194] Error preparing data for projected volume kube-api-access-mtn5f for pod calico-apiserver/calico-apiserver-6c6968cd6-klc5z: failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.308277 kubelet[2752]: E0715 05:18:06.308258 2752 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dc60a327-6a37-45b6-9859-bf18d43b6044-kube-api-access-mtn5f podName:dc60a327-6a37-45b6-9859-bf18d43b6044 nodeName:}" failed. No retries permitted until 2025-07-15 05:18:06.808223213 +0000 UTC m=+29.232222581 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mtn5f" (UniqueName: "kubernetes.io/projected/dc60a327-6a37-45b6-9859-bf18d43b6044-kube-api-access-mtn5f") pod "calico-apiserver-6c6968cd6-klc5z" (UID: "dc60a327-6a37-45b6-9859-bf18d43b6044") : failed to sync configmap cache: timed out waiting for the condition Jul 15 05:18:06.454714 systemd[1]: run-netns-cni\x2dc1fe0a1c\x2df259\x2d2334\x2df635\x2d46853022ea4b.mount: Deactivated successfully. Jul 15 05:18:06.454831 systemd[1]: run-netns-cni\x2db3d1d754\x2d1a82\x2d5da1\x2d69cd\x2d8cad8707ce88.mount: Deactivated successfully. Jul 15 05:18:06.454934 systemd[1]: run-netns-cni\x2d01e6b80d\x2d118a\x2db7f5\x2d4aab\x2ddcef671ccd0e.mount: Deactivated successfully. 
Jul 15 05:18:06.455014 systemd[1]: run-netns-cni\x2d2bd04fd0\x2de00b\x2d4548\x2d3f8a\x2db0df9059795f.mount: Deactivated successfully. Jul 15 05:18:06.674639 systemd[1]: Created slice kubepods-besteffort-pod904a4f13_bfb8_412a_a126_6662c25983b9.slice - libcontainer container kubepods-besteffort-pod904a4f13_bfb8_412a_a126_6662c25983b9.slice. Jul 15 05:18:06.677832 containerd[1608]: time="2025-07-15T05:18:06.677791135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kn29v,Uid:904a4f13-bfb8-412a-a126-6662c25983b9,Namespace:calico-system,Attempt:0,}" Jul 15 05:18:06.729429 containerd[1608]: time="2025-07-15T05:18:06.729376767Z" level=error msg="Failed to destroy network for sandbox \"25f8ac33ce3cbe99774354eefc9602f2bdde9cde4466aca33dea5ab72348d098\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:06.731696 systemd[1]: run-netns-cni\x2d7094468c\x2ddafb\x2da4d9\x2dd277\x2d8ff7b33e855e.mount: Deactivated successfully. 
Jul 15 05:18:06.732343 containerd[1608]: time="2025-07-15T05:18:06.732305169Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kn29v,Uid:904a4f13-bfb8-412a-a126-6662c25983b9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f8ac33ce3cbe99774354eefc9602f2bdde9cde4466aca33dea5ab72348d098\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:06.732544 kubelet[2752]: E0715 05:18:06.732508 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f8ac33ce3cbe99774354eefc9602f2bdde9cde4466aca33dea5ab72348d098\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:06.732958 kubelet[2752]: E0715 05:18:06.732629 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f8ac33ce3cbe99774354eefc9602f2bdde9cde4466aca33dea5ab72348d098\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kn29v" Jul 15 05:18:06.732958 kubelet[2752]: E0715 05:18:06.732652 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"25f8ac33ce3cbe99774354eefc9602f2bdde9cde4466aca33dea5ab72348d098\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kn29v" 
Jul 15 05:18:06.732958 kubelet[2752]: E0715 05:18:06.732710 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kn29v_calico-system(904a4f13-bfb8-412a-a126-6662c25983b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kn29v_calico-system(904a4f13-bfb8-412a-a126-6662c25983b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"25f8ac33ce3cbe99774354eefc9602f2bdde9cde4466aca33dea5ab72348d098\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kn29v" podUID="904a4f13-bfb8-412a-a126-6662c25983b9" Jul 15 05:18:06.937457 containerd[1608]: time="2025-07-15T05:18:06.937322969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-lv6nr,Uid:d418068e-2d12-4fbc-8f63-99524e4b1520,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:18:06.963116 containerd[1608]: time="2025-07-15T05:18:06.963063316Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-4h9wp,Uid:4b41a19a-f623-4f8d-9193-0b05cb36d5c4,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:18:06.972419 containerd[1608]: time="2025-07-15T05:18:06.972125752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c6968cd6-klc5z,Uid:dc60a327-6a37-45b6-9859-bf18d43b6044,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:18:06.999553 containerd[1608]: time="2025-07-15T05:18:06.999512597Z" level=error msg="Failed to destroy network for sandbox \"a6fa3fae3292a54909ec78246a94b6907b51973d8d0df843b1a39fb58aef5f41\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.001442 containerd[1608]: 
time="2025-07-15T05:18:07.001368971Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-lv6nr,Uid:d418068e-2d12-4fbc-8f63-99524e4b1520,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6fa3fae3292a54909ec78246a94b6907b51973d8d0df843b1a39fb58aef5f41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.001926 kubelet[2752]: E0715 05:18:07.001852 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6fa3fae3292a54909ec78246a94b6907b51973d8d0df843b1a39fb58aef5f41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.001926 kubelet[2752]: E0715 05:18:07.001919 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6fa3fae3292a54909ec78246a94b6907b51973d8d0df843b1a39fb58aef5f41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59d64f457d-lv6nr" Jul 15 05:18:07.002047 kubelet[2752]: E0715 05:18:07.001937 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6fa3fae3292a54909ec78246a94b6907b51973d8d0df843b1a39fb58aef5f41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-59d64f457d-lv6nr" Jul 15 05:18:07.004565 kubelet[2752]: E0715 05:18:07.002645 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59d64f457d-lv6nr_calico-apiserver(d418068e-2d12-4fbc-8f63-99524e4b1520)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59d64f457d-lv6nr_calico-apiserver(d418068e-2d12-4fbc-8f63-99524e4b1520)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6fa3fae3292a54909ec78246a94b6907b51973d8d0df843b1a39fb58aef5f41\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59d64f457d-lv6nr" podUID="d418068e-2d12-4fbc-8f63-99524e4b1520" Jul 15 05:18:07.036539 containerd[1608]: time="2025-07-15T05:18:07.036483153Z" level=error msg="Failed to destroy network for sandbox \"09fa28004ef68cb94f27387eb5555bb02ff028180b595efa34ab6e2a6ad18592\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.037808 containerd[1608]: time="2025-07-15T05:18:07.037773971Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-4h9wp,Uid:4b41a19a-f623-4f8d-9193-0b05cb36d5c4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"09fa28004ef68cb94f27387eb5555bb02ff028180b595efa34ab6e2a6ad18592\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.038068 kubelet[2752]: E0715 05:18:07.038006 2752 log.go:32] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09fa28004ef68cb94f27387eb5555bb02ff028180b595efa34ab6e2a6ad18592\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.038170 kubelet[2752]: E0715 05:18:07.038066 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09fa28004ef68cb94f27387eb5555bb02ff028180b595efa34ab6e2a6ad18592\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59d64f457d-4h9wp" Jul 15 05:18:07.038170 kubelet[2752]: E0715 05:18:07.038089 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09fa28004ef68cb94f27387eb5555bb02ff028180b595efa34ab6e2a6ad18592\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59d64f457d-4h9wp" Jul 15 05:18:07.038170 kubelet[2752]: E0715 05:18:07.038139 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59d64f457d-4h9wp_calico-apiserver(4b41a19a-f623-4f8d-9193-0b05cb36d5c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59d64f457d-4h9wp_calico-apiserver(4b41a19a-f623-4f8d-9193-0b05cb36d5c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"09fa28004ef68cb94f27387eb5555bb02ff028180b595efa34ab6e2a6ad18592\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59d64f457d-4h9wp" podUID="4b41a19a-f623-4f8d-9193-0b05cb36d5c4" Jul 15 05:18:07.044938 containerd[1608]: time="2025-07-15T05:18:07.044903304Z" level=error msg="Failed to destroy network for sandbox \"2458b07985243242f7b1209057d6f2435231a422fede3925adecb33d9b0ba7f2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.045970 containerd[1608]: time="2025-07-15T05:18:07.045853276Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c6968cd6-klc5z,Uid:dc60a327-6a37-45b6-9859-bf18d43b6044,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2458b07985243242f7b1209057d6f2435231a422fede3925adecb33d9b0ba7f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.046472 kubelet[2752]: E0715 05:18:07.046017 2752 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2458b07985243242f7b1209057d6f2435231a422fede3925adecb33d9b0ba7f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:18:07.046472 kubelet[2752]: E0715 05:18:07.046059 2752 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2458b07985243242f7b1209057d6f2435231a422fede3925adecb33d9b0ba7f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c6968cd6-klc5z" Jul 15 05:18:07.046472 kubelet[2752]: E0715 05:18:07.046075 2752 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2458b07985243242f7b1209057d6f2435231a422fede3925adecb33d9b0ba7f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6c6968cd6-klc5z" Jul 15 05:18:07.046645 kubelet[2752]: E0715 05:18:07.046110 2752 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6c6968cd6-klc5z_calico-apiserver(dc60a327-6a37-45b6-9859-bf18d43b6044)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6c6968cd6-klc5z_calico-apiserver(dc60a327-6a37-45b6-9859-bf18d43b6044)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2458b07985243242f7b1209057d6f2435231a422fede3925adecb33d9b0ba7f2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6c6968cd6-klc5z" podUID="dc60a327-6a37-45b6-9859-bf18d43b6044" Jul 15 05:18:07.594388 kubelet[2752]: I0715 05:18:07.594348 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:10.353858 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1609294253.mount: Deactivated successfully. 
Jul 15 05:18:10.389823 containerd[1608]: time="2025-07-15T05:18:10.389769345Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:10.391666 containerd[1608]: time="2025-07-15T05:18:10.391625003Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 15 05:18:10.391863 containerd[1608]: time="2025-07-15T05:18:10.391770681Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:10.393448 containerd[1608]: time="2025-07-15T05:18:10.393420473Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:10.394493 containerd[1608]: time="2025-07-15T05:18:10.393872645Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 4.554888652s" Jul 15 05:18:10.394493 containerd[1608]: time="2025-07-15T05:18:10.393899408Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 15 05:18:10.414080 containerd[1608]: time="2025-07-15T05:18:10.414029440Z" level=info msg="CreateContainer within sandbox \"3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 15 05:18:10.433406 containerd[1608]: time="2025-07-15T05:18:10.433361782Z" level=info msg="Container 
1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:10.434480 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3129520642.mount: Deactivated successfully. Jul 15 05:18:10.467228 containerd[1608]: time="2025-07-15T05:18:10.467105291Z" level=info msg="CreateContainer within sandbox \"3cfdfc0948533b774b58de840c30f401db3feda269ca2daafa7f12e0d9099649\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\"" Jul 15 05:18:10.468010 containerd[1608]: time="2025-07-15T05:18:10.467938083Z" level=info msg="StartContainer for \"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\"" Jul 15 05:18:10.469750 containerd[1608]: time="2025-07-15T05:18:10.469719264Z" level=info msg="connecting to shim 1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93" address="unix:///run/containerd/s/e15d57789337a53e915876b325a5242894d11bc26ee178d60f1e4aaa4149189c" protocol=ttrpc version=3 Jul 15 05:18:10.586366 systemd[1]: Started cri-containerd-1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93.scope - libcontainer container 1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93. Jul 15 05:18:10.647584 containerd[1608]: time="2025-07-15T05:18:10.647418691Z" level=info msg="StartContainer for \"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\" returns successfully" Jul 15 05:18:10.743884 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 15 05:18:10.744936 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jul 15 05:18:10.919186 kubelet[2752]: I0715 05:18:10.919073 2752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-backend-key-pair\") pod \"2fabffa1-a566-4c6e-b25c-23763cd522c0\" (UID: \"2fabffa1-a566-4c6e-b25c-23763cd522c0\") " Jul 15 05:18:10.921281 kubelet[2752]: I0715 05:18:10.920852 2752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msj8j\" (UniqueName: \"kubernetes.io/projected/2fabffa1-a566-4c6e-b25c-23763cd522c0-kube-api-access-msj8j\") pod \"2fabffa1-a566-4c6e-b25c-23763cd522c0\" (UID: \"2fabffa1-a566-4c6e-b25c-23763cd522c0\") " Jul 15 05:18:10.921281 kubelet[2752]: I0715 05:18:10.920883 2752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-ca-bundle\") pod \"2fabffa1-a566-4c6e-b25c-23763cd522c0\" (UID: \"2fabffa1-a566-4c6e-b25c-23763cd522c0\") " Jul 15 05:18:10.925290 kubelet[2752]: I0715 05:18:10.925266 2752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "2fabffa1-a566-4c6e-b25c-23763cd522c0" (UID: "2fabffa1-a566-4c6e-b25c-23763cd522c0"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 05:18:10.927288 kubelet[2752]: I0715 05:18:10.925449 2752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fabffa1-a566-4c6e-b25c-23763cd522c0-kube-api-access-msj8j" (OuterVolumeSpecName: "kube-api-access-msj8j") pod "2fabffa1-a566-4c6e-b25c-23763cd522c0" (UID: "2fabffa1-a566-4c6e-b25c-23763cd522c0"). InnerVolumeSpecName "kube-api-access-msj8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 05:18:10.927446 kubelet[2752]: I0715 05:18:10.925713 2752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "2fabffa1-a566-4c6e-b25c-23763cd522c0" (UID: "2fabffa1-a566-4c6e-b25c-23763cd522c0"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 15 05:18:11.021662 kubelet[2752]: I0715 05:18:11.021592 2752 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-backend-key-pair\") on node \"ci-4396-0-0-n-153ccb2e88\" DevicePath \"\"" Jul 15 05:18:11.021855 kubelet[2752]: I0715 05:18:11.021820 2752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msj8j\" (UniqueName: \"kubernetes.io/projected/2fabffa1-a566-4c6e-b25c-23763cd522c0-kube-api-access-msj8j\") on node \"ci-4396-0-0-n-153ccb2e88\" DevicePath \"\"" Jul 15 05:18:11.021855 kubelet[2752]: I0715 05:18:11.021837 2752 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fabffa1-a566-4c6e-b25c-23763cd522c0-whisker-ca-bundle\") on node \"ci-4396-0-0-n-153ccb2e88\" DevicePath \"\"" Jul 15 05:18:11.354084 systemd[1]: var-lib-kubelet-pods-2fabffa1\x2da566\x2d4c6e\x2db25c\x2d23763cd522c0-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmsj8j.mount: Deactivated successfully. Jul 15 05:18:11.354492 systemd[1]: var-lib-kubelet-pods-2fabffa1\x2da566\x2d4c6e\x2db25c\x2d23763cd522c0-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jul 15 05:18:11.675548 systemd[1]: Removed slice kubepods-besteffort-pod2fabffa1_a566_4c6e_b25c_23763cd522c0.slice - libcontainer container kubepods-besteffort-pod2fabffa1_a566_4c6e_b25c_23763cd522c0.slice. Jul 15 05:18:11.882568 kubelet[2752]: I0715 05:18:11.882523 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:11.903879 kubelet[2752]: I0715 05:18:11.900719 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-vf58q" podStartSLOduration=2.879848478 podStartE2EDuration="15.900686454s" podCreationTimestamp="2025-07-15 05:17:56 +0000 UTC" firstStartedPulling="2025-07-15 05:17:57.37379065 +0000 UTC m=+19.797790018" lastFinishedPulling="2025-07-15 05:18:10.394628625 +0000 UTC m=+32.818627994" observedRunningTime="2025-07-15 05:18:10.930349185 +0000 UTC m=+33.354348573" watchObservedRunningTime="2025-07-15 05:18:11.900686454 +0000 UTC m=+34.324685853" Jul 15 05:18:11.962381 systemd[1]: Created slice kubepods-besteffort-pod971a6a9f_3aa6_4ac8_a881_237dda6b6bce.slice - libcontainer container kubepods-besteffort-pod971a6a9f_3aa6_4ac8_a881_237dda6b6bce.slice. 
Jul 15 05:18:12.030631 kubelet[2752]: I0715 05:18:12.030539 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/971a6a9f-3aa6-4ac8-a881-237dda6b6bce-whisker-ca-bundle\") pod \"whisker-bff68fcb9-7j99r\" (UID: \"971a6a9f-3aa6-4ac8-a881-237dda6b6bce\") " pod="calico-system/whisker-bff68fcb9-7j99r" Jul 15 05:18:12.030631 kubelet[2752]: I0715 05:18:12.030604 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs6gk\" (UniqueName: \"kubernetes.io/projected/971a6a9f-3aa6-4ac8-a881-237dda6b6bce-kube-api-access-cs6gk\") pod \"whisker-bff68fcb9-7j99r\" (UID: \"971a6a9f-3aa6-4ac8-a881-237dda6b6bce\") " pod="calico-system/whisker-bff68fcb9-7j99r" Jul 15 05:18:12.030631 kubelet[2752]: I0715 05:18:12.030624 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/971a6a9f-3aa6-4ac8-a881-237dda6b6bce-whisker-backend-key-pair\") pod \"whisker-bff68fcb9-7j99r\" (UID: \"971a6a9f-3aa6-4ac8-a881-237dda6b6bce\") " pod="calico-system/whisker-bff68fcb9-7j99r" Jul 15 05:18:12.272839 containerd[1608]: time="2025-07-15T05:18:12.272727280Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bff68fcb9-7j99r,Uid:971a6a9f-3aa6-4ac8-a881-237dda6b6bce,Namespace:calico-system,Attempt:0,}" Jul 15 05:18:12.766714 systemd-networkd[1464]: cali2b6d91124f7: Link UP Jul 15 05:18:12.768312 systemd-networkd[1464]: cali2b6d91124f7: Gained carrier Jul 15 05:18:12.788526 containerd[1608]: 2025-07-15 05:18:12.497 [INFO][3978] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:18:12.788526 containerd[1608]: 2025-07-15 05:18:12.534 [INFO][3978] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0 whisker-bff68fcb9- calico-system 971a6a9f-3aa6-4ac8-a881-237dda6b6bce 882 0 2025-07-15 05:18:11 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:bff68fcb9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 whisker-bff68fcb9-7j99r eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2b6d91124f7 [] [] }} ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-" Jul 15 05:18:12.788526 containerd[1608]: 2025-07-15 05:18:12.535 [INFO][3978] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" Jul 15 05:18:12.788526 containerd[1608]: 2025-07-15 05:18:12.705 [INFO][3998] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" HandleID="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Workload="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.708 [INFO][3998] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" HandleID="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Workload="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000333b10), Attrs:map[string]string{"namespace":"calico-system", 
"node":"ci-4396-0-0-n-153ccb2e88", "pod":"whisker-bff68fcb9-7j99r", "timestamp":"2025-07-15 05:18:12.705986333 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.708 [INFO][3998] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.709 [INFO][3998] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.710 [INFO][3998] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.720 [INFO][3998] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.729 [INFO][3998] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.734 [INFO][3998] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.736 [INFO][3998] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791053 containerd[1608]: 2025-07-15 05:18:12.738 [INFO][3998] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791363 containerd[1608]: 2025-07-15 05:18:12.738 [INFO][3998] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 
handle="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791363 containerd[1608]: 2025-07-15 05:18:12.743 [INFO][3998] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05 Jul 15 05:18:12.791363 containerd[1608]: 2025-07-15 05:18:12.746 [INFO][3998] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791363 containerd[1608]: 2025-07-15 05:18:12.750 [INFO][3998] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.43.1/26] block=192.168.43.0/26 handle="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791363 containerd[1608]: 2025-07-15 05:18:12.750 [INFO][3998] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.1/26] handle="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:12.791363 containerd[1608]: 2025-07-15 05:18:12.750 [INFO][3998] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:18:12.791363 containerd[1608]: 2025-07-15 05:18:12.750 [INFO][3998] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.1/26] IPv6=[] ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" HandleID="k8s-pod-network.d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Workload="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" Jul 15 05:18:12.791921 containerd[1608]: 2025-07-15 05:18:12.753 [INFO][3978] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0", GenerateName:"whisker-bff68fcb9-", Namespace:"calico-system", SelfLink:"", UID:"971a6a9f-3aa6-4ac8-a881-237dda6b6bce", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bff68fcb9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"whisker-bff68fcb9-7j99r", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.43.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali2b6d91124f7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:12.791921 containerd[1608]: 2025-07-15 05:18:12.753 [INFO][3978] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.1/32] ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" Jul 15 05:18:12.792403 containerd[1608]: 2025-07-15 05:18:12.753 [INFO][3978] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b6d91124f7 ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" Jul 15 05:18:12.792403 containerd[1608]: 2025-07-15 05:18:12.765 [INFO][3978] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" Jul 15 05:18:12.792449 containerd[1608]: 2025-07-15 05:18:12.769 [INFO][3978] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0", GenerateName:"whisker-bff68fcb9-", Namespace:"calico-system", SelfLink:"", UID:"971a6a9f-3aa6-4ac8-a881-237dda6b6bce", 
ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"bff68fcb9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05", Pod:"whisker-bff68fcb9-7j99r", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.43.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2b6d91124f7", MAC:"da:29:64:a3:9c:5d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:12.792495 containerd[1608]: 2025-07-15 05:18:12.780 [INFO][3978] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" Namespace="calico-system" Pod="whisker-bff68fcb9-7j99r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-whisker--bff68fcb9--7j99r-eth0" Jul 15 05:18:12.961893 systemd-networkd[1464]: vxlan.calico: Link UP Jul 15 05:18:12.961901 systemd-networkd[1464]: vxlan.calico: Gained carrier Jul 15 05:18:13.024717 containerd[1608]: time="2025-07-15T05:18:13.024289140Z" level=info msg="connecting to shim d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05" address="unix:///run/containerd/s/3c4bcf869f7ec41c6c3ee8fc4f9e43f83a3b97d262e2aa5bcf3dfbce10188cc9" namespace=k8s.io protocol=ttrpc version=3 Jul 15 
05:18:13.055425 systemd[1]: Started cri-containerd-d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05.scope - libcontainer container d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05. Jul 15 05:18:13.120896 containerd[1608]: time="2025-07-15T05:18:13.120864852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-bff68fcb9-7j99r,Uid:971a6a9f-3aa6-4ac8-a881-237dda6b6bce,Namespace:calico-system,Attempt:0,} returns sandbox id \"d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05\"" Jul 15 05:18:13.132117 containerd[1608]: time="2025-07-15T05:18:13.131927492Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 15 05:18:13.670714 kubelet[2752]: I0715 05:18:13.670656 2752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fabffa1-a566-4c6e-b25c-23763cd522c0" path="/var/lib/kubelet/pods/2fabffa1-a566-4c6e-b25c-23763cd522c0/volumes" Jul 15 05:18:14.249597 systemd-networkd[1464]: cali2b6d91124f7: Gained IPv6LL Jul 15 05:18:14.507388 systemd-networkd[1464]: vxlan.calico: Gained IPv6LL Jul 15 05:18:14.760129 containerd[1608]: time="2025-07-15T05:18:14.759529788Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:14.760638 containerd[1608]: time="2025-07-15T05:18:14.760602382Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 15 05:18:14.761349 containerd[1608]: time="2025-07-15T05:18:14.761293005Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:14.762866 containerd[1608]: time="2025-07-15T05:18:14.762829399Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:14.763826 containerd[1608]: time="2025-07-15T05:18:14.763325531Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.631375215s" Jul 15 05:18:14.763826 containerd[1608]: time="2025-07-15T05:18:14.763364338Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 15 05:18:14.765880 containerd[1608]: time="2025-07-15T05:18:14.765854861Z" level=info msg="CreateContainer within sandbox \"d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 15 05:18:14.771447 containerd[1608]: time="2025-07-15T05:18:14.771428993Z" level=info msg="Container 04b76ae3621803448e97c6e899a3d84a442d9b80a9ac51fe148c70c9e9e803a8: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:14.792232 containerd[1608]: time="2025-07-15T05:18:14.792187681Z" level=info msg="CreateContainer within sandbox \"d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"04b76ae3621803448e97c6e899a3d84a442d9b80a9ac51fe148c70c9e9e803a8\"" Jul 15 05:18:14.792646 containerd[1608]: time="2025-07-15T05:18:14.792626752Z" level=info msg="StartContainer for \"04b76ae3621803448e97c6e899a3d84a442d9b80a9ac51fe148c70c9e9e803a8\"" Jul 15 05:18:14.793813 containerd[1608]: time="2025-07-15T05:18:14.793794522Z" level=info msg="connecting to shim 04b76ae3621803448e97c6e899a3d84a442d9b80a9ac51fe148c70c9e9e803a8" 
address="unix:///run/containerd/s/3c4bcf869f7ec41c6c3ee8fc4f9e43f83a3b97d262e2aa5bcf3dfbce10188cc9" protocol=ttrpc version=3 Jul 15 05:18:14.813360 systemd[1]: Started cri-containerd-04b76ae3621803448e97c6e899a3d84a442d9b80a9ac51fe148c70c9e9e803a8.scope - libcontainer container 04b76ae3621803448e97c6e899a3d84a442d9b80a9ac51fe148c70c9e9e803a8. Jul 15 05:18:14.865103 containerd[1608]: time="2025-07-15T05:18:14.865046228Z" level=info msg="StartContainer for \"04b76ae3621803448e97c6e899a3d84a442d9b80a9ac51fe148c70c9e9e803a8\" returns successfully" Jul 15 05:18:14.867684 containerd[1608]: time="2025-07-15T05:18:14.867643702Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 15 05:18:16.668372 containerd[1608]: time="2025-07-15T05:18:16.667914767Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cpxtb,Uid:28d073b9-6796-49c5-aff0-a2c431b5cd15,Namespace:kube-system,Attempt:0,}" Jul 15 05:18:16.826294 systemd-networkd[1464]: calibaeef165ca8: Link UP Jul 15 05:18:16.828332 systemd-networkd[1464]: calibaeef165ca8: Gained carrier Jul 15 05:18:16.851720 containerd[1608]: 2025-07-15 05:18:16.721 [INFO][4197] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0 coredns-7c65d6cfc9- kube-system 28d073b9-6796-49c5-aff0-a2c431b5cd15 805 0 2025-07-15 05:17:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 coredns-7c65d6cfc9-cpxtb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibaeef165ca8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" 
WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-" Jul 15 05:18:16.851720 containerd[1608]: 2025-07-15 05:18:16.722 [INFO][4197] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" Jul 15 05:18:16.851720 containerd[1608]: 2025-07-15 05:18:16.767 [INFO][4210] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" HandleID="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Workload="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.767 [INFO][4210] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" HandleID="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Workload="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f6c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"coredns-7c65d6cfc9-cpxtb", "timestamp":"2025-07-15 05:18:16.76756954 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.767 [INFO][4210] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.767 [INFO][4210] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.767 [INFO][4210] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.778 [INFO][4210] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.784 [INFO][4210] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.790 [INFO][4210] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.792 [INFO][4210] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.851913 containerd[1608]: 2025-07-15 05:18:16.795 [INFO][4210] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.852956 containerd[1608]: 2025-07-15 05:18:16.795 [INFO][4210] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.852956 containerd[1608]: 2025-07-15 05:18:16.797 [INFO][4210] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7 Jul 15 05:18:16.852956 containerd[1608]: 2025-07-15 05:18:16.802 [INFO][4210] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.852956 containerd[1608]: 2025-07-15 05:18:16.813 [INFO][4210] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.43.2/26] block=192.168.43.0/26 handle="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.852956 containerd[1608]: 2025-07-15 05:18:16.813 [INFO][4210] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.2/26] handle="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:16.852956 containerd[1608]: 2025-07-15 05:18:16.813 [INFO][4210] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:16.852956 containerd[1608]: 2025-07-15 05:18:16.813 [INFO][4210] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.2/26] IPv6=[] ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" HandleID="k8s-pod-network.ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Workload="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" Jul 15 05:18:16.853102 containerd[1608]: 2025-07-15 05:18:16.821 [INFO][4197] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"28d073b9-6796-49c5-aff0-a2c431b5cd15", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"coredns-7c65d6cfc9-cpxtb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.43.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibaeef165ca8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:16.853102 containerd[1608]: 2025-07-15 05:18:16.821 [INFO][4197] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.2/32] ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" Jul 15 05:18:16.853102 containerd[1608]: 2025-07-15 05:18:16.821 [INFO][4197] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibaeef165ca8 ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" Jul 15 05:18:16.853102 containerd[1608]: 2025-07-15 05:18:16.828 [INFO][4197] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" Jul 15 05:18:16.853102 containerd[1608]: 2025-07-15 05:18:16.828 [INFO][4197] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"28d073b9-6796-49c5-aff0-a2c431b5cd15", ResourceVersion:"805", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7", Pod:"coredns-7c65d6cfc9-cpxtb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.43.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibaeef165ca8", 
MAC:"86:fc:ac:c2:34:9a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:16.853102 containerd[1608]: 2025-07-15 05:18:16.844 [INFO][4197] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cpxtb" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--cpxtb-eth0" Jul 15 05:18:16.898224 containerd[1608]: time="2025-07-15T05:18:16.898141881Z" level=info msg="connecting to shim ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7" address="unix:///run/containerd/s/213ea9711e11ab760fd0b5df58cf5996d9ff8fa514fc6f3bf450371dbc16e9df" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:16.926446 systemd[1]: Started cri-containerd-ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7.scope - libcontainer container ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7. Jul 15 05:18:16.982506 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3895861614.mount: Deactivated successfully. 
Jul 15 05:18:16.993700 containerd[1608]: time="2025-07-15T05:18:16.993609722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cpxtb,Uid:28d073b9-6796-49c5-aff0-a2c431b5cd15,Namespace:kube-system,Attempt:0,} returns sandbox id \"ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7\"" Jul 15 05:18:16.998112 containerd[1608]: time="2025-07-15T05:18:16.998083345Z" level=info msg="CreateContainer within sandbox \"ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:18:17.013725 containerd[1608]: time="2025-07-15T05:18:17.013691292Z" level=info msg="Container ff48582e825cca5f552d8bbce583eb66afd591f519084989cb7db3aac79a9403: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:17.014193 containerd[1608]: time="2025-07-15T05:18:17.014158917Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 15 05:18:17.014677 containerd[1608]: time="2025-07-15T05:18:17.014640007Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:17.025528 containerd[1608]: time="2025-07-15T05:18:17.025479456Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:17.027164 containerd[1608]: time="2025-07-15T05:18:17.027124753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:17.027799 containerd[1608]: time="2025-07-15T05:18:17.027765997Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id 
\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 2.160091162s" Jul 15 05:18:17.027799 containerd[1608]: time="2025-07-15T05:18:17.027794131Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 15 05:18:17.028693 containerd[1608]: time="2025-07-15T05:18:17.028661887Z" level=info msg="CreateContainer within sandbox \"ea59575730bcc0a70275de6903e2e6a1dfba5e15931e6921ab7a19f825da68a7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ff48582e825cca5f552d8bbce583eb66afd591f519084989cb7db3aac79a9403\"" Jul 15 05:18:17.030178 containerd[1608]: time="2025-07-15T05:18:17.030118565Z" level=info msg="StartContainer for \"ff48582e825cca5f552d8bbce583eb66afd591f519084989cb7db3aac79a9403\"" Jul 15 05:18:17.031700 containerd[1608]: time="2025-07-15T05:18:17.031674296Z" level=info msg="connecting to shim ff48582e825cca5f552d8bbce583eb66afd591f519084989cb7db3aac79a9403" address="unix:///run/containerd/s/213ea9711e11ab760fd0b5df58cf5996d9ff8fa514fc6f3bf450371dbc16e9df" protocol=ttrpc version=3 Jul 15 05:18:17.032761 containerd[1608]: time="2025-07-15T05:18:17.032684471Z" level=info msg="CreateContainer within sandbox \"d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 15 05:18:17.042068 containerd[1608]: time="2025-07-15T05:18:17.042033117Z" level=info msg="Container 1233ea69515d50da8172a7c550c7466152d1ba9383e99ec255ae2789bafcd255: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:17.050002 containerd[1608]: time="2025-07-15T05:18:17.049883763Z" level=info msg="CreateContainer within sandbox 
\"d747bce9c2663319c5b494b4ee317d79a38e2be96367a03fd6d59328166bee05\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1233ea69515d50da8172a7c550c7466152d1ba9383e99ec255ae2789bafcd255\"" Jul 15 05:18:17.051343 containerd[1608]: time="2025-07-15T05:18:17.051164357Z" level=info msg="StartContainer for \"1233ea69515d50da8172a7c550c7466152d1ba9383e99ec255ae2789bafcd255\"" Jul 15 05:18:17.053109 containerd[1608]: time="2025-07-15T05:18:17.053089600Z" level=info msg="connecting to shim 1233ea69515d50da8172a7c550c7466152d1ba9383e99ec255ae2789bafcd255" address="unix:///run/containerd/s/3c4bcf869f7ec41c6c3ee8fc4f9e43f83a3b97d262e2aa5bcf3dfbce10188cc9" protocol=ttrpc version=3 Jul 15 05:18:17.054484 systemd[1]: Started cri-containerd-ff48582e825cca5f552d8bbce583eb66afd591f519084989cb7db3aac79a9403.scope - libcontainer container ff48582e825cca5f552d8bbce583eb66afd591f519084989cb7db3aac79a9403. Jul 15 05:18:17.078366 systemd[1]: Started cri-containerd-1233ea69515d50da8172a7c550c7466152d1ba9383e99ec255ae2789bafcd255.scope - libcontainer container 1233ea69515d50da8172a7c550c7466152d1ba9383e99ec255ae2789bafcd255. 
Jul 15 05:18:17.094842 containerd[1608]: time="2025-07-15T05:18:17.094793897Z" level=info msg="StartContainer for \"ff48582e825cca5f552d8bbce583eb66afd591f519084989cb7db3aac79a9403\" returns successfully" Jul 15 05:18:17.150009 containerd[1608]: time="2025-07-15T05:18:17.149963145Z" level=info msg="StartContainer for \"1233ea69515d50da8172a7c550c7466152d1ba9383e99ec255ae2789bafcd255\" returns successfully" Jul 15 05:18:17.668460 containerd[1608]: time="2025-07-15T05:18:17.668149375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fd76f846d-dzp4s,Uid:e527a52e-9424-4162-bc99-6dc8fba65534,Namespace:calico-system,Attempt:0,}" Jul 15 05:18:17.790038 systemd-networkd[1464]: calid74867ea957: Link UP Jul 15 05:18:17.790216 systemd-networkd[1464]: calid74867ea957: Gained carrier Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.714 [INFO][4345] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0 calico-kube-controllers-fd76f846d- calico-system e527a52e-9424-4162-bc99-6dc8fba65534 803 0 2025-07-15 05:17:57 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:fd76f846d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 calico-kube-controllers-fd76f846d-dzp4s eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid74867ea957 [] [] }} ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.714 [INFO][4345] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.752 [INFO][4357] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" HandleID="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.752 [INFO][4357] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" HandleID="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5730), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"calico-kube-controllers-fd76f846d-dzp4s", "timestamp":"2025-07-15 05:18:17.752067592 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.752 [INFO][4357] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.752 [INFO][4357] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.752 [INFO][4357] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.759 [INFO][4357] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.767 [INFO][4357] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.771 [INFO][4357] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.773 [INFO][4357] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.775 [INFO][4357] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.775 [INFO][4357] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.776 [INFO][4357] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5 Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.779 [INFO][4357] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.783 [INFO][4357] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.43.3/26] block=192.168.43.0/26 handle="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.784 [INFO][4357] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.3/26] handle="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.784 [INFO][4357] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:17.806758 containerd[1608]: 2025-07-15 05:18:17.784 [INFO][4357] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.3/26] IPv6=[] ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" HandleID="k8s-pod-network.355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" Jul 15 05:18:17.807878 containerd[1608]: 2025-07-15 05:18:17.786 [INFO][4345] cni-plugin/k8s.go 418: Populated endpoint ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0", GenerateName:"calico-kube-controllers-fd76f846d-", Namespace:"calico-system", SelfLink:"", UID:"e527a52e-9424-4162-bc99-6dc8fba65534", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"fd76f846d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"calico-kube-controllers-fd76f846d-dzp4s", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.43.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid74867ea957", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:17.807878 containerd[1608]: 2025-07-15 05:18:17.787 [INFO][4345] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.3/32] ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" Jul 15 05:18:17.807878 containerd[1608]: 2025-07-15 05:18:17.787 [INFO][4345] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid74867ea957 ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" Jul 15 05:18:17.807878 containerd[1608]: 2025-07-15 05:18:17.789 [INFO][4345] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" 
Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" Jul 15 05:18:17.807878 containerd[1608]: 2025-07-15 05:18:17.790 [INFO][4345] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0", GenerateName:"calico-kube-controllers-fd76f846d-", Namespace:"calico-system", SelfLink:"", UID:"e527a52e-9424-4162-bc99-6dc8fba65534", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"fd76f846d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5", Pod:"calico-kube-controllers-fd76f846d-dzp4s", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.43.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid74867ea957", MAC:"16:60:c3:52:c9:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:17.807878 containerd[1608]: 2025-07-15 05:18:17.800 [INFO][4345] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" Namespace="calico-system" Pod="calico-kube-controllers-fd76f846d-dzp4s" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--kube--controllers--fd76f846d--dzp4s-eth0" Jul 15 05:18:17.833413 containerd[1608]: time="2025-07-15T05:18:17.833365468Z" level=info msg="connecting to shim 355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5" address="unix:///run/containerd/s/f67234b375d52e10742390fac92de7dbada35145fd3055c7f7768ae279dfa343" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:17.867433 systemd[1]: Started cri-containerd-355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5.scope - libcontainer container 355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5. 
Jul 15 05:18:17.898387 systemd-networkd[1464]: calibaeef165ca8: Gained IPv6LL Jul 15 05:18:17.935440 containerd[1608]: time="2025-07-15T05:18:17.935296819Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-fd76f846d-dzp4s,Uid:e527a52e-9424-4162-bc99-6dc8fba65534,Namespace:calico-system,Attempt:0,} returns sandbox id \"355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5\"" Jul 15 05:18:17.938570 containerd[1608]: time="2025-07-15T05:18:17.938543988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 15 05:18:17.964783 kubelet[2752]: I0715 05:18:17.964717 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-bff68fcb9-7j99r" podStartSLOduration=3.065616593 podStartE2EDuration="6.964676813s" podCreationTimestamp="2025-07-15 05:18:11 +0000 UTC" firstStartedPulling="2025-07-15 05:18:13.131020812 +0000 UTC m=+35.555020180" lastFinishedPulling="2025-07-15 05:18:17.030081032 +0000 UTC m=+39.454080400" observedRunningTime="2025-07-15 05:18:17.943806144 +0000 UTC m=+40.367805532" watchObservedRunningTime="2025-07-15 05:18:17.964676813 +0000 UTC m=+40.388676182" Jul 15 05:18:17.966637 kubelet[2752]: I0715 05:18:17.965191 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-cpxtb" podStartSLOduration=33.965181269 podStartE2EDuration="33.965181269s" podCreationTimestamp="2025-07-15 05:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:18:17.963096965 +0000 UTC m=+40.387096343" watchObservedRunningTime="2025-07-15 05:18:17.965181269 +0000 UTC m=+40.389180638" Jul 15 05:18:18.668059 containerd[1608]: time="2025-07-15T05:18:18.667706101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhrjr,Uid:823c6c6f-6a29-45b4-966c-d6647f656a61,Namespace:kube-system,Attempt:0,}" Jul 
15 05:18:18.668059 containerd[1608]: time="2025-07-15T05:18:18.667830824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-bl72v,Uid:8b951d9e-0c0e-4876-b472-9d67da122d9d,Namespace:calico-system,Attempt:0,}" Jul 15 05:18:18.668425 containerd[1608]: time="2025-07-15T05:18:18.668368365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c6968cd6-klc5z,Uid:dc60a327-6a37-45b6-9859-bf18d43b6044,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:18:18.832116 systemd-networkd[1464]: calia0bbc301614: Link UP Jul 15 05:18:18.833066 systemd-networkd[1464]: calia0bbc301614: Gained carrier Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.752 [INFO][4429] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0 goldmane-58fd7646b9- calico-system 8b951d9e-0c0e-4876-b472-9d67da122d9d 804 0 2025-07-15 05:17:56 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 goldmane-58fd7646b9-bl72v eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia0bbc301614 [] [] }} ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.752 [INFO][4429] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 
05:18:18.787 [INFO][4477] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" HandleID="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Workload="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.787 [INFO][4477] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" HandleID="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Workload="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ccff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"goldmane-58fd7646b9-bl72v", "timestamp":"2025-07-15 05:18:18.787422223 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.787 [INFO][4477] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.787 [INFO][4477] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.787 [INFO][4477] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.793 [INFO][4477] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.801 [INFO][4477] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.805 [INFO][4477] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.807 [INFO][4477] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.809 [INFO][4477] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.809 [INFO][4477] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.811 [INFO][4477] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1 Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.815 [INFO][4477] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.820 [INFO][4477] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.43.4/26] block=192.168.43.0/26 handle="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.820 [INFO][4477] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.4/26] handle="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.821 [INFO][4477] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:18.847576 containerd[1608]: 2025-07-15 05:18:18.821 [INFO][4477] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.4/26] IPv6=[] ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" HandleID="k8s-pod-network.8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Workload="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" Jul 15 05:18:18.848961 containerd[1608]: 2025-07-15 05:18:18.823 [INFO][4429] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"8b951d9e-0c0e-4876-b472-9d67da122d9d", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"goldmane-58fd7646b9-bl72v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.43.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia0bbc301614", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:18.848961 containerd[1608]: 2025-07-15 05:18:18.824 [INFO][4429] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.4/32] ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" Jul 15 05:18:18.848961 containerd[1608]: 2025-07-15 05:18:18.824 [INFO][4429] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia0bbc301614 ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" Jul 15 05:18:18.848961 containerd[1608]: 2025-07-15 05:18:18.833 [INFO][4429] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" Jul 15 05:18:18.848961 containerd[1608]: 2025-07-15 05:18:18.834 [INFO][4429] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"8b951d9e-0c0e-4876-b472-9d67da122d9d", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1", Pod:"goldmane-58fd7646b9-bl72v", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.43.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia0bbc301614", MAC:"d2:4c:30:87:a4:2f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:18.848961 containerd[1608]: 2025-07-15 05:18:18.843 [INFO][4429] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" Namespace="calico-system" Pod="goldmane-58fd7646b9-bl72v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-goldmane--58fd7646b9--bl72v-eth0" Jul 15 05:18:18.883820 containerd[1608]: time="2025-07-15T05:18:18.883780917Z" level=info msg="connecting to shim 8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1" address="unix:///run/containerd/s/cc2c8ec5dc4cf155e18df0f7929be56c6e32a44076de89cdbac49cc9c3a4e0f6" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:18.922507 systemd[1]: Started cri-containerd-8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1.scope - libcontainer container 8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1. Jul 15 05:18:18.946751 systemd-networkd[1464]: cali237de393942: Link UP Jul 15 05:18:18.948698 systemd-networkd[1464]: cali237de393942: Gained carrier Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.753 [INFO][4452] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0 calico-apiserver-6c6968cd6- calico-apiserver dc60a327-6a37-45b6-9859-bf18d43b6044 808 0 2025-07-15 05:17:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c6968cd6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 calico-apiserver-6c6968cd6-klc5z eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali237de393942 [] [] }} ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-" Jul 15 05:18:18.964546 
containerd[1608]: 2025-07-15 05:18:18.753 [INFO][4452] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.799 [INFO][4470] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" HandleID="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.799 [INFO][4470] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" HandleID="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5790), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"calico-apiserver-6c6968cd6-klc5z", "timestamp":"2025-07-15 05:18:18.799815438 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.800 [INFO][4470] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.821 [INFO][4470] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.821 [INFO][4470] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.895 [INFO][4470] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.901 [INFO][4470] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.905 [INFO][4470] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.907 [INFO][4470] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.909 [INFO][4470] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.909 [INFO][4470] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.911 [INFO][4470] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.917 [INFO][4470] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.926 [INFO][4470] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.43.5/26] block=192.168.43.0/26 handle="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.926 [INFO][4470] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.5/26] handle="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.926 [INFO][4470] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:18.964546 containerd[1608]: 2025-07-15 05:18:18.926 [INFO][4470] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.5/26] IPv6=[] ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" HandleID="k8s-pod-network.e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" Jul 15 05:18:18.966322 containerd[1608]: 2025-07-15 05:18:18.932 [INFO][4452] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0", GenerateName:"calico-apiserver-6c6968cd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"dc60a327-6a37-45b6-9859-bf18d43b6044", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6c6968cd6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"calico-apiserver-6c6968cd6-klc5z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali237de393942", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:18.966322 containerd[1608]: 2025-07-15 05:18:18.932 [INFO][4452] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.5/32] ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" Jul 15 05:18:18.966322 containerd[1608]: 2025-07-15 05:18:18.932 [INFO][4452] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali237de393942 ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" Jul 15 05:18:18.966322 containerd[1608]: 2025-07-15 05:18:18.947 [INFO][4452] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" 
WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" Jul 15 05:18:18.966322 containerd[1608]: 2025-07-15 05:18:18.948 [INFO][4452] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0", GenerateName:"calico-apiserver-6c6968cd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"dc60a327-6a37-45b6-9859-bf18d43b6044", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c6968cd6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d", Pod:"calico-apiserver-6c6968cd6-klc5z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali237de393942", MAC:"0a:0c:57:ea:bc:da", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:18.966322 containerd[1608]: 2025-07-15 05:18:18.959 [INFO][4452] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-klc5z" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--klc5z-eth0" Jul 15 05:18:19.006850 containerd[1608]: time="2025-07-15T05:18:19.006719753Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-bl72v,Uid:8b951d9e-0c0e-4876-b472-9d67da122d9d,Namespace:calico-system,Attempt:0,} returns sandbox id \"8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1\"" Jul 15 05:18:19.019056 containerd[1608]: time="2025-07-15T05:18:19.019016549Z" level=info msg="connecting to shim e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d" address="unix:///run/containerd/s/b011c583adc61bb6985b43b66b76db2e12c7fb306ccd8f25c974aa3b59d064e2" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:19.056371 systemd[1]: Started cri-containerd-e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d.scope - libcontainer container e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d. 
Jul 15 05:18:19.061813 systemd-networkd[1464]: calif270f9c6591: Link UP Jul 15 05:18:19.062058 systemd-networkd[1464]: calif270f9c6591: Gained carrier Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.754 [INFO][4435] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0 coredns-7c65d6cfc9- kube-system 823c6c6f-6a29-45b4-966c-d6647f656a61 795 0 2025-07-15 05:17:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 coredns-7c65d6cfc9-xhrjr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif270f9c6591 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.754 [INFO][4435] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.802 [INFO][4473] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" HandleID="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Workload="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.802 [INFO][4473] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" HandleID="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Workload="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d59f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"coredns-7c65d6cfc9-xhrjr", "timestamp":"2025-07-15 05:18:18.80242248 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.802 [INFO][4473] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.926 [INFO][4473] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.926 [INFO][4473] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:18.996 [INFO][4473] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.003 [INFO][4473] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.014 [INFO][4473] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.020 [INFO][4473] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.022 [INFO][4473] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.022 [INFO][4473] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.024 [INFO][4473] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.030 [INFO][4473] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.035 [INFO][4473] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.43.6/26] block=192.168.43.0/26 handle="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.035 [INFO][4473] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.6/26] handle="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.035 [INFO][4473] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:19.087743 containerd[1608]: 2025-07-15 05:18:19.036 [INFO][4473] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.6/26] IPv6=[] ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" HandleID="k8s-pod-network.66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Workload="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" Jul 15 05:18:19.089786 containerd[1608]: 2025-07-15 05:18:19.041 [INFO][4435] cni-plugin/k8s.go 418: Populated endpoint ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"823c6c6f-6a29-45b4-966c-d6647f656a61", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"coredns-7c65d6cfc9-xhrjr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.43.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif270f9c6591", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:19.089786 containerd[1608]: 2025-07-15 05:18:19.042 [INFO][4435] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.6/32] ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" Jul 15 05:18:19.089786 containerd[1608]: 2025-07-15 05:18:19.042 [INFO][4435] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif270f9c6591 ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" Jul 15 05:18:19.089786 containerd[1608]: 2025-07-15 05:18:19.061 [INFO][4435] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" Jul 15 05:18:19.089786 containerd[1608]: 2025-07-15 05:18:19.065 [INFO][4435] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"823c6c6f-6a29-45b4-966c-d6647f656a61", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a", Pod:"coredns-7c65d6cfc9-xhrjr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.43.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif270f9c6591", 
MAC:"b6:ad:a4:25:80:42", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:19.089786 containerd[1608]: 2025-07-15 05:18:19.076 [INFO][4435] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xhrjr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-coredns--7c65d6cfc9--xhrjr-eth0" Jul 15 05:18:19.112384 containerd[1608]: time="2025-07-15T05:18:19.112311582Z" level=info msg="connecting to shim 66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a" address="unix:///run/containerd/s/58454d7c16e12924022bb5e6e487f7d42f79a36ec6830e33cd1d13cd17f12487" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:19.143364 systemd[1]: Started cri-containerd-66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a.scope - libcontainer container 66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a. 
Jul 15 05:18:19.165560 containerd[1608]: time="2025-07-15T05:18:19.165487066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c6968cd6-klc5z,Uid:dc60a327-6a37-45b6-9859-bf18d43b6044,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d\"" Jul 15 05:18:19.213454 containerd[1608]: time="2025-07-15T05:18:19.213405432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xhrjr,Uid:823c6c6f-6a29-45b4-966c-d6647f656a61,Namespace:kube-system,Attempt:0,} returns sandbox id \"66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a\"" Jul 15 05:18:19.217378 containerd[1608]: time="2025-07-15T05:18:19.217294522Z" level=info msg="CreateContainer within sandbox \"66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:18:19.223887 containerd[1608]: time="2025-07-15T05:18:19.223859181Z" level=info msg="Container d0d75a115d389a77c3a2cb1571199b21aaf69fc590b6978747a3a74c5b321e59: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:19.228374 containerd[1608]: time="2025-07-15T05:18:19.228333382Z" level=info msg="CreateContainer within sandbox \"66c8288c85477f7ed2e1f784830951e744e0e19eca833550c23fb275fe590c8a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d0d75a115d389a77c3a2cb1571199b21aaf69fc590b6978747a3a74c5b321e59\"" Jul 15 05:18:19.228884 containerd[1608]: time="2025-07-15T05:18:19.228782889Z" level=info msg="StartContainer for \"d0d75a115d389a77c3a2cb1571199b21aaf69fc590b6978747a3a74c5b321e59\"" Jul 15 05:18:19.229733 containerd[1608]: time="2025-07-15T05:18:19.229704467Z" level=info msg="connecting to shim d0d75a115d389a77c3a2cb1571199b21aaf69fc590b6978747a3a74c5b321e59" address="unix:///run/containerd/s/58454d7c16e12924022bb5e6e487f7d42f79a36ec6830e33cd1d13cd17f12487" protocol=ttrpc version=3 Jul 15 05:18:19.247368 systemd[1]: 
Started cri-containerd-d0d75a115d389a77c3a2cb1571199b21aaf69fc590b6978747a3a74c5b321e59.scope - libcontainer container d0d75a115d389a77c3a2cb1571199b21aaf69fc590b6978747a3a74c5b321e59. Jul 15 05:18:19.276403 containerd[1608]: time="2025-07-15T05:18:19.276366201Z" level=info msg="StartContainer for \"d0d75a115d389a77c3a2cb1571199b21aaf69fc590b6978747a3a74c5b321e59\" returns successfully" Jul 15 05:18:19.668691 containerd[1608]: time="2025-07-15T05:18:19.668458966Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kn29v,Uid:904a4f13-bfb8-412a-a126-6662c25983b9,Namespace:calico-system,Attempt:0,}" Jul 15 05:18:19.689508 systemd-networkd[1464]: calid74867ea957: Gained IPv6LL Jul 15 05:18:19.810827 systemd-networkd[1464]: calic9d47332e9b: Link UP Jul 15 05:18:19.813278 systemd-networkd[1464]: calic9d47332e9b: Gained carrier Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.731 [INFO][4694] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0 csi-node-driver- calico-system 904a4f13-bfb8-412a-a126-6662c25983b9 701 0 2025-07-15 05:17:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 csi-node-driver-kn29v eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic9d47332e9b [] [] }} ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.731 [INFO][4694] cni-plugin/k8s.go 74: Extracted identifiers for 
CmdAddK8s ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.760 [INFO][4707] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" HandleID="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Workload="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.760 [INFO][4707] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" HandleID="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Workload="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5040), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"csi-node-driver-kn29v", "timestamp":"2025-07-15 05:18:19.760473639 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.760 [INFO][4707] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.760 [INFO][4707] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.760 [INFO][4707] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.766 [INFO][4707] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.772 [INFO][4707] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.778 [INFO][4707] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.779 [INFO][4707] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.782 [INFO][4707] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.782 [INFO][4707] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.784 [INFO][4707] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265 Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.792 [INFO][4707] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.798 [INFO][4707] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.43.7/26] block=192.168.43.0/26 handle="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.798 [INFO][4707] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.7/26] handle="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.798 [INFO][4707] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:19.835745 containerd[1608]: 2025-07-15 05:18:19.798 [INFO][4707] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.7/26] IPv6=[] ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" HandleID="k8s-pod-network.5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Workload="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" Jul 15 05:18:19.839329 containerd[1608]: 2025-07-15 05:18:19.801 [INFO][4694] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"904a4f13-bfb8-412a-a126-6662c25983b9", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"csi-node-driver-kn29v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.43.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic9d47332e9b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:19.839329 containerd[1608]: 2025-07-15 05:18:19.801 [INFO][4694] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.7/32] ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" Jul 15 05:18:19.839329 containerd[1608]: 2025-07-15 05:18:19.801 [INFO][4694] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic9d47332e9b ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" Jul 15 05:18:19.839329 containerd[1608]: 2025-07-15 05:18:19.817 [INFO][4694] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" Jul 15 05:18:19.839329 containerd[1608]: 2025-07-15 
05:18:19.820 [INFO][4694] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"904a4f13-bfb8-412a-a126-6662c25983b9", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 57, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265", Pod:"csi-node-driver-kn29v", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.43.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic9d47332e9b", MAC:"d6:b5:26:a2:1b:4f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 15 05:18:19.839329 containerd[1608]: 2025-07-15 05:18:19.831 [INFO][4694] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" Namespace="calico-system" Pod="csi-node-driver-kn29v" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-csi--node--driver--kn29v-eth0"
Jul 15 05:18:19.887916 containerd[1608]: time="2025-07-15T05:18:19.887879664Z" level=info msg="connecting to shim 5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265" address="unix:///run/containerd/s/0826dc4cd34351e9515dffb7f375b6c769b946b91eba94692594d3f33d50d318" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:18:19.922415 systemd[1]: Started cri-containerd-5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265.scope - libcontainer container 5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265.
Jul 15 05:18:19.970105 containerd[1608]: time="2025-07-15T05:18:19.970052313Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kn29v,Uid:904a4f13-bfb8-412a-a126-6662c25983b9,Namespace:calico-system,Attempt:0,} returns sandbox id \"5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265\""
Jul 15 05:18:20.026338 kubelet[2752]: I0715 05:18:20.026229 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-xhrjr" podStartSLOduration=36.026170039 podStartE2EDuration="36.026170039s" podCreationTimestamp="2025-07-15 05:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:18:19.999693584 +0000 UTC m=+42.423692962" watchObservedRunningTime="2025-07-15 05:18:20.026170039 +0000 UTC m=+42.450169407"
Jul 15 05:18:20.323745 containerd[1608]: time="2025-07-15T05:18:20.323670228Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:20.324735 containerd[1608]: time="2025-07-15T05:18:20.324713802Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=51276688"
Jul 15 05:18:20.325797 containerd[1608]: time="2025-07-15T05:18:20.325612475Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:20.327497 containerd[1608]: time="2025-07-15T05:18:20.327471179Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:20.328319 containerd[1608]: time="2025-07-15T05:18:20.328006814Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 2.389435363s"
Jul 15 05:18:20.328319 containerd[1608]: time="2025-07-15T05:18:20.328031052Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\""
Jul 15 05:18:20.328689 containerd[1608]: time="2025-07-15T05:18:20.328672382Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\""
Jul 15 05:18:20.353059 containerd[1608]: time="2025-07-15T05:18:20.353023871Z" level=info msg="CreateContainer within sandbox \"355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Jul 15 05:18:20.365258 containerd[1608]: time="2025-07-15T05:18:20.364975665Z" level=info msg="Container 2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:18:20.371671 containerd[1608]: time="2025-07-15T05:18:20.371632835Z" level=info msg="CreateContainer within sandbox \"355c63f09f126d17f1513d02c6566bed111fcc8045c861fe48cdfdacd3659ba5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\""
Jul 15 05:18:20.372353 containerd[1608]: time="2025-07-15T05:18:20.372326026Z" level=info msg="StartContainer for \"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\""
Jul 15 05:18:20.373546 containerd[1608]: time="2025-07-15T05:18:20.373428897Z" level=info msg="connecting to shim 2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef" address="unix:///run/containerd/s/f67234b375d52e10742390fac92de7dbada35145fd3055c7f7768ae279dfa343" protocol=ttrpc version=3
Jul 15 05:18:20.393824 systemd-networkd[1464]: calia0bbc301614: Gained IPv6LL
Jul 15 05:18:20.395358 systemd[1]: Started cri-containerd-2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef.scope - libcontainer container 2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef.
Jul 15 05:18:20.442273 containerd[1608]: time="2025-07-15T05:18:20.442197009Z" level=info msg="StartContainer for \"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" returns successfully"
Jul 15 05:18:20.650216 systemd-networkd[1464]: calif270f9c6591: Gained IPv6LL
Jul 15 05:18:20.842503 systemd-networkd[1464]: cali237de393942: Gained IPv6LL
Jul 15 05:18:20.978759 kubelet[2752]: I0715 05:18:20.977391 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-fd76f846d-dzp4s" podStartSLOduration=21.586493288 podStartE2EDuration="23.977368903s" podCreationTimestamp="2025-07-15 05:17:57 +0000 UTC" firstStartedPulling="2025-07-15 05:18:17.937705709 +0000 UTC m=+40.361705077" lastFinishedPulling="2025-07-15 05:18:20.328581324 +0000 UTC m=+42.752580692" observedRunningTime="2025-07-15 05:18:20.976468117 +0000 UTC m=+43.400467525" watchObservedRunningTime="2025-07-15 05:18:20.977368903 +0000 UTC m=+43.401368272"
Jul 15 05:18:21.173586 containerd[1608]: time="2025-07-15T05:18:21.173528194Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"b3561db7e69abc05c475f3f6aed99296f4a9de48f30ef0cf6f96ae5ad58d16a6\" pid:4830 exited_at:{seconds:1752556701 nanos:168367088}"
Jul 15 05:18:21.668953 containerd[1608]: time="2025-07-15T05:18:21.668758836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-4h9wp,Uid:4b41a19a-f623-4f8d-9193-0b05cb36d5c4,Namespace:calico-apiserver,Attempt:0,}"
Jul 15 05:18:21.669369 containerd[1608]: time="2025-07-15T05:18:21.668843681Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-lv6nr,Uid:d418068e-2d12-4fbc-8f63-99524e4b1520,Namespace:calico-apiserver,Attempt:0,}"
Jul 15 05:18:21.801538 systemd-networkd[1464]: calic9d47332e9b: Gained IPv6LL
Jul 15 05:18:21.827187 systemd-networkd[1464]: calia54b591b3e4: Link UP
Jul 15 05:18:21.829148 systemd-networkd[1464]: calia54b591b3e4: Gained carrier
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.737 [INFO][4844] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0 calico-apiserver-59d64f457d- calico-apiserver d418068e-2d12-4fbc-8f63-99524e4b1520 806 0 2025-07-15 05:17:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59d64f457d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 calico-apiserver-59d64f457d-lv6nr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia54b591b3e4 [] [] }} ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.737 [INFO][4844] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.770 [INFO][4863] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.774 [INFO][4863] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003327d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"calico-apiserver-59d64f457d-lv6nr", "timestamp":"2025-07-15 05:18:21.770149424 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.774 [INFO][4863] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.774 [INFO][4863] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.774 [INFO][4863] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88'
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.781 [INFO][4863] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.786 [INFO][4863] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.790 [INFO][4863] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.792 [INFO][4863] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.794 [INFO][4863] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.794 [INFO][4863] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.796 [INFO][4863] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.801 [INFO][4863] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.810 [INFO][4863] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.43.8/26] block=192.168.43.0/26 handle="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.810 [INFO][4863] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.8/26] handle="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.810 [INFO][4863] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 15 05:18:21.848003 containerd[1608]: 2025-07-15 05:18:21.810 [INFO][4863] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.8/26] IPv6=[] ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0"
Jul 15 05:18:21.850698 containerd[1608]: 2025-07-15 05:18:21.818 [INFO][4844] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0", GenerateName:"calico-apiserver-59d64f457d-", Namespace:"calico-apiserver", SelfLink:"", UID:"d418068e-2d12-4fbc-8f63-99524e4b1520", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59d64f457d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"calico-apiserver-59d64f457d-lv6nr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia54b591b3e4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 15 05:18:21.850698 containerd[1608]: 2025-07-15 05:18:21.819 [INFO][4844] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.8/32] ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0"
Jul 15 05:18:21.850698 containerd[1608]: 2025-07-15 05:18:21.819 [INFO][4844] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia54b591b3e4 ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0"
Jul 15 05:18:21.850698 containerd[1608]: 2025-07-15 05:18:21.828 [INFO][4844] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0"
Jul 15 05:18:21.850698 containerd[1608]: 2025-07-15 05:18:21.829 [INFO][4844] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0", GenerateName:"calico-apiserver-59d64f457d-", Namespace:"calico-apiserver", SelfLink:"", UID:"d418068e-2d12-4fbc-8f63-99524e4b1520", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59d64f457d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9", Pod:"calico-apiserver-59d64f457d-lv6nr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia54b591b3e4", MAC:"c2:14:a3:2e:f8:82", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 15 05:18:21.850698 containerd[1608]: 2025-07-15 05:18:21.837 [INFO][4844] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-lv6nr" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0"
Jul 15 05:18:21.877497 containerd[1608]: time="2025-07-15T05:18:21.877452431Z" level=info msg="connecting to shim 5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" address="unix:///run/containerd/s/d4fb3f30b0c1c19de5007b8479aa1349a6b2edb3169ceeb558e52c5e3e8c8c93" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:18:21.917764 systemd[1]: Started cri-containerd-5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9.scope - libcontainer container 5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9.
Jul 15 05:18:21.942737 systemd-networkd[1464]: cali68f7fdc37f2: Link UP
Jul 15 05:18:21.946036 systemd-networkd[1464]: cali68f7fdc37f2: Gained carrier
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.737 [INFO][4838] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0 calico-apiserver-59d64f457d- calico-apiserver 4b41a19a-f623-4f8d-9193-0b05cb36d5c4 807 0 2025-07-15 05:17:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59d64f457d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 calico-apiserver-59d64f457d-4h9wp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali68f7fdc37f2 [] [] }} ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.737 [INFO][4838] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.774 [INFO][4865] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.774 [INFO][4865] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5730), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"calico-apiserver-59d64f457d-4h9wp", "timestamp":"2025-07-15 05:18:21.774454993 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.774 [INFO][4865] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.810 [INFO][4865] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.810 [INFO][4865] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88'
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.883 [INFO][4865] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.893 [INFO][4865] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.898 [INFO][4865] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.903 [INFO][4865] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.906 [INFO][4865] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.907 [INFO][4865] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.911 [INFO][4865] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.922 [INFO][4865] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.929 [INFO][4865] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.43.9/26] block=192.168.43.0/26 handle="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.929 [INFO][4865] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.9/26] handle="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" host="ci-4396-0-0-n-153ccb2e88"
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.929 [INFO][4865] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 15 05:18:21.968160 containerd[1608]: 2025-07-15 05:18:21.929 [INFO][4865] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.9/26] IPv6=[] ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0"
Jul 15 05:18:21.970405 containerd[1608]: 2025-07-15 05:18:21.935 [INFO][4838] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0", GenerateName:"calico-apiserver-59d64f457d-", Namespace:"calico-apiserver", SelfLink:"", UID:"4b41a19a-f623-4f8d-9193-0b05cb36d5c4", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59d64f457d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"calico-apiserver-59d64f457d-4h9wp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68f7fdc37f2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 15 05:18:21.970405 containerd[1608]: 2025-07-15 05:18:21.937 [INFO][4838] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.9/32] ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0"
Jul 15 05:18:21.970405 containerd[1608]: 2025-07-15 05:18:21.937 [INFO][4838] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali68f7fdc37f2 ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0"
Jul 15 05:18:21.970405 containerd[1608]: 2025-07-15 05:18:21.946 [INFO][4838] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0"
Jul 15 05:18:21.970405 containerd[1608]: 2025-07-15 05:18:21.947 [INFO][4838] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0", GenerateName:"calico-apiserver-59d64f457d-", Namespace:"calico-apiserver", SelfLink:"", UID:"4b41a19a-f623-4f8d-9193-0b05cb36d5c4", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 17, 53, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59d64f457d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2", Pod:"calico-apiserver-59d64f457d-4h9wp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali68f7fdc37f2", MAC:"2e:1d:17:dd:7f:46", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Jul 15 05:18:21.970405 containerd[1608]: 2025-07-15 05:18:21.961 [INFO][4838] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Namespace="calico-apiserver" Pod="calico-apiserver-59d64f457d-4h9wp" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0"
Jul 15 05:18:21.998194 containerd[1608]: time="2025-07-15T05:18:21.998141750Z" level=info msg="connecting to shim 7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" address="unix:///run/containerd/s/41b6e6a8594275874b905fa045e6e4dfe91887867137723457c6631cccbfb805" namespace=k8s.io protocol=ttrpc version=3
Jul 15 05:18:22.038506 systemd[1]: Started cri-containerd-7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2.scope - libcontainer container 7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2.
Jul 15 05:18:22.078142 containerd[1608]: time="2025-07-15T05:18:22.069125982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-lv6nr,Uid:d418068e-2d12-4fbc-8f63-99524e4b1520,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\""
Jul 15 05:18:22.135357 containerd[1608]: time="2025-07-15T05:18:22.135227164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59d64f457d-4h9wp,Uid:4b41a19a-f623-4f8d-9193-0b05cb36d5c4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\""
Jul 15 05:18:22.954470 systemd-networkd[1464]: calia54b591b3e4: Gained IPv6LL
Jul 15 05:18:23.266894 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1795930858.mount: Deactivated successfully.
Jul 15 05:18:23.697200 containerd[1608]: time="2025-07-15T05:18:23.697099897Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:23.699683 containerd[1608]: time="2025-07-15T05:18:23.699648916Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308"
Jul 15 05:18:23.699753 containerd[1608]: time="2025-07-15T05:18:23.699696057Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:23.702153 containerd[1608]: time="2025-07-15T05:18:23.702120203Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 05:18:23.702635 containerd[1608]: time="2025-07-15T05:18:23.702421350Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 3.373727206s"
Jul 15 05:18:23.702635 containerd[1608]: time="2025-07-15T05:18:23.702440948Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\""
Jul 15 05:18:23.704256 containerd[1608]: time="2025-07-15T05:18:23.703275341Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\""
Jul 15 05:18:23.706511 containerd[1608]: time="2025-07-15T05:18:23.706469957Z" level=info msg="CreateContainer within sandbox \"8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Jul 15 05:18:23.717258 containerd[1608]: time="2025-07-15T05:18:23.716860913Z" level=info msg="Container beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:18:23.750247 containerd[1608]: time="2025-07-15T05:18:23.750207582Z" level=info msg="CreateContainer within sandbox \"8040c7a18cf43c01ee2630b9ec5ec0e6900f931f3f3fffd8dd856ba377926cf1\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\""
Jul 15 05:18:23.751315 containerd[1608]: time="2025-07-15T05:18:23.751171557Z" level=info msg="StartContainer for \"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\""
Jul 15 05:18:23.752265 containerd[1608]: time="2025-07-15T05:18:23.752077470Z" level=info msg="connecting to shim beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a" address="unix:///run/containerd/s/cc2c8ec5dc4cf155e18df0f7929be56c6e32a44076de89cdbac49cc9c3a4e0f6" protocol=ttrpc version=3
Jul 15 05:18:23.798354 systemd[1]: Started cri-containerd-beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a.scope - libcontainer container beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a.
Jul 15 05:18:23.847233 containerd[1608]: time="2025-07-15T05:18:23.847186733Z" level=info msg="StartContainer for \"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" returns successfully" Jul 15 05:18:23.849445 systemd-networkd[1464]: cali68f7fdc37f2: Gained IPv6LL Jul 15 05:18:24.063499 kubelet[2752]: I0715 05:18:24.063421 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-bl72v" podStartSLOduration=23.368649885 podStartE2EDuration="28.063382445s" podCreationTimestamp="2025-07-15 05:17:56 +0000 UTC" firstStartedPulling="2025-07-15 05:18:19.008458937 +0000 UTC m=+41.432458306" lastFinishedPulling="2025-07-15 05:18:23.703191498 +0000 UTC m=+46.127190866" observedRunningTime="2025-07-15 05:18:24.054153731 +0000 UTC m=+46.478153100" watchObservedRunningTime="2025-07-15 05:18:24.063382445 +0000 UTC m=+46.487381813" Jul 15 05:18:24.105228 containerd[1608]: time="2025-07-15T05:18:24.105158473Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"4b9041ef485e18279d2fd45f2ab78a43bb9e926162cdd60ef90ad4ffcc41b89f\" pid:5049 exit_status:1 exited_at:{seconds:1752556704 nanos:103959492}" Jul 15 05:18:25.103719 containerd[1608]: time="2025-07-15T05:18:25.103670825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"b0f494717832b869f720cae222302481a8a8fb338129bfef602a1da2c4e5f748\" pid:5073 exit_status:1 exited_at:{seconds:1752556705 nanos:103368107}" Jul 15 05:18:25.903091 containerd[1608]: time="2025-07-15T05:18:25.903039580Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:25.904046 containerd[1608]: time="2025-07-15T05:18:25.904011028Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: 
active requests=0, bytes read=47317977" Jul 15 05:18:25.905195 containerd[1608]: time="2025-07-15T05:18:25.905130083Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:25.907630 containerd[1608]: time="2025-07-15T05:18:25.907544825Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:25.908649 containerd[1608]: time="2025-07-15T05:18:25.908054074Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 2.204756751s" Jul 15 05:18:25.908649 containerd[1608]: time="2025-07-15T05:18:25.908074765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:18:25.909624 containerd[1608]: time="2025-07-15T05:18:25.909598366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 15 05:18:25.911840 containerd[1608]: time="2025-07-15T05:18:25.911815503Z" level=info msg="CreateContainer within sandbox \"e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:18:25.921285 containerd[1608]: time="2025-07-15T05:18:25.919098686Z" level=info msg="Container 1fae2be2e305888d9fc335752131ea6211bb69d5c417213afd97092487b7b3c7: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:25.937567 containerd[1608]: time="2025-07-15T05:18:25.937526156Z" 
level=info msg="CreateContainer within sandbox \"e20dcb8ec451b3983022fa86dc4e33ec481556a978f1be66399b37f09c18739d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1fae2be2e305888d9fc335752131ea6211bb69d5c417213afd97092487b7b3c7\"" Jul 15 05:18:25.938503 containerd[1608]: time="2025-07-15T05:18:25.938443447Z" level=info msg="StartContainer for \"1fae2be2e305888d9fc335752131ea6211bb69d5c417213afd97092487b7b3c7\"" Jul 15 05:18:25.939827 containerd[1608]: time="2025-07-15T05:18:25.939767882Z" level=info msg="connecting to shim 1fae2be2e305888d9fc335752131ea6211bb69d5c417213afd97092487b7b3c7" address="unix:///run/containerd/s/b011c583adc61bb6985b43b66b76db2e12c7fb306ccd8f25c974aa3b59d064e2" protocol=ttrpc version=3 Jul 15 05:18:25.971447 systemd[1]: Started cri-containerd-1fae2be2e305888d9fc335752131ea6211bb69d5c417213afd97092487b7b3c7.scope - libcontainer container 1fae2be2e305888d9fc335752131ea6211bb69d5c417213afd97092487b7b3c7. Jul 15 05:18:26.068083 containerd[1608]: time="2025-07-15T05:18:26.067954833Z" level=info msg="StartContainer for \"1fae2be2e305888d9fc335752131ea6211bb69d5c417213afd97092487b7b3c7\" returns successfully" Jul 15 05:18:26.213668 kubelet[2752]: I0715 05:18:26.212692 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:26.338770 containerd[1608]: time="2025-07-15T05:18:26.338723252Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\" id:\"96082cca63ca575c4b4463a53d1f04f720612a0cf84c42df49ff30721b063889\" pid:5137 exited_at:{seconds:1752556706 nanos:337915414}" Jul 15 05:18:26.456589 containerd[1608]: time="2025-07-15T05:18:26.456534619Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\" id:\"44ece0aee899040093b88d3fa86f50d1cf0da04397210fe44b0bf86bc1edc08e\" pid:5161 exited_at:{seconds:1752556706 
nanos:454312031}" Jul 15 05:18:27.039189 kubelet[2752]: I0715 05:18:27.039123 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c6968cd6-klc5z" podStartSLOduration=26.298484018 podStartE2EDuration="33.039106504s" podCreationTimestamp="2025-07-15 05:17:54 +0000 UTC" firstStartedPulling="2025-07-15 05:18:19.168445619 +0000 UTC m=+41.592444988" lastFinishedPulling="2025-07-15 05:18:25.909068106 +0000 UTC m=+48.333067474" observedRunningTime="2025-07-15 05:18:27.03885718 +0000 UTC m=+49.462856548" watchObservedRunningTime="2025-07-15 05:18:27.039106504 +0000 UTC m=+49.463105872" Jul 15 05:18:27.573811 containerd[1608]: time="2025-07-15T05:18:27.573727532Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:27.574979 containerd[1608]: time="2025-07-15T05:18:27.574836705Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 15 05:18:27.577270 containerd[1608]: time="2025-07-15T05:18:27.576015895Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:27.578351 containerd[1608]: time="2025-07-15T05:18:27.578316801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:27.579437 containerd[1608]: time="2025-07-15T05:18:27.579396015Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size 
\"10251893\" in 1.669766439s" Jul 15 05:18:27.579567 containerd[1608]: time="2025-07-15T05:18:27.579541828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 15 05:18:27.582196 containerd[1608]: time="2025-07-15T05:18:27.582140052Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:18:27.592021 containerd[1608]: time="2025-07-15T05:18:27.591965094Z" level=info msg="CreateContainer within sandbox \"5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 15 05:18:27.637716 containerd[1608]: time="2025-07-15T05:18:27.637666556Z" level=info msg="Container e376ca673cb057b78fcc3e459e882b5bfaf9209920ba3f6c7c5b0b0e86f67224: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:27.652272 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2014740245.mount: Deactivated successfully. 
Jul 15 05:18:27.667676 containerd[1608]: time="2025-07-15T05:18:27.667636330Z" level=info msg="CreateContainer within sandbox \"5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"e376ca673cb057b78fcc3e459e882b5bfaf9209920ba3f6c7c5b0b0e86f67224\"" Jul 15 05:18:27.686793 containerd[1608]: time="2025-07-15T05:18:27.685793325Z" level=info msg="StartContainer for \"e376ca673cb057b78fcc3e459e882b5bfaf9209920ba3f6c7c5b0b0e86f67224\"" Jul 15 05:18:27.689640 containerd[1608]: time="2025-07-15T05:18:27.689583723Z" level=info msg="connecting to shim e376ca673cb057b78fcc3e459e882b5bfaf9209920ba3f6c7c5b0b0e86f67224" address="unix:///run/containerd/s/0826dc4cd34351e9515dffb7f375b6c769b946b91eba94692594d3f33d50d318" protocol=ttrpc version=3 Jul 15 05:18:27.723480 systemd[1]: Started cri-containerd-e376ca673cb057b78fcc3e459e882b5bfaf9209920ba3f6c7c5b0b0e86f67224.scope - libcontainer container e376ca673cb057b78fcc3e459e882b5bfaf9209920ba3f6c7c5b0b0e86f67224. 
Jul 15 05:18:27.802135 containerd[1608]: time="2025-07-15T05:18:27.802072218Z" level=info msg="StartContainer for \"e376ca673cb057b78fcc3e459e882b5bfaf9209920ba3f6c7c5b0b0e86f67224\" returns successfully" Jul 15 05:18:28.046639 kubelet[2752]: I0715 05:18:28.046061 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:28.148764 containerd[1608]: time="2025-07-15T05:18:28.146933920Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:28.152924 containerd[1608]: time="2025-07-15T05:18:28.152870278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 05:18:28.159553 containerd[1608]: time="2025-07-15T05:18:28.159500501Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 577.325022ms" Jul 15 05:18:28.159553 containerd[1608]: time="2025-07-15T05:18:28.159554557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:18:28.161442 containerd[1608]: time="2025-07-15T05:18:28.161405057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:18:28.165507 containerd[1608]: time="2025-07-15T05:18:28.165476185Z" level=info msg="CreateContainer within sandbox \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:18:28.179070 containerd[1608]: time="2025-07-15T05:18:28.179034431Z" level=info msg="Container 
699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:28.211762 containerd[1608]: time="2025-07-15T05:18:28.211700970Z" level=info msg="CreateContainer within sandbox \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\"" Jul 15 05:18:28.213881 containerd[1608]: time="2025-07-15T05:18:28.212816374Z" level=info msg="StartContainer for \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\"" Jul 15 05:18:28.223566 containerd[1608]: time="2025-07-15T05:18:28.223501255Z" level=info msg="connecting to shim 699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d" address="unix:///run/containerd/s/d4fb3f30b0c1c19de5007b8479aa1349a6b2edb3169ceeb558e52c5e3e8c8c93" protocol=ttrpc version=3 Jul 15 05:18:28.277815 systemd[1]: Started cri-containerd-699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d.scope - libcontainer container 699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d. 
Jul 15 05:18:28.369010 containerd[1608]: time="2025-07-15T05:18:28.368887841Z" level=info msg="StartContainer for \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" returns successfully" Jul 15 05:18:28.654164 containerd[1608]: time="2025-07-15T05:18:28.654027634Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:28.654834 containerd[1608]: time="2025-07-15T05:18:28.654758973Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 05:18:28.657687 containerd[1608]: time="2025-07-15T05:18:28.657540910Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 496.102058ms" Jul 15 05:18:28.657687 containerd[1608]: time="2025-07-15T05:18:28.657572331Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:18:28.660843 containerd[1608]: time="2025-07-15T05:18:28.659766519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 15 05:18:28.662655 containerd[1608]: time="2025-07-15T05:18:28.662631917Z" level=info msg="CreateContainer within sandbox \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:18:28.682101 containerd[1608]: time="2025-07-15T05:18:28.682045283Z" level=info msg="Container 1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:28.689996 
systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3601664473.mount: Deactivated successfully. Jul 15 05:18:28.731188 containerd[1608]: time="2025-07-15T05:18:28.731139445Z" level=info msg="CreateContainer within sandbox \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\"" Jul 15 05:18:28.733010 containerd[1608]: time="2025-07-15T05:18:28.732953284Z" level=info msg="StartContainer for \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\"" Jul 15 05:18:28.734037 containerd[1608]: time="2025-07-15T05:18:28.733999905Z" level=info msg="connecting to shim 1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e" address="unix:///run/containerd/s/41b6e6a8594275874b905fa045e6e4dfe91887867137723457c6631cccbfb805" protocol=ttrpc version=3 Jul 15 05:18:28.766298 systemd[1]: Started cri-containerd-1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e.scope - libcontainer container 1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e. 
Jul 15 05:18:29.001458 containerd[1608]: time="2025-07-15T05:18:29.001413334Z" level=info msg="StartContainer for \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" returns successfully" Jul 15 05:18:29.068435 kubelet[2752]: I0715 05:18:29.068352 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59d64f457d-4h9wp" podStartSLOduration=29.546243193 podStartE2EDuration="36.068336518s" podCreationTimestamp="2025-07-15 05:17:53 +0000 UTC" firstStartedPulling="2025-07-15 05:18:22.137289068 +0000 UTC m=+44.561288436" lastFinishedPulling="2025-07-15 05:18:28.659382393 +0000 UTC m=+51.083381761" observedRunningTime="2025-07-15 05:18:29.067366317 +0000 UTC m=+51.491365695" watchObservedRunningTime="2025-07-15 05:18:29.068336518 +0000 UTC m=+51.492335886" Jul 15 05:18:29.087158 kubelet[2752]: I0715 05:18:29.087096 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59d64f457d-lv6nr" podStartSLOduration=30.005885162 podStartE2EDuration="36.087079808s" podCreationTimestamp="2025-07-15 05:17:53 +0000 UTC" firstStartedPulling="2025-07-15 05:18:22.080008088 +0000 UTC m=+44.504007456" lastFinishedPulling="2025-07-15 05:18:28.161202734 +0000 UTC m=+50.585202102" observedRunningTime="2025-07-15 05:18:29.085758195 +0000 UTC m=+51.509757562" watchObservedRunningTime="2025-07-15 05:18:29.087079808 +0000 UTC m=+51.511079176" Jul 15 05:18:30.059779 kubelet[2752]: I0715 05:18:30.059741 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:31.510310 containerd[1608]: time="2025-07-15T05:18:31.509708236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:31.516072 containerd[1608]: time="2025-07-15T05:18:31.515096406Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 15 05:18:31.533252 containerd[1608]: time="2025-07-15T05:18:31.533195046Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:31.540821 containerd[1608]: time="2025-07-15T05:18:31.540730037Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:18:31.542143 containerd[1608]: time="2025-07-15T05:18:31.542116302Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 2.882002961s" Jul 15 05:18:31.542300 containerd[1608]: time="2025-07-15T05:18:31.542282946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Jul 15 05:18:31.551203 containerd[1608]: time="2025-07-15T05:18:31.551162331Z" level=info msg="CreateContainer within sandbox \"5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 15 05:18:31.559282 containerd[1608]: time="2025-07-15T05:18:31.559209043Z" level=info msg="Container edf9e3c826ac517336c26ab9d77cc82cfbbae1159e9649ef408673ed784428fc: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:31.572468 containerd[1608]: time="2025-07-15T05:18:31.572424570Z" level=info msg="CreateContainer 
within sandbox \"5c46315e9d8a9312a5b24e1dfb4f6e24ac1c169cc2ee962c1fea635838dd5265\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"edf9e3c826ac517336c26ab9d77cc82cfbbae1159e9649ef408673ed784428fc\"" Jul 15 05:18:31.573752 containerd[1608]: time="2025-07-15T05:18:31.573729940Z" level=info msg="StartContainer for \"edf9e3c826ac517336c26ab9d77cc82cfbbae1159e9649ef408673ed784428fc\"" Jul 15 05:18:31.578969 containerd[1608]: time="2025-07-15T05:18:31.578903905Z" level=info msg="connecting to shim edf9e3c826ac517336c26ab9d77cc82cfbbae1159e9649ef408673ed784428fc" address="unix:///run/containerd/s/0826dc4cd34351e9515dffb7f375b6c769b946b91eba94692594d3f33d50d318" protocol=ttrpc version=3 Jul 15 05:18:31.617339 systemd[1]: Started cri-containerd-edf9e3c826ac517336c26ab9d77cc82cfbbae1159e9649ef408673ed784428fc.scope - libcontainer container edf9e3c826ac517336c26ab9d77cc82cfbbae1159e9649ef408673ed784428fc. Jul 15 05:18:31.871502 containerd[1608]: time="2025-07-15T05:18:31.871380783Z" level=info msg="StartContainer for \"edf9e3c826ac517336c26ab9d77cc82cfbbae1159e9649ef408673ed784428fc\" returns successfully" Jul 15 05:18:32.198841 kubelet[2752]: I0715 05:18:32.198192 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-kn29v" podStartSLOduration=23.603279811 podStartE2EDuration="35.173618358s" podCreationTimestamp="2025-07-15 05:17:57 +0000 UTC" firstStartedPulling="2025-07-15 05:18:19.978061452 +0000 UTC m=+42.402060819" lastFinishedPulling="2025-07-15 05:18:31.548399998 +0000 UTC m=+53.972399366" observedRunningTime="2025-07-15 05:18:32.149454583 +0000 UTC m=+54.573453961" watchObservedRunningTime="2025-07-15 05:18:32.173618358 +0000 UTC m=+54.597617746" Jul 15 05:18:33.026112 kubelet[2752]: I0715 05:18:33.020217 2752 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock 
versions: 1.0.0 Jul 15 05:18:33.028064 kubelet[2752]: I0715 05:18:33.028048 2752 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 15 05:18:36.929589 containerd[1608]: time="2025-07-15T05:18:36.929431108Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"c32c93942b0c6328f22a53cd2364eef713cff4ed5ca3b32ae972df569e851b53\" pid:5348 exited_at:{seconds:1752556716 nanos:923358219}" Jul 15 05:18:39.994584 containerd[1608]: time="2025-07-15T05:18:39.994529077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"c4eb290257e4c15dd7c1acd253c544fafcc52e13a551bb77e01c0fb0f7317fe0\" pid:5374 exited_at:{seconds:1752556719 nanos:993781012}" Jul 15 05:18:42.814513 containerd[1608]: time="2025-07-15T05:18:42.814458059Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"74fe471e694f107fc5c202e7e51128ec699d34cba3ff555c651f625dc65a71bd\" pid:5397 exited_at:{seconds:1752556722 nanos:814166961}" Jul 15 05:18:45.122833 kubelet[2752]: I0715 05:18:45.122725 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:45.198016 kubelet[2752]: I0715 05:18:45.197987 2752 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:18:45.230459 containerd[1608]: time="2025-07-15T05:18:45.230397805Z" level=info msg="StopContainer for \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" with timeout 30 (s)" Jul 15 05:18:45.240026 containerd[1608]: time="2025-07-15T05:18:45.239973750Z" level=info msg="Stop container \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" with signal terminated" Jul 15 05:18:45.323555 systemd[1]: 
cri-containerd-699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d.scope: Deactivated successfully. Jul 15 05:18:45.323892 systemd[1]: cri-containerd-699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d.scope: Consumed 1.109s CPU time, 43M memory peak, 440K read from disk. Jul 15 05:18:45.331255 containerd[1608]: time="2025-07-15T05:18:45.331164913Z" level=info msg="received exit event container_id:\"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" id:\"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" pid:5226 exit_status:1 exited_at:{seconds:1752556725 nanos:330912644}" Jul 15 05:18:45.331460 containerd[1608]: time="2025-07-15T05:18:45.331435617Z" level=info msg="TaskExit event in podsandbox handler container_id:\"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" id:\"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" pid:5226 exit_status:1 exited_at:{seconds:1752556725 nanos:330912644}" Jul 15 05:18:45.366995 systemd[1]: Created slice kubepods-besteffort-pod4a044352_3064_4e1e_808c_a0652b0f9229.slice - libcontainer container kubepods-besteffort-pod4a044352_3064_4e1e_808c_a0652b0f9229.slice. Jul 15 05:18:45.396359 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d-rootfs.mount: Deactivated successfully. 
Jul 15 05:18:45.445220 containerd[1608]: time="2025-07-15T05:18:45.445159831Z" level=info msg="StopContainer for \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" returns successfully" Jul 15 05:18:45.456926 containerd[1608]: time="2025-07-15T05:18:45.455572632Z" level=info msg="StopPodSandbox for \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\"" Jul 15 05:18:45.466021 kubelet[2752]: I0715 05:18:45.465971 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4a044352-3064-4e1e-808c-a0652b0f9229-calico-apiserver-certs\") pod \"calico-apiserver-6c6968cd6-97h8r\" (UID: \"4a044352-3064-4e1e-808c-a0652b0f9229\") " pod="calico-apiserver/calico-apiserver-6c6968cd6-97h8r" Jul 15 05:18:45.467398 kubelet[2752]: I0715 05:18:45.467341 2752 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkcd4\" (UniqueName: \"kubernetes.io/projected/4a044352-3064-4e1e-808c-a0652b0f9229-kube-api-access-kkcd4\") pod \"calico-apiserver-6c6968cd6-97h8r\" (UID: \"4a044352-3064-4e1e-808c-a0652b0f9229\") " pod="calico-apiserver/calico-apiserver-6c6968cd6-97h8r" Jul 15 05:18:45.472377 containerd[1608]: time="2025-07-15T05:18:45.472326575Z" level=info msg="Container to stop \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 15 05:18:45.499184 systemd[1]: cri-containerd-5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9.scope: Deactivated successfully. 
Jul 15 05:18:45.504096 containerd[1608]: time="2025-07-15T05:18:45.501951547Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" id:\"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" pid:4917 exit_status:137 exited_at:{seconds:1752556725 nanos:501662159}" Jul 15 05:18:45.556051 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9-rootfs.mount: Deactivated successfully. Jul 15 05:18:45.579648 containerd[1608]: time="2025-07-15T05:18:45.579605428Z" level=info msg="shim disconnected" id=5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9 namespace=k8s.io Jul 15 05:18:45.579648 containerd[1608]: time="2025-07-15T05:18:45.579640242Z" level=warning msg="cleaning up after shim disconnected" id=5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9 namespace=k8s.io Jul 15 05:18:45.591546 containerd[1608]: time="2025-07-15T05:18:45.579648025Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 15 05:18:45.679383 containerd[1608]: time="2025-07-15T05:18:45.678990840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c6968cd6-97h8r,Uid:4a044352-3064-4e1e-808c-a0652b0f9229,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:18:45.728140 containerd[1608]: time="2025-07-15T05:18:45.728095541Z" level=info msg="received exit event sandbox_id:\"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" exit_status:137 exited_at:{seconds:1752556725 nanos:501662159}" Jul 15 05:18:45.742029 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9-shm.mount: Deactivated successfully. 
Jul 15 05:18:45.967677 systemd-networkd[1464]: calia54b591b3e4: Link DOWN Jul 15 05:18:45.968332 systemd-networkd[1464]: calia54b591b3e4: Lost carrier Jul 15 05:18:46.159269 kubelet[2752]: I0715 05:18:46.158925 2752 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:45.962 [INFO][5496] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:45.962 [INFO][5496] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" iface="eth0" netns="/var/run/netns/cni-a7cb5040-9710-1f33-174c-5ecd6986977f" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:45.962 [INFO][5496] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" iface="eth0" netns="/var/run/netns/cni-a7cb5040-9710-1f33-174c-5ecd6986977f" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:45.982 [INFO][5496] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" after=20.212093ms iface="eth0" netns="/var/run/netns/cni-a7cb5040-9710-1f33-174c-5ecd6986977f" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:45.983 [INFO][5496] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:45.983 [INFO][5496] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:46.199 [INFO][5508] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:46.202 [INFO][5508] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:46.202 [INFO][5508] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:46.255 [INFO][5508] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:46.255 [INFO][5508] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:46.257 [INFO][5508] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:46.268748 containerd[1608]: 2025-07-15 05:18:46.263 [INFO][5496] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:18:46.271642 containerd[1608]: time="2025-07-15T05:18:46.271614983Z" level=info msg="TearDown network for sandbox \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" successfully" Jul 15 05:18:46.271994 containerd[1608]: time="2025-07-15T05:18:46.271733639Z" level=info msg="StopPodSandbox for \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" returns successfully" Jul 15 05:18:46.391867 kubelet[2752]: I0715 05:18:46.391358 2752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d418068e-2d12-4fbc-8f63-99524e4b1520-calico-apiserver-certs\") pod \"d418068e-2d12-4fbc-8f63-99524e4b1520\" (UID: \"d418068e-2d12-4fbc-8f63-99524e4b1520\") " Jul 15 05:18:46.391867 kubelet[2752]: I0715 05:18:46.391415 2752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-np7jf\" (UniqueName: \"kubernetes.io/projected/d418068e-2d12-4fbc-8f63-99524e4b1520-kube-api-access-np7jf\") pod \"d418068e-2d12-4fbc-8f63-99524e4b1520\" (UID: \"d418068e-2d12-4fbc-8f63-99524e4b1520\") " Jul 15 05:18:46.400208 systemd[1]: run-netns-cni\x2da7cb5040\x2d9710\x2d1f33\x2d174c\x2d5ecd6986977f.mount: Deactivated successfully. Jul 15 05:18:46.404063 systemd-networkd[1464]: cali26ac141b82f: Link UP Jul 15 05:18:46.413376 systemd-networkd[1464]: cali26ac141b82f: Gained carrier Jul 15 05:18:46.435953 systemd[1]: var-lib-kubelet-pods-d418068e\x2d2d12\x2d4fbc\x2d8f63\x2d99524e4b1520-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dnp7jf.mount: Deactivated successfully. Jul 15 05:18:46.436089 systemd[1]: var-lib-kubelet-pods-d418068e\x2d2d12\x2d4fbc\x2d8f63\x2d99524e4b1520-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Jul 15 05:18:46.446103 kubelet[2752]: I0715 05:18:46.440187 2752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d418068e-2d12-4fbc-8f63-99524e4b1520-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "d418068e-2d12-4fbc-8f63-99524e4b1520" (UID: "d418068e-2d12-4fbc-8f63-99524e4b1520"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 05:18:46.449267 kubelet[2752]: I0715 05:18:46.440567 2752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d418068e-2d12-4fbc-8f63-99524e4b1520-kube-api-access-np7jf" (OuterVolumeSpecName: "kube-api-access-np7jf") pod "d418068e-2d12-4fbc-8f63-99524e4b1520" (UID: "d418068e-2d12-4fbc-8f63-99524e4b1520"). InnerVolumeSpecName "kube-api-access-np7jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:45.952 [INFO][5477] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0 calico-apiserver-6c6968cd6- calico-apiserver 4a044352-3064-4e1e-808c-a0652b0f9229 1112 0 2025-07-15 05:18:45 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6c6968cd6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-153ccb2e88 calico-apiserver-6c6968cd6-97h8r eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali26ac141b82f [] [] }} ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-" Jul 15 05:18:46.449926 
containerd[1608]: 2025-07-15 05:18:45.953 [INFO][5477] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.198 [INFO][5505] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" HandleID="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.201 [INFO][5505] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" HandleID="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00033c250), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396-0-0-n-153ccb2e88", "pod":"calico-apiserver-6c6968cd6-97h8r", "timestamp":"2025-07-15 05:18:46.198249416 +0000 UTC"}, Hostname:"ci-4396-0-0-n-153ccb2e88", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.202 [INFO][5505] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.257 [INFO][5505] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.257 [INFO][5505] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-153ccb2e88' Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.269 [INFO][5505] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.357 [INFO][5505] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.362 [INFO][5505] ipam/ipam.go 511: Trying affinity for 192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.363 [INFO][5505] ipam/ipam.go 158: Attempting to load block cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.365 [INFO][5505] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.43.0/26 host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.366 [INFO][5505] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.43.0/26 handle="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.367 [INFO][5505] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.375 [INFO][5505] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.43.0/26 handle="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.383 [INFO][5505] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.43.10/26] block=192.168.43.0/26 handle="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.383 [INFO][5505] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.43.10/26] handle="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" host="ci-4396-0-0-n-153ccb2e88" Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.383 [INFO][5505] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:46.449926 containerd[1608]: 2025-07-15 05:18:46.383 [INFO][5505] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.43.10/26] IPv6=[] ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" HandleID="k8s-pod-network.0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" Jul 15 05:18:46.452064 containerd[1608]: 2025-07-15 05:18:46.393 [INFO][5477] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0", GenerateName:"calico-apiserver-6c6968cd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a044352-3064-4e1e-808c-a0652b0f9229", ResourceVersion:"1112", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6c6968cd6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"", Pod:"calico-apiserver-6c6968cd6-97h8r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali26ac141b82f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:46.452064 containerd[1608]: 2025-07-15 05:18:46.393 [INFO][5477] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.43.10/32] ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" Jul 15 05:18:46.452064 containerd[1608]: 2025-07-15 05:18:46.393 [INFO][5477] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali26ac141b82f ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" Jul 15 05:18:46.452064 containerd[1608]: 2025-07-15 05:18:46.407 [INFO][5477] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" 
WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" Jul 15 05:18:46.452064 containerd[1608]: 2025-07-15 05:18:46.411 [INFO][5477] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0", GenerateName:"calico-apiserver-6c6968cd6-", Namespace:"calico-apiserver", SelfLink:"", UID:"4a044352-3064-4e1e-808c-a0652b0f9229", ResourceVersion:"1112", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 18, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6c6968cd6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-153ccb2e88", ContainerID:"0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a", Pod:"calico-apiserver-6c6968cd6-97h8r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.43.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali26ac141b82f", MAC:"86:89:53:73:16:6f", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:18:46.452064 containerd[1608]: 2025-07-15 05:18:46.437 [INFO][5477] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" Namespace="calico-apiserver" Pod="calico-apiserver-6c6968cd6-97h8r" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--6c6968cd6--97h8r-eth0" Jul 15 05:18:46.492993 kubelet[2752]: I0715 05:18:46.492950 2752 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d418068e-2d12-4fbc-8f63-99524e4b1520-calico-apiserver-certs\") on node \"ci-4396-0-0-n-153ccb2e88\" DevicePath \"\"" Jul 15 05:18:46.492993 kubelet[2752]: I0715 05:18:46.492986 2752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-np7jf\" (UniqueName: \"kubernetes.io/projected/d418068e-2d12-4fbc-8f63-99524e4b1520-kube-api-access-np7jf\") on node \"ci-4396-0-0-n-153ccb2e88\" DevicePath \"\"" Jul 15 05:18:46.585507 containerd[1608]: time="2025-07-15T05:18:46.585216399Z" level=info msg="connecting to shim 0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a" address="unix:///run/containerd/s/35a2b4867ed6dce27c69016ea901cfff5f7bbd6c25f344f71f722329ec5dd149" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:18:46.647366 systemd[1]: Started cri-containerd-0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a.scope - libcontainer container 0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a. 
Jul 15 05:18:46.712411 containerd[1608]: time="2025-07-15T05:18:46.712313057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6c6968cd6-97h8r,Uid:4a044352-3064-4e1e-808c-a0652b0f9229,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a\"" Jul 15 05:18:46.728474 containerd[1608]: time="2025-07-15T05:18:46.728428813Z" level=info msg="CreateContainer within sandbox \"0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:18:46.776292 containerd[1608]: time="2025-07-15T05:18:46.776191774Z" level=info msg="Container 5527822309ea0c2e84f41c32e47697550783aef103b3737ce2f1dcef3782d1e1: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:18:46.789909 containerd[1608]: time="2025-07-15T05:18:46.789852701Z" level=info msg="CreateContainer within sandbox \"0edbafc171e6baa9488dbc9302e3c33101f539febd532434e0e0470313662c9a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5527822309ea0c2e84f41c32e47697550783aef103b3737ce2f1dcef3782d1e1\"" Jul 15 05:18:46.790785 containerd[1608]: time="2025-07-15T05:18:46.790677266Z" level=info msg="StartContainer for \"5527822309ea0c2e84f41c32e47697550783aef103b3737ce2f1dcef3782d1e1\"" Jul 15 05:18:46.803333 containerd[1608]: time="2025-07-15T05:18:46.803299447Z" level=info msg="connecting to shim 5527822309ea0c2e84f41c32e47697550783aef103b3737ce2f1dcef3782d1e1" address="unix:///run/containerd/s/35a2b4867ed6dce27c69016ea901cfff5f7bbd6c25f344f71f722329ec5dd149" protocol=ttrpc version=3 Jul 15 05:18:46.828370 systemd[1]: Started cri-containerd-5527822309ea0c2e84f41c32e47697550783aef103b3737ce2f1dcef3782d1e1.scope - libcontainer container 5527822309ea0c2e84f41c32e47697550783aef103b3737ce2f1dcef3782d1e1. 
Jul 15 05:18:46.966690 containerd[1608]: time="2025-07-15T05:18:46.966629789Z" level=info msg="StartContainer for \"5527822309ea0c2e84f41c32e47697550783aef103b3737ce2f1dcef3782d1e1\" returns successfully" Jul 15 05:18:47.179961 systemd[1]: Removed slice kubepods-besteffort-podd418068e_2d12_4fbc_8f63_99524e4b1520.slice - libcontainer container kubepods-besteffort-podd418068e_2d12_4fbc_8f63_99524e4b1520.slice. Jul 15 05:18:47.180061 systemd[1]: kubepods-besteffort-podd418068e_2d12_4fbc_8f63_99524e4b1520.slice: Consumed 1.144s CPU time, 43.3M memory peak, 440K read from disk. Jul 15 05:18:47.224155 kubelet[2752]: I0715 05:18:47.222021 2752 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6c6968cd6-97h8r" podStartSLOduration=2.217231126 podStartE2EDuration="2.217231126s" podCreationTimestamp="2025-07-15 05:18:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:18:47.188516983 +0000 UTC m=+69.612516361" watchObservedRunningTime="2025-07-15 05:18:47.217231126 +0000 UTC m=+69.641230493" Jul 15 05:18:47.398913 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount910482469.mount: Deactivated successfully. 
Jul 15 05:18:47.672760 kubelet[2752]: I0715 05:18:47.672643 2752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d418068e-2d12-4fbc-8f63-99524e4b1520" path="/var/lib/kubelet/pods/d418068e-2d12-4fbc-8f63-99524e4b1520/volumes" Jul 15 05:18:47.849423 systemd-networkd[1464]: cali26ac141b82f: Gained IPv6LL Jul 15 05:18:48.197839 containerd[1608]: time="2025-07-15T05:18:48.197773584Z" level=info msg="StopContainer for \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" with timeout 30 (s)" Jul 15 05:18:48.200083 containerd[1608]: time="2025-07-15T05:18:48.200045010Z" level=info msg="Stop container \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" with signal terminated" Jul 15 05:18:48.245028 systemd[1]: cri-containerd-1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e.scope: Deactivated successfully. Jul 15 05:18:48.245847 systemd[1]: cri-containerd-1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e.scope: Consumed 1.337s CPU time, 59.7M memory peak, 1.2M read from disk. Jul 15 05:18:48.248988 containerd[1608]: time="2025-07-15T05:18:48.248958416Z" level=info msg="received exit event container_id:\"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" id:\"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" pid:5260 exit_status:1 exited_at:{seconds:1752556728 nanos:248695544}" Jul 15 05:18:48.250093 containerd[1608]: time="2025-07-15T05:18:48.249641766Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" id:\"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" pid:5260 exit_status:1 exited_at:{seconds:1752556728 nanos:248695544}" Jul 15 05:18:48.298062 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e-rootfs.mount: Deactivated successfully. 
Jul 15 05:18:48.325005 containerd[1608]: time="2025-07-15T05:18:48.324971946Z" level=info msg="StopContainer for \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" returns successfully" Jul 15 05:18:48.325854 containerd[1608]: time="2025-07-15T05:18:48.325748788Z" level=info msg="StopPodSandbox for \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\"" Jul 15 05:18:48.325943 containerd[1608]: time="2025-07-15T05:18:48.325916274Z" level=info msg="Container to stop \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 15 05:18:48.333570 systemd[1]: cri-containerd-7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2.scope: Deactivated successfully. Jul 15 05:18:48.338948 containerd[1608]: time="2025-07-15T05:18:48.338837621Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" id:\"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" pid:4966 exit_status:137 exited_at:{seconds:1752556728 nanos:338417993}" Jul 15 05:18:48.382661 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2-rootfs.mount: Deactivated successfully. 
Jul 15 05:18:48.384294 containerd[1608]: time="2025-07-15T05:18:48.384210887Z" level=info msg="shim disconnected" id=7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2 namespace=k8s.io Jul 15 05:18:48.384518 containerd[1608]: time="2025-07-15T05:18:48.384411264Z" level=warning msg="cleaning up after shim disconnected" id=7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2 namespace=k8s.io Jul 15 05:18:48.384518 containerd[1608]: time="2025-07-15T05:18:48.384445297Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 15 05:18:48.412243 containerd[1608]: time="2025-07-15T05:18:48.411881687Z" level=info msg="received exit event sandbox_id:\"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" exit_status:137 exited_at:{seconds:1752556728 nanos:338417993}" Jul 15 05:18:48.416696 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2-shm.mount: Deactivated successfully. Jul 15 05:18:48.475812 systemd-networkd[1464]: cali68f7fdc37f2: Link DOWN Jul 15 05:18:48.476432 systemd-networkd[1464]: cali68f7fdc37f2: Lost carrier Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.474 [INFO][5696] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.475 [INFO][5696] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" iface="eth0" netns="/var/run/netns/cni-1e732796-1071-80d6-f8c1-9a0fa36b4461" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.475 [INFO][5696] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" iface="eth0" netns="/var/run/netns/cni-1e732796-1071-80d6-f8c1-9a0fa36b4461" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.487 [INFO][5696] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" after=12.017212ms iface="eth0" netns="/var/run/netns/cni-1e732796-1071-80d6-f8c1-9a0fa36b4461" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.487 [INFO][5696] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.487 [INFO][5696] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.566 [INFO][5703] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.567 [INFO][5703] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.567 [INFO][5703] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.684 [INFO][5703] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.684 [INFO][5703] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.686 [INFO][5703] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:18:48.696489 containerd[1608]: 2025-07-15 05:18:48.692 [INFO][5696] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:18:48.698642 containerd[1608]: time="2025-07-15T05:18:48.698514647Z" level=info msg="TearDown network for sandbox \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" successfully" Jul 15 05:18:48.698642 containerd[1608]: time="2025-07-15T05:18:48.698580287Z" level=info msg="StopPodSandbox for \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" returns successfully" Jul 15 05:18:48.708838 systemd[1]: run-netns-cni\x2d1e732796\x2d1071\x2d80d6\x2df8c1\x2d9a0fa36b4461.mount: Deactivated successfully. 
Jul 15 05:18:48.838549 kubelet[2752]: I0715 05:18:48.838411 2752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-calico-apiserver-certs\") pod \"4b41a19a-f623-4f8d-9193-0b05cb36d5c4\" (UID: \"4b41a19a-f623-4f8d-9193-0b05cb36d5c4\") " Jul 15 05:18:48.838549 kubelet[2752]: I0715 05:18:48.838489 2752 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkknr\" (UniqueName: \"kubernetes.io/projected/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-kube-api-access-gkknr\") pod \"4b41a19a-f623-4f8d-9193-0b05cb36d5c4\" (UID: \"4b41a19a-f623-4f8d-9193-0b05cb36d5c4\") " Jul 15 05:18:48.847498 kubelet[2752]: I0715 05:18:48.847463 2752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-kube-api-access-gkknr" (OuterVolumeSpecName: "kube-api-access-gkknr") pod "4b41a19a-f623-4f8d-9193-0b05cb36d5c4" (UID: "4b41a19a-f623-4f8d-9193-0b05cb36d5c4"). InnerVolumeSpecName "kube-api-access-gkknr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 05:18:48.848126 systemd[1]: var-lib-kubelet-pods-4b41a19a\x2df623\x2d4f8d\x2d9193\x2d0b05cb36d5c4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dgkknr.mount: Deactivated successfully. Jul 15 05:18:48.848536 kubelet[2752]: I0715 05:18:48.848507 2752 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "4b41a19a-f623-4f8d-9193-0b05cb36d5c4" (UID: "4b41a19a-f623-4f8d-9193-0b05cb36d5c4"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 05:18:48.857897 systemd[1]: var-lib-kubelet-pods-4b41a19a\x2df623\x2d4f8d\x2d9193\x2d0b05cb36d5c4-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Jul 15 05:18:48.939742 kubelet[2752]: I0715 05:18:48.939675 2752 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkknr\" (UniqueName: \"kubernetes.io/projected/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-kube-api-access-gkknr\") on node \"ci-4396-0-0-n-153ccb2e88\" DevicePath \"\"" Jul 15 05:18:48.939742 kubelet[2752]: I0715 05:18:48.939719 2752 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/4b41a19a-f623-4f8d-9193-0b05cb36d5c4-calico-apiserver-certs\") on node \"ci-4396-0-0-n-153ccb2e88\" DevicePath \"\"" Jul 15 05:18:49.204258 kubelet[2752]: I0715 05:18:49.203937 2752 scope.go:117] "RemoveContainer" containerID="1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e" Jul 15 05:18:49.209316 systemd[1]: Removed slice kubepods-besteffort-pod4b41a19a_f623_4f8d_9193_0b05cb36d5c4.slice - libcontainer container kubepods-besteffort-pod4b41a19a_f623_4f8d_9193_0b05cb36d5c4.slice. Jul 15 05:18:49.209416 systemd[1]: kubepods-besteffort-pod4b41a19a_f623_4f8d_9193_0b05cb36d5c4.slice: Consumed 1.368s CPU time, 60M memory peak, 1.2M read from disk. 
Jul 15 05:18:49.214469 containerd[1608]: time="2025-07-15T05:18:49.214426485Z" level=info msg="RemoveContainer for \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\"" Jul 15 05:18:49.228143 containerd[1608]: time="2025-07-15T05:18:49.228077193Z" level=info msg="RemoveContainer for \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" returns successfully" Jul 15 05:18:49.235132 kubelet[2752]: I0715 05:18:49.234895 2752 scope.go:117] "RemoveContainer" containerID="1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e" Jul 15 05:18:49.236374 containerd[1608]: time="2025-07-15T05:18:49.236315520Z" level=error msg="ContainerStatus for \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\": not found" Jul 15 05:18:49.241841 kubelet[2752]: E0715 05:18:49.241727 2752 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\": not found" containerID="1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e" Jul 15 05:18:49.261969 kubelet[2752]: I0715 05:18:49.241784 2752 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e"} err="failed to get container status \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\": rpc error: code = NotFound desc = an error occurred when try to find container \"1e126e8722c21725cada007fc161839def35ae95e2a7d0da5d9e17ab157ff01e\": not found" Jul 15 05:18:49.671846 kubelet[2752]: I0715 05:18:49.671643 2752 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b41a19a-f623-4f8d-9193-0b05cb36d5c4" 
path="/var/lib/kubelet/pods/4b41a19a-f623-4f8d-9193-0b05cb36d5c4/volumes" Jul 15 05:18:56.559673 containerd[1608]: time="2025-07-15T05:18:56.559612231Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\" id:\"b350dbbaddbf8f41407bd06fdc79fd501362b8da52cc4b5f8f6fdb52f85ea48e\" pid:5742 exited_at:{seconds:1752556736 nanos:549801996}" Jul 15 05:18:58.669108 containerd[1608]: time="2025-07-15T05:18:58.668886082Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"f70f81d08aeffba9b6834056803385a446844ab591954c2c314cb4c20e4654b7\" pid:5768 exited_at:{seconds:1752556738 nanos:667944237}" Jul 15 05:19:06.849123 containerd[1608]: time="2025-07-15T05:19:06.849082694Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"db7220a5a444a8e9558d7d05d05364881b22cdc7f2ae9b91be6656ffe273ccfe\" pid:5793 exited_at:{seconds:1752556746 nanos:848743001}" Jul 15 05:19:12.689642 containerd[1608]: time="2025-07-15T05:19:12.689581157Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"f7c6e9495b48e998489971c04164cea146642235853103fa0e62880b71e1eef0\" pid:5816 exited_at:{seconds:1752556752 nanos:689092422}" Jul 15 05:19:26.357387 containerd[1608]: time="2025-07-15T05:19:26.357334895Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\" id:\"a416a46edcaa9cf27fb96f42f198d1d0e0e138881552eb7c17fab286fa602893\" pid:5841 exited_at:{seconds:1752556766 nanos:356941191}" Jul 15 05:19:29.721815 systemd[1]: Started sshd@7-157.180.32.153:22-139.178.89.65:44642.service - OpenSSH per-connection server daemon (139.178.89.65:44642). 
Jul 15 05:19:30.757984 sshd[5858]: Accepted publickey for core from 139.178.89.65 port 44642 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:19:30.760899 sshd-session[5858]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:19:30.768014 systemd-logind[1577]: New session 8 of user core. Jul 15 05:19:30.773426 systemd[1]: Started session-8.scope - Session 8 of User core. Jul 15 05:19:31.968341 sshd[5862]: Connection closed by 139.178.89.65 port 44642 Jul 15 05:19:31.969099 sshd-session[5858]: pam_unix(sshd:session): session closed for user core Jul 15 05:19:31.977538 systemd-logind[1577]: Session 8 logged out. Waiting for processes to exit. Jul 15 05:19:31.978118 systemd[1]: sshd@7-157.180.32.153:22-139.178.89.65:44642.service: Deactivated successfully. Jul 15 05:19:31.981176 systemd[1]: session-8.scope: Deactivated successfully. Jul 15 05:19:31.985061 systemd-logind[1577]: Removed session 8. Jul 15 05:19:36.868859 containerd[1608]: time="2025-07-15T05:19:36.868801198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"abe1836f5c6735730bade00e908a359c94a560e9fe92b11379ab02c6cadceee8\" pid:5892 exited_at:{seconds:1752556776 nanos:868338712}" Jul 15 05:19:37.143979 systemd[1]: Started sshd@8-157.180.32.153:22-139.178.89.65:44648.service - OpenSSH per-connection server daemon (139.178.89.65:44648). 
Jul 15 05:19:38.027077 kubelet[2752]: I0715 05:19:38.025398 2752 scope.go:117] "RemoveContainer" containerID="699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d" Jul 15 05:19:38.030197 containerd[1608]: time="2025-07-15T05:19:38.030159128Z" level=info msg="RemoveContainer for \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\"" Jul 15 05:19:38.064554 containerd[1608]: time="2025-07-15T05:19:38.064514979Z" level=info msg="RemoveContainer for \"699b0a7e22705f706f3cab5f1737d061ec383112a93f205ec943f7a40586008d\" returns successfully" Jul 15 05:19:38.071517 containerd[1608]: time="2025-07-15T05:19:38.071469693Z" level=info msg="StopPodSandbox for \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\"" Jul 15 05:19:38.169439 sshd[5904]: Accepted publickey for core from 139.178.89.65 port 44648 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:19:38.173687 sshd-session[5904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:19:38.186578 systemd-logind[1577]: New session 9 of user core. Jul 15 05:19:38.193392 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.155 [WARNING][5917] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.156 [INFO][5917] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.156 [INFO][5917] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" iface="eth0" netns="" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.156 [INFO][5917] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.156 [INFO][5917] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.290 [INFO][5924] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.292 [INFO][5924] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.292 [INFO][5924] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.300 [WARNING][5924] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.301 [INFO][5924] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.302 [INFO][5924] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:38.308466 containerd[1608]: 2025-07-15 05:19:38.304 [INFO][5917] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.310113 containerd[1608]: time="2025-07-15T05:19:38.308861498Z" level=info msg="TearDown network for sandbox \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" successfully" Jul 15 05:19:38.310113 containerd[1608]: time="2025-07-15T05:19:38.308885584Z" level=info msg="StopPodSandbox for \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" returns successfully" Jul 15 05:19:38.410959 containerd[1608]: time="2025-07-15T05:19:38.410886747Z" level=info msg="RemovePodSandbox for \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\"" Jul 15 05:19:38.419900 containerd[1608]: time="2025-07-15T05:19:38.419873103Z" level=info msg="Forcibly stopping sandbox \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\"" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.466 [WARNING][5940] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.467 [INFO][5940] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.467 [INFO][5940] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" iface="eth0" netns="" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.468 [INFO][5940] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.468 [INFO][5940] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.497 [INFO][5947] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.497 [INFO][5947] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.497 [INFO][5947] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.503 [WARNING][5947] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.503 [INFO][5947] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" HandleID="k8s-pod-network.7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--4h9wp-eth0" Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.504 [INFO][5947] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:38.510751 containerd[1608]: 2025-07-15 05:19:38.508 [INFO][5940] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2" Jul 15 05:19:38.511761 containerd[1608]: time="2025-07-15T05:19:38.510792818Z" level=info msg="TearDown network for sandbox \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" successfully" Jul 15 05:19:38.515155 containerd[1608]: time="2025-07-15T05:19:38.515123248Z" level=info msg="Ensure that sandbox 7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2 in task-service has been cleanup successfully" Jul 15 05:19:38.518264 containerd[1608]: time="2025-07-15T05:19:38.518225288Z" level=info msg="RemovePodSandbox \"7851e0500f381cde3f1c033381acd4dce5e5e065f770bba4316f180d2edcd2f2\" returns successfully" Jul 15 05:19:38.519054 containerd[1608]: time="2025-07-15T05:19:38.518747649Z" level=info msg="StopPodSandbox for \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\"" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.547 [WARNING][5961] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward 
with the clean up ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.547 [INFO][5961] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.547 [INFO][5961] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" iface="eth0" netns="" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.547 [INFO][5961] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.547 [INFO][5961] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.576 [INFO][5968] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.576 [INFO][5968] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.576 [INFO][5968] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.584 [WARNING][5968] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.584 [INFO][5968] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.586 [INFO][5968] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:38.594509 containerd[1608]: 2025-07-15 05:19:38.591 [INFO][5961] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.594509 containerd[1608]: time="2025-07-15T05:19:38.594353131Z" level=info msg="TearDown network for sandbox \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" successfully" Jul 15 05:19:38.594509 containerd[1608]: time="2025-07-15T05:19:38.594376094Z" level=info msg="StopPodSandbox for \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" returns successfully" Jul 15 05:19:38.597710 containerd[1608]: time="2025-07-15T05:19:38.596062531Z" level=info msg="RemovePodSandbox for \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\"" Jul 15 05:19:38.597710 containerd[1608]: time="2025-07-15T05:19:38.596105252Z" level=info msg="Forcibly stopping sandbox \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\"" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.640 [WARNING][5983] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" WorkloadEndpoint="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.641 [INFO][5983] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.641 [INFO][5983] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" iface="eth0" netns="" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.641 [INFO][5983] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.641 [INFO][5983] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.660 [INFO][5990] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.660 [INFO][5990] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.661 [INFO][5990] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.665 [WARNING][5990] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.665 [INFO][5990] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" HandleID="k8s-pod-network.5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Workload="ci--4396--0--0--n--153ccb2e88-k8s-calico--apiserver--59d64f457d--lv6nr-eth0" Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.667 [INFO][5990] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:19:38.672788 containerd[1608]: 2025-07-15 05:19:38.670 [INFO][5983] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9" Jul 15 05:19:38.674590 containerd[1608]: time="2025-07-15T05:19:38.673194257Z" level=info msg="TearDown network for sandbox \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" successfully" Jul 15 05:19:38.689871 containerd[1608]: time="2025-07-15T05:19:38.689846766Z" level=info msg="Ensure that sandbox 5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9 in task-service has been cleanup successfully" Jul 15 05:19:38.694445 containerd[1608]: time="2025-07-15T05:19:38.694417482Z" level=info msg="RemovePodSandbox \"5b9709c0f180cc3679b924b05deffd57b79f398e3acb0d2a96d1a48c09d26bd9\" returns successfully" Jul 15 05:19:39.173736 sshd[5928]: Connection closed by 139.178.89.65 port 44648 Jul 15 05:19:39.186708 sshd-session[5904]: pam_unix(sshd:session): session closed for user core Jul 15 05:19:39.202847 systemd[1]: sshd@8-157.180.32.153:22-139.178.89.65:44648.service: Deactivated successfully. 
Jul 15 05:19:39.208537 systemd[1]: session-9.scope: Deactivated successfully. Jul 15 05:19:39.211024 systemd-logind[1577]: Session 9 logged out. Waiting for processes to exit. Jul 15 05:19:39.212958 systemd-logind[1577]: Removed session 9. Jul 15 05:19:39.341457 systemd[1]: Started sshd@9-157.180.32.153:22-139.178.89.65:56542.service - OpenSSH per-connection server daemon (139.178.89.65:56542). Jul 15 05:19:40.033544 containerd[1608]: time="2025-07-15T05:19:40.033491065Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"4a560072be0adb7632178e291d0032c6c2d515e7851034f7a19650abd58cc86e\" pid:6024 exited_at:{seconds:1752556780 nanos:32666581}" Jul 15 05:19:40.352451 sshd[6008]: Accepted publickey for core from 139.178.89.65 port 56542 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:19:40.354140 sshd-session[6008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:19:40.360044 systemd-logind[1577]: New session 10 of user core. Jul 15 05:19:40.364379 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 15 05:19:41.141756 sshd[6033]: Connection closed by 139.178.89.65 port 56542 Jul 15 05:19:41.143183 sshd-session[6008]: pam_unix(sshd:session): session closed for user core Jul 15 05:19:41.149929 systemd[1]: sshd@9-157.180.32.153:22-139.178.89.65:56542.service: Deactivated successfully. Jul 15 05:19:41.153004 systemd[1]: session-10.scope: Deactivated successfully. Jul 15 05:19:41.156078 systemd-logind[1577]: Session 10 logged out. Waiting for processes to exit. Jul 15 05:19:41.158716 systemd-logind[1577]: Removed session 10. Jul 15 05:19:41.310629 systemd[1]: Started sshd@10-157.180.32.153:22-139.178.89.65:56548.service - OpenSSH per-connection server daemon (139.178.89.65:56548). 
Jul 15 05:19:42.333654 sshd[6044]: Accepted publickey for core from 139.178.89.65 port 56548 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:19:42.335507 sshd-session[6044]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:19:42.340308 systemd-logind[1577]: New session 11 of user core. Jul 15 05:19:42.348362 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 15 05:19:42.787465 containerd[1608]: time="2025-07-15T05:19:42.786392337Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"434e3113be95d3b8c4355de43888d9ac59673c0092d6a53afc99d344bbd227fa\" pid:6060 exited_at:{seconds:1752556782 nanos:772007037}" Jul 15 05:19:43.085311 sshd[6047]: Connection closed by 139.178.89.65 port 56548 Jul 15 05:19:43.085995 sshd-session[6044]: pam_unix(sshd:session): session closed for user core Jul 15 05:19:43.092003 systemd-logind[1577]: Session 11 logged out. Waiting for processes to exit. Jul 15 05:19:43.092555 systemd[1]: sshd@10-157.180.32.153:22-139.178.89.65:56548.service: Deactivated successfully. Jul 15 05:19:43.094368 systemd[1]: session-11.scope: Deactivated successfully. Jul 15 05:19:43.096169 systemd-logind[1577]: Removed session 11. Jul 15 05:19:48.261616 systemd[1]: Started sshd@11-157.180.32.153:22-139.178.89.65:56556.service - OpenSSH per-connection server daemon (139.178.89.65:56556). Jul 15 05:19:49.278684 sshd[6109]: Accepted publickey for core from 139.178.89.65 port 56556 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:19:49.280528 sshd-session[6109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:19:49.286299 systemd-logind[1577]: New session 12 of user core. Jul 15 05:19:49.290404 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jul 15 05:19:50.088815 sshd[6112]: Connection closed by 139.178.89.65 port 56556 Jul 15 05:19:50.092556 sshd-session[6109]: pam_unix(sshd:session): session closed for user core Jul 15 05:19:50.103094 systemd[1]: sshd@11-157.180.32.153:22-139.178.89.65:56556.service: Deactivated successfully. Jul 15 05:19:50.106801 systemd[1]: session-12.scope: Deactivated successfully. Jul 15 05:19:50.109136 systemd-logind[1577]: Session 12 logged out. Waiting for processes to exit. Jul 15 05:19:50.111773 systemd-logind[1577]: Removed session 12. Jul 15 05:19:55.263860 systemd[1]: Started sshd@12-157.180.32.153:22-139.178.89.65:46572.service - OpenSSH per-connection server daemon (139.178.89.65:46572). Jul 15 05:19:56.292857 sshd[6124]: Accepted publickey for core from 139.178.89.65 port 46572 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:19:56.295544 sshd-session[6124]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:19:56.301268 systemd-logind[1577]: New session 13 of user core. Jul 15 05:19:56.306581 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 15 05:19:56.524524 containerd[1608]: time="2025-07-15T05:19:56.524489618Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\" id:\"471e83702e1aad3e873ec0450409401dd5eeca5949328c568d54b3a0a6f731da\" pid:6140 exited_at:{seconds:1752556796 nanos:523678924}" Jul 15 05:19:57.093288 sshd[6146]: Connection closed by 139.178.89.65 port 46572 Jul 15 05:19:57.093917 sshd-session[6124]: pam_unix(sshd:session): session closed for user core Jul 15 05:19:57.099205 systemd[1]: sshd@12-157.180.32.153:22-139.178.89.65:46572.service: Deactivated successfully. Jul 15 05:19:57.104014 systemd[1]: session-13.scope: Deactivated successfully. Jul 15 05:19:57.106086 systemd-logind[1577]: Session 13 logged out. Waiting for processes to exit. 
Jul 15 05:19:57.108077 systemd-logind[1577]: Removed session 13. Jul 15 05:19:58.502072 containerd[1608]: time="2025-07-15T05:19:58.502009889Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"926130fbe8151c03ba4071d65e5a38d9a5c020d890843cb5a5c5380feb296c90\" pid:6174 exited_at:{seconds:1752556798 nanos:501772406}" Jul 15 05:20:02.264846 systemd[1]: Started sshd@13-157.180.32.153:22-139.178.89.65:35420.service - OpenSSH per-connection server daemon (139.178.89.65:35420). Jul 15 05:20:03.279967 sshd[6185]: Accepted publickey for core from 139.178.89.65 port 35420 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:20:03.281392 sshd-session[6185]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:20:03.286985 systemd-logind[1577]: New session 14 of user core. Jul 15 05:20:03.293759 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 15 05:20:04.259309 sshd[6188]: Connection closed by 139.178.89.65 port 35420 Jul 15 05:20:04.258672 sshd-session[6185]: pam_unix(sshd:session): session closed for user core Jul 15 05:20:04.263313 systemd-logind[1577]: Session 14 logged out. Waiting for processes to exit. Jul 15 05:20:04.264384 systemd[1]: sshd@13-157.180.32.153:22-139.178.89.65:35420.service: Deactivated successfully. Jul 15 05:20:04.266772 systemd[1]: session-14.scope: Deactivated successfully. Jul 15 05:20:04.268858 systemd-logind[1577]: Removed session 14. Jul 15 05:20:04.428292 systemd[1]: Started sshd@14-157.180.32.153:22-139.178.89.65:35432.service - OpenSSH per-connection server daemon (139.178.89.65:35432). 
Jul 15 05:20:05.426375 sshd[6200]: Accepted publickey for core from 139.178.89.65 port 35432 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:20:05.428655 sshd-session[6200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:20:05.434702 systemd-logind[1577]: New session 15 of user core. Jul 15 05:20:05.441389 systemd[1]: Started session-15.scope - Session 15 of User core. Jul 15 05:20:06.359397 sshd[6203]: Connection closed by 139.178.89.65 port 35432 Jul 15 05:20:06.366201 sshd-session[6200]: pam_unix(sshd:session): session closed for user core Jul 15 05:20:06.375824 systemd[1]: sshd@14-157.180.32.153:22-139.178.89.65:35432.service: Deactivated successfully. Jul 15 05:20:06.378230 systemd[1]: session-15.scope: Deactivated successfully. Jul 15 05:20:06.380136 systemd-logind[1577]: Session 15 logged out. Waiting for processes to exit. Jul 15 05:20:06.381764 systemd-logind[1577]: Removed session 15. Jul 15 05:20:06.527111 systemd[1]: Started sshd@15-157.180.32.153:22-139.178.89.65:35440.service - OpenSSH per-connection server daemon (139.178.89.65:35440). Jul 15 05:20:06.863999 containerd[1608]: time="2025-07-15T05:20:06.863899899Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"e7ac1f76359aaec2bfa66224a5e055541a7d0efcc6285230652f0d430b8b25c8\" pid:6229 exited_at:{seconds:1752556806 nanos:863422130}" Jul 15 05:20:07.543110 sshd[6214]: Accepted publickey for core from 139.178.89.65 port 35440 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:20:07.546510 sshd-session[6214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:20:07.555568 systemd-logind[1577]: New session 16 of user core. Jul 15 05:20:07.561447 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jul 15 05:20:09.910912 sshd[6238]: Connection closed by 139.178.89.65 port 35440 Jul 15 05:20:09.915034 sshd-session[6214]: pam_unix(sshd:session): session closed for user core Jul 15 05:20:09.935573 systemd-logind[1577]: Session 16 logged out. Waiting for processes to exit. Jul 15 05:20:09.935943 systemd[1]: sshd@15-157.180.32.153:22-139.178.89.65:35440.service: Deactivated successfully. Jul 15 05:20:09.939607 systemd[1]: session-16.scope: Deactivated successfully. Jul 15 05:20:09.939934 systemd[1]: session-16.scope: Consumed 554ms CPU time, 84.5M memory peak. Jul 15 05:20:09.942212 systemd-logind[1577]: Removed session 16. Jul 15 05:20:10.081064 systemd[1]: Started sshd@16-157.180.32.153:22-139.178.89.65:43878.service - OpenSSH per-connection server daemon (139.178.89.65:43878). Jul 15 05:20:11.099487 sshd[6259]: Accepted publickey for core from 139.178.89.65 port 43878 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:20:11.103137 sshd-session[6259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:20:11.113352 systemd-logind[1577]: New session 17 of user core. Jul 15 05:20:11.120557 systemd[1]: Started session-17.scope - Session 17 of User core. Jul 15 05:20:12.503423 sshd[6262]: Connection closed by 139.178.89.65 port 43878 Jul 15 05:20:12.505413 sshd-session[6259]: pam_unix(sshd:session): session closed for user core Jul 15 05:20:12.510171 systemd[1]: sshd@16-157.180.32.153:22-139.178.89.65:43878.service: Deactivated successfully. Jul 15 05:20:12.513062 systemd[1]: session-17.scope: Deactivated successfully. Jul 15 05:20:12.515978 systemd-logind[1577]: Session 17 logged out. Waiting for processes to exit. Jul 15 05:20:12.518118 systemd-logind[1577]: Removed session 17. Jul 15 05:20:12.672280 systemd[1]: Started sshd@17-157.180.32.153:22-139.178.89.65:43892.service - OpenSSH per-connection server daemon (139.178.89.65:43892). 
Jul 15 05:20:12.782344 containerd[1608]: time="2025-07-15T05:20:12.782199672Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"254c8acdf7e85fc3b2e7a0d3328f82a33c76505460bf6a808ecfda346b0d6962\" pid:6283 exited_at:{seconds:1752556812 nanos:781654112}" Jul 15 05:20:13.664642 sshd[6289]: Accepted publickey for core from 139.178.89.65 port 43892 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:20:13.666335 sshd-session[6289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:20:13.673293 systemd-logind[1577]: New session 18 of user core. Jul 15 05:20:13.675373 systemd[1]: Started session-18.scope - Session 18 of User core. Jul 15 05:20:14.525618 sshd[6296]: Connection closed by 139.178.89.65 port 43892 Jul 15 05:20:14.526524 sshd-session[6289]: pam_unix(sshd:session): session closed for user core Jul 15 05:20:14.531633 systemd[1]: sshd@17-157.180.32.153:22-139.178.89.65:43892.service: Deactivated successfully. Jul 15 05:20:14.535050 systemd[1]: session-18.scope: Deactivated successfully. Jul 15 05:20:14.536687 systemd-logind[1577]: Session 18 logged out. Waiting for processes to exit. Jul 15 05:20:14.539894 systemd-logind[1577]: Removed session 18. Jul 15 05:20:19.703594 systemd[1]: Started sshd@18-157.180.32.153:22-139.178.89.65:54926.service - OpenSSH per-connection server daemon (139.178.89.65:54926). Jul 15 05:20:20.760671 sshd[6313]: Accepted publickey for core from 139.178.89.65 port 54926 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:20:20.763942 sshd-session[6313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:20:20.772468 systemd-logind[1577]: New session 19 of user core. Jul 15 05:20:20.777454 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jul 15 05:20:21.705620 sshd[6316]: Connection closed by 139.178.89.65 port 54926
Jul 15 05:20:21.707491 sshd-session[6313]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:21.712653 systemd-logind[1577]: Session 19 logged out. Waiting for processes to exit.
Jul 15 05:20:21.712843 systemd[1]: sshd@18-157.180.32.153:22-139.178.89.65:54926.service: Deactivated successfully.
Jul 15 05:20:21.715675 systemd[1]: session-19.scope: Deactivated successfully.
Jul 15 05:20:21.718136 systemd-logind[1577]: Removed session 19.
Jul 15 05:20:26.434006 containerd[1608]: time="2025-07-15T05:20:26.433934868Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1d0a6ddf65c9eabd203e65bc5725a394c99e08c5ffb9eaf85ad442e7430c5c93\" id:\"9ec05b02cd350c9b3e64661c94c88d3e5b5559df8bbd77f71a0c6b9756df624e\" pid:6339 exited_at:{seconds:1752556826 nanos:433655314}"
Jul 15 05:20:26.876861 systemd[1]: Started sshd@19-157.180.32.153:22-139.178.89.65:54928.service - OpenSSH per-connection server daemon (139.178.89.65:54928).
Jul 15 05:20:27.898535 sshd[6350]: Accepted publickey for core from 139.178.89.65 port 54928 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:27.901197 sshd-session[6350]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:27.907624 systemd-logind[1577]: New session 20 of user core.
Jul 15 05:20:27.915415 systemd[1]: Started session-20.scope - Session 20 of User core.
Jul 15 05:20:28.964384 sshd[6353]: Connection closed by 139.178.89.65 port 54928
Jul 15 05:20:28.965047 sshd-session[6350]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:28.970939 systemd[1]: sshd@19-157.180.32.153:22-139.178.89.65:54928.service: Deactivated successfully.
Jul 15 05:20:28.974069 systemd[1]: session-20.scope: Deactivated successfully.
Jul 15 05:20:28.975613 systemd-logind[1577]: Session 20 logged out. Waiting for processes to exit.
Jul 15 05:20:28.977042 systemd-logind[1577]: Removed session 20.
Jul 15 05:20:36.888963 containerd[1608]: time="2025-07-15T05:20:36.888917321Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"f0f4ae151a573b2c3ca11812456b330dcdd3db6fcbafbc4fe2d1f52cf444aef1\" pid:6376 exited_at:{seconds:1752556836 nanos:887909376}"
Jul 15 05:20:39.997511 containerd[1608]: time="2025-07-15T05:20:39.997181351Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2a96ae7bd5ca1a2349cf0af2bbdfcabe2b74960ff2c8621c8469e5e0f1fa17ef\" id:\"6493c0bd8fd3b08e879fed0c67d569ae935574032328a86dfc00b522e1a65345\" pid:6399 exited_at:{seconds:1752556839 nanos:996882410}"
Jul 15 05:20:42.785026 containerd[1608]: time="2025-07-15T05:20:42.784963537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"beb15ea80d0b27a3296a404cddd85a6296b726e2f4fa2fe16ae0fbb7f363b75a\" id:\"a02c5ac86473738ba384d948e9f90835863f7684e39c87af89296187fe32a79e\" pid:6421 exited_at:{seconds:1752556842 nanos:753041952}"
Jul 15 05:20:45.157367 systemd[1]: cri-containerd-59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc.scope: Deactivated successfully.
Jul 15 05:20:45.157683 systemd[1]: cri-containerd-59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc.scope: Consumed 3.330s CPU time, 88M memory peak, 106.2M read from disk.
Jul 15 05:20:45.246761 containerd[1608]: time="2025-07-15T05:20:45.246190956Z" level=info msg="TaskExit event in podsandbox handler container_id:\"59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc\" id:\"59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc\" pid:2580 exit_status:1 exited_at:{seconds:1752556845 nanos:224965200}"
Jul 15 05:20:45.248493 containerd[1608]: time="2025-07-15T05:20:45.248457556Z" level=info msg="received exit event container_id:\"59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc\" id:\"59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc\" pid:2580 exit_status:1 exited_at:{seconds:1752556845 nanos:224965200}"
Jul 15 05:20:45.337771 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc-rootfs.mount: Deactivated successfully.
Jul 15 05:20:45.370967 systemd[1]: cri-containerd-cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366.scope: Deactivated successfully.
Jul 15 05:20:45.371347 systemd[1]: cri-containerd-cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366.scope: Consumed 1.205s CPU time, 36.8M memory peak, 57M read from disk.
Jul 15 05:20:45.379127 containerd[1608]: time="2025-07-15T05:20:45.378984550Z" level=info msg="received exit event container_id:\"cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366\" id:\"cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366\" pid:2610 exit_status:1 exited_at:{seconds:1752556845 nanos:377616736}"
Jul 15 05:20:45.380884 containerd[1608]: time="2025-07-15T05:20:45.380484004Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366\" id:\"cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366\" pid:2610 exit_status:1 exited_at:{seconds:1752556845 nanos:377616736}"
Jul 15 05:20:45.399810 kubelet[2752]: E0715 05:20:45.399723 2752 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:55556->10.0.0.2:2379: read: connection timed out"
Jul 15 05:20:45.430528 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366-rootfs.mount: Deactivated successfully.
Jul 15 05:20:45.760577 kubelet[2752]: I0715 05:20:45.760541 2752 scope.go:117] "RemoveContainer" containerID="59f00e5231c6853387d462524bb9a6a5c2f8dc219c14950e690bd6e9a64484dc"
Jul 15 05:20:45.760861 kubelet[2752]: I0715 05:20:45.760815 2752 scope.go:117] "RemoveContainer" containerID="cada33e1bba514d30c3f9f702dbb636c7e399eca8d67bc5c76d2e2842005f366"
Jul 15 05:20:45.796148 containerd[1608]: time="2025-07-15T05:20:45.795592909Z" level=info msg="CreateContainer within sandbox \"d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jul 15 05:20:45.798301 containerd[1608]: time="2025-07-15T05:20:45.798277840Z" level=info msg="CreateContainer within sandbox \"70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jul 15 05:20:45.837088 systemd[1]: cri-containerd-e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6.scope: Deactivated successfully.
Jul 15 05:20:45.837463 systemd[1]: cri-containerd-e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6.scope: Consumed 13.615s CPU time, 108M memory peak, 67.4M read from disk.
Jul 15 05:20:45.848095 containerd[1608]: time="2025-07-15T05:20:45.848025861Z" level=info msg="received exit event container_id:\"e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6\" id:\"e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6\" pid:3067 exit_status:1 exited_at:{seconds:1752556845 nanos:842295993}"
Jul 15 05:20:45.849091 containerd[1608]: time="2025-07-15T05:20:45.849071388Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6\" id:\"e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6\" pid:3067 exit_status:1 exited_at:{seconds:1752556845 nanos:842295993}"
Jul 15 05:20:45.905522 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6-rootfs.mount: Deactivated successfully.
Jul 15 05:20:45.916258 containerd[1608]: time="2025-07-15T05:20:45.914101790Z" level=info msg="Container 38a557a07a48eb8246593ec3d0c18a095ecd1914bd5a820b87ed855b155b4856: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:20:45.918432 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1916554864.mount: Deactivated successfully.
Jul 15 05:20:45.919474 containerd[1608]: time="2025-07-15T05:20:45.918622778Z" level=info msg="Container 9b36e1862cffc89924beafa3fa49a9da691b3e669642689542a628f00947d4f7: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:20:45.928364 containerd[1608]: time="2025-07-15T05:20:45.928276893Z" level=info msg="CreateContainer within sandbox \"70d41083680083c5d0e665e4609b79a3db984331599e60a099b2b22db72e8030\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"9b36e1862cffc89924beafa3fa49a9da691b3e669642689542a628f00947d4f7\""
Jul 15 05:20:45.930631 containerd[1608]: time="2025-07-15T05:20:45.930510391Z" level=info msg="CreateContainer within sandbox \"d205ab30e83028c0cf057a59a1970a1bf0d4651b9a49349e0e4c6ca815fbea18\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"38a557a07a48eb8246593ec3d0c18a095ecd1914bd5a820b87ed855b155b4856\""
Jul 15 05:20:45.933356 containerd[1608]: time="2025-07-15T05:20:45.933312074Z" level=info msg="StartContainer for \"9b36e1862cffc89924beafa3fa49a9da691b3e669642689542a628f00947d4f7\""
Jul 15 05:20:45.933484 containerd[1608]: time="2025-07-15T05:20:45.933321071Z" level=info msg="StartContainer for \"38a557a07a48eb8246593ec3d0c18a095ecd1914bd5a820b87ed855b155b4856\""
Jul 15 05:20:45.938742 containerd[1608]: time="2025-07-15T05:20:45.938693065Z" level=info msg="connecting to shim 9b36e1862cffc89924beafa3fa49a9da691b3e669642689542a628f00947d4f7" address="unix:///run/containerd/s/b6fcea72191e540e2fd32b6df2b0d1d98ceb474a8d960fae0c4601afa26a24ca" protocol=ttrpc version=3
Jul 15 05:20:45.951060 containerd[1608]: time="2025-07-15T05:20:45.938692053Z" level=info msg="connecting to shim 38a557a07a48eb8246593ec3d0c18a095ecd1914bd5a820b87ed855b155b4856" address="unix:///run/containerd/s/6a7a2c535117b684db63279c3de52cd2f7c3fb20cc7d49822837980c435f27ff" protocol=ttrpc version=3
Jul 15 05:20:46.006823 systemd[1]: Started cri-containerd-38a557a07a48eb8246593ec3d0c18a095ecd1914bd5a820b87ed855b155b4856.scope - libcontainer container 38a557a07a48eb8246593ec3d0c18a095ecd1914bd5a820b87ed855b155b4856.
Jul 15 05:20:46.016592 systemd[1]: Started cri-containerd-9b36e1862cffc89924beafa3fa49a9da691b3e669642689542a628f00947d4f7.scope - libcontainer container 9b36e1862cffc89924beafa3fa49a9da691b3e669642689542a628f00947d4f7.
Jul 15 05:20:46.111903 containerd[1608]: time="2025-07-15T05:20:46.111666967Z" level=info msg="StartContainer for \"38a557a07a48eb8246593ec3d0c18a095ecd1914bd5a820b87ed855b155b4856\" returns successfully"
Jul 15 05:20:46.112220 containerd[1608]: time="2025-07-15T05:20:46.111742181Z" level=info msg="StartContainer for \"9b36e1862cffc89924beafa3fa49a9da691b3e669642689542a628f00947d4f7\" returns successfully"
Jul 15 05:20:46.334809 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount177519035.mount: Deactivated successfully.
Jul 15 05:20:46.790263 kubelet[2752]: I0715 05:20:46.789603 2752 scope.go:117] "RemoveContainer" containerID="e5a691b53f54ca31b238e2034d8e653e41483bb871510ba10e46535e9cf9adb6"
Jul 15 05:20:46.796561 containerd[1608]: time="2025-07-15T05:20:46.796483274Z" level=info msg="CreateContainer within sandbox \"b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jul 15 05:20:46.811320 containerd[1608]: time="2025-07-15T05:20:46.810292300Z" level=info msg="Container c3e9556fcf29868ecfa00b8ad08ca2e2835b6a3051d8b948f96a480aaa88b83b: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:20:46.815204 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2593552737.mount: Deactivated successfully.
Jul 15 05:20:46.821802 containerd[1608]: time="2025-07-15T05:20:46.821765942Z" level=info msg="CreateContainer within sandbox \"b144c3781e4cff4fec2d0c4e7b144c997453ef8bfc4aa44825523c4b0a03c8ae\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c3e9556fcf29868ecfa00b8ad08ca2e2835b6a3051d8b948f96a480aaa88b83b\""
Jul 15 05:20:46.822783 containerd[1608]: time="2025-07-15T05:20:46.822760032Z" level=info msg="StartContainer for \"c3e9556fcf29868ecfa00b8ad08ca2e2835b6a3051d8b948f96a480aaa88b83b\""
Jul 15 05:20:46.824953 containerd[1608]: time="2025-07-15T05:20:46.824926532Z" level=info msg="connecting to shim c3e9556fcf29868ecfa00b8ad08ca2e2835b6a3051d8b948f96a480aaa88b83b" address="unix:///run/containerd/s/90980f45d16bec553981b7c1ae8255dd251034dc91fa42846f7639597d3a7925" protocol=ttrpc version=3
Jul 15 05:20:46.861560 systemd[1]: Started cri-containerd-c3e9556fcf29868ecfa00b8ad08ca2e2835b6a3051d8b948f96a480aaa88b83b.scope - libcontainer container c3e9556fcf29868ecfa00b8ad08ca2e2835b6a3051d8b948f96a480aaa88b83b.
Jul 15 05:20:46.899059 containerd[1608]: time="2025-07-15T05:20:46.898995497Z" level=info msg="StartContainer for \"c3e9556fcf29868ecfa00b8ad08ca2e2835b6a3051d8b948f96a480aaa88b83b\" returns successfully"
Jul 15 05:20:49.518988 kubelet[2752]: E0715 05:20:49.516485 2752 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:55366->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4396-0-0-n-153ccb2e88.18525528ddf288f6 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4396-0-0-n-153ccb2e88,UID:4438e3662bee56f52349e1daec9c5cd7,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4396-0-0-n-153ccb2e88,},FirstTimestamp:2025-07-15 05:20:39.011256566 +0000 UTC m=+181.435255954,LastTimestamp:2025-07-15 05:20:39.011256566 +0000 UTC m=+181.435255954,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396-0-0-n-153ccb2e88,}"