Jun 21 04:39:58.805868 kernel: Linux version 6.12.34-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri Jun 20 23:59:04 -00 2025
Jun 21 04:39:58.805889 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d3c0be6f64121476b0313f5d7d7bbd73e21bc1a219aacd38b8006b291898eca1
Jun 21 04:39:58.805900 kernel: BIOS-provided physical RAM map:
Jun 21 04:39:58.805907 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
Jun 21 04:39:58.805913 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable
Jun 21 04:39:58.805919 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jun 21 04:39:58.805927 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable
Jun 21 04:39:58.805934 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jun 21 04:39:58.805942 kernel: BIOS-e820: [mem 0x000000000080c000-0x0000000000810fff] usable
Jun 21 04:39:58.805948 kernel: BIOS-e820: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jun 21 04:39:58.805955 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009bd3efff] usable
Jun 21 04:39:58.805961 kernel: BIOS-e820: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Jun 21 04:39:58.805968 kernel: BIOS-e820: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Jun 21 04:39:58.805974 kernel: BIOS-e820: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Jun 21 04:39:58.805985 kernel: BIOS-e820: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Jun 21 04:39:58.805992 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Jun 21 04:39:58.805999 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Jun 21 04:39:58.806006 kernel: BIOS-e820: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Jun 21 04:39:58.806013 kernel: BIOS-e820: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Jun 21 04:39:58.806020 kernel: BIOS-e820: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Jun 21 04:39:58.806027 kernel: BIOS-e820: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Jun 21 04:39:58.806033 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Jun 21 04:39:58.806040 kernel: BIOS-e820: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jun 21 04:39:58.806047 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jun 21 04:39:58.806054 kernel: BIOS-e820: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jun 21 04:39:58.806063 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Jun 21 04:39:58.806070 kernel: NX (Execute Disable) protection: active
Jun 21 04:39:58.806077 kernel: APIC: Static calls initialized
Jun 21 04:39:58.806084 kernel: e820: update [mem 0x9b320018-0x9b329c57] usable ==> usable
Jun 21 04:39:58.806091 kernel: e820: update [mem 0x9b2e3018-0x9b31fe57] usable ==> usable
Jun 21 04:39:58.806098 kernel: extended physical RAM map:
Jun 21 04:39:58.806105 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable
Jun 21 04:39:58.806112 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable
Jun 21 04:39:58.806119 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS
Jun 21 04:39:58.806126 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable
Jun 21 04:39:58.806133 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS
Jun 21 04:39:58.806142 kernel: reserve setup_data: [mem 0x000000000080c000-0x0000000000810fff] usable
Jun 21 04:39:58.806164 kernel: reserve setup_data: [mem 0x0000000000811000-0x00000000008fffff] ACPI NVS
Jun 21 04:39:58.806171 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b2e3017] usable
Jun 21 04:39:58.806179 kernel: reserve setup_data: [mem 0x000000009b2e3018-0x000000009b31fe57] usable
Jun 21 04:39:58.806190 kernel: reserve setup_data: [mem 0x000000009b31fe58-0x000000009b320017] usable
Jun 21 04:39:58.806197 kernel: reserve setup_data: [mem 0x000000009b320018-0x000000009b329c57] usable
Jun 21 04:39:58.806206 kernel: reserve setup_data: [mem 0x000000009b329c58-0x000000009bd3efff] usable
Jun 21 04:39:58.806214 kernel: reserve setup_data: [mem 0x000000009bd3f000-0x000000009bdfffff] reserved
Jun 21 04:39:58.806221 kernel: reserve setup_data: [mem 0x000000009be00000-0x000000009c8ecfff] usable
Jun 21 04:39:58.806228 kernel: reserve setup_data: [mem 0x000000009c8ed000-0x000000009cb6cfff] reserved
Jun 21 04:39:58.806235 kernel: reserve setup_data: [mem 0x000000009cb6d000-0x000000009cb7efff] ACPI data
Jun 21 04:39:58.806243 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS
Jun 21 04:39:58.806250 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009ce90fff] usable
Jun 21 04:39:58.806257 kernel: reserve setup_data: [mem 0x000000009ce91000-0x000000009ce94fff] reserved
Jun 21 04:39:58.806264 kernel: reserve setup_data: [mem 0x000000009ce95000-0x000000009ce96fff] ACPI NVS
Jun 21 04:39:58.806274 kernel: reserve setup_data: [mem 0x000000009ce97000-0x000000009cedbfff] usable
Jun 21 04:39:58.806290 kernel: reserve setup_data: [mem 0x000000009cedc000-0x000000009cf5ffff] reserved
Jun 21 04:39:58.806298 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS
Jun 21 04:39:58.806305 kernel: reserve setup_data: [mem 0x00000000e0000000-0x00000000efffffff] reserved
Jun 21 04:39:58.806312 kernel: reserve setup_data: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jun 21 04:39:58.806319 kernel: reserve setup_data: [mem 0x00000000ffc00000-0x00000000ffffffff] reserved
Jun 21 04:39:58.806327 kernel: reserve setup_data: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Jun 21 04:39:58.806334 kernel: efi: EFI v2.7 by EDK II
Jun 21 04:39:58.806341 kernel: efi: SMBIOS=0x9c988000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b9e4198 RNG=0x9cb73018
Jun 21 04:39:58.806348 kernel: random: crng init done
Jun 21 04:39:58.806356 kernel: efi: Remove mem151: MMIO range=[0xffc00000-0xffffffff] (4MB) from e820 map
Jun 21 04:39:58.806363 kernel: e820: remove [mem 0xffc00000-0xffffffff] reserved
Jun 21 04:39:58.806373 kernel: secureboot: Secure boot disabled
Jun 21 04:39:58.806380 kernel: SMBIOS 2.8 present.
Jun 21 04:39:58.806387 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS unknown 02/02/2022
Jun 21 04:39:58.806395 kernel: DMI: Memory slots populated: 1/1
Jun 21 04:39:58.806402 kernel: Hypervisor detected: KVM
Jun 21 04:39:58.806409 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jun 21 04:39:58.806416 kernel: kvm-clock: using sched offset of 3575698852 cycles
Jun 21 04:39:58.806424 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jun 21 04:39:58.806432 kernel: tsc: Detected 2794.746 MHz processor
Jun 21 04:39:58.806439 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jun 21 04:39:58.806447 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jun 21 04:39:58.806456 kernel: last_pfn = 0x9cedc max_arch_pfn = 0x400000000
Jun 21 04:39:58.806464 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
Jun 21 04:39:58.806471 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jun 21 04:39:58.806478 kernel: Using GB pages for direct mapping
Jun 21 04:39:58.806486 kernel: ACPI: Early table checksum verification disabled
Jun 21 04:39:58.806493 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS )
Jun 21 04:39:58.806501 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013)
Jun 21 04:39:58.806508 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jun 21 04:39:58.806516 kernel: ACPI: DSDT 0x000000009CB7A000 0021BA (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jun 21 04:39:58.806532 kernel: ACPI: FACS 0x000000009CBDD000 000040
Jun 21 04:39:58.806539 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jun 21 04:39:58.806547 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jun 21 04:39:58.806554 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jun 21 04:39:58.806562 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jun 21 04:39:58.806569 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013)
Jun 21 04:39:58.806577 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3]
Jun 21 04:39:58.806584 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cb7a000-0x9cb7c1b9]
Jun 21 04:39:58.806594 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f]
Jun 21 04:39:58.806601 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f]
Jun 21 04:39:58.806608 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037]
Jun 21 04:39:58.806616 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b]
Jun 21 04:39:58.806623 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027]
Jun 21 04:39:58.806631 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037]
Jun 21 04:39:58.806638 kernel: No NUMA configuration found
Jun 21 04:39:58.806645 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cedbfff]
Jun 21 04:39:58.806653 kernel: NODE_DATA(0) allocated [mem 0x9ce36dc0-0x9ce3dfff]
Jun 21 04:39:58.806660 kernel: Zone ranges:
Jun 21 04:39:58.806670 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jun 21 04:39:58.806677 kernel: DMA32 [mem 0x0000000001000000-0x000000009cedbfff]
Jun 21 04:39:58.806684 kernel: Normal empty
Jun 21 04:39:58.806692 kernel: Device empty
Jun 21 04:39:58.806699 kernel: Movable zone start for each node
Jun 21 04:39:58.806706 kernel: Early memory node ranges
Jun 21 04:39:58.806714 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
Jun 21 04:39:58.806721 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff]
Jun 21 04:39:58.806728 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff]
Jun 21 04:39:58.806738 kernel: node 0: [mem 0x000000000080c000-0x0000000000810fff]
Jun 21 04:39:58.806746 kernel: node 0: [mem 0x0000000000900000-0x000000009bd3efff]
Jun 21 04:39:58.806753 kernel: node 0: [mem 0x000000009be00000-0x000000009c8ecfff]
Jun 21 04:39:58.806760 kernel: node 0: [mem 0x000000009cbff000-0x000000009ce90fff]
Jun 21 04:39:58.806767 kernel: node 0: [mem 0x000000009ce97000-0x000000009cedbfff]
Jun 21 04:39:58.806775 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cedbfff]
Jun 21 04:39:58.806782 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jun 21 04:39:58.806789 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
Jun 21 04:39:58.806807 kernel: On node 0, zone DMA: 8 pages in unavailable ranges
Jun 21 04:39:58.806814 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jun 21 04:39:58.806822 kernel: On node 0, zone DMA: 239 pages in unavailable ranges
Jun 21 04:39:58.806829 kernel: On node 0, zone DMA32: 193 pages in unavailable ranges
Jun 21 04:39:58.806839 kernel: On node 0, zone DMA32: 786 pages in unavailable ranges
Jun 21 04:39:58.806846 kernel: On node 0, zone DMA32: 6 pages in unavailable ranges
Jun 21 04:39:58.806854 kernel: On node 0, zone DMA32: 12580 pages in unavailable ranges
Jun 21 04:39:58.806862 kernel: ACPI: PM-Timer IO Port: 0x608
Jun 21 04:39:58.806870 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jun 21 04:39:58.806879 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jun 21 04:39:58.806887 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jun 21 04:39:58.806895 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jun 21 04:39:58.806903 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jun 21 04:39:58.806910 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jun 21 04:39:58.806918 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jun 21 04:39:58.806925 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jun 21 04:39:58.806933 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Jun 21 04:39:58.806941 kernel: TSC deadline timer available
Jun 21 04:39:58.806948 kernel: CPU topo: Max. logical packages: 1
Jun 21 04:39:58.806958 kernel: CPU topo: Max. logical dies: 1
Jun 21 04:39:58.806966 kernel: CPU topo: Max. dies per package: 1
Jun 21 04:39:58.806973 kernel: CPU topo: Max. threads per core: 1
Jun 21 04:39:58.806981 kernel: CPU topo: Num. cores per package: 4
Jun 21 04:39:58.806988 kernel: CPU topo: Num. threads per package: 4
Jun 21 04:39:58.806996 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs
Jun 21 04:39:58.807003 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jun 21 04:39:58.807011 kernel: kvm-guest: KVM setup pv remote TLB flush
Jun 21 04:39:58.807019 kernel: kvm-guest: setup PV sched yield
Jun 21 04:39:58.807028 kernel: [mem 0x9d000000-0xdfffffff] available for PCI devices
Jun 21 04:39:58.807036 kernel: Booting paravirtualized kernel on KVM
Jun 21 04:39:58.807044 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jun 21 04:39:58.807052 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
Jun 21 04:39:58.807059 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288
Jun 21 04:39:58.807067 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152
Jun 21 04:39:58.807075 kernel: pcpu-alloc: [0] 0 1 2 3
Jun 21 04:39:58.807082 kernel: kvm-guest: PV spinlocks enabled
Jun 21 04:39:58.807090 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
Jun 21 04:39:58.807101 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d3c0be6f64121476b0313f5d7d7bbd73e21bc1a219aacd38b8006b291898eca1
Jun 21 04:39:58.807109 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jun 21 04:39:58.807117 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jun 21 04:39:58.807125 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jun 21 04:39:58.807132 kernel: Fallback order for Node 0: 0
Jun 21 04:39:58.807140 kernel: Built 1 zonelists, mobility grouping on. Total pages: 641450
Jun 21 04:39:58.807160 kernel: Policy zone: DMA32
Jun 21 04:39:58.807168 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jun 21 04:39:58.807178 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Jun 21 04:39:58.807185 kernel: ftrace: allocating 40093 entries in 157 pages
Jun 21 04:39:58.807193 kernel: ftrace: allocated 157 pages with 5 groups
Jun 21 04:39:58.807201 kernel: Dynamic Preempt: voluntary
Jun 21 04:39:58.807209 kernel: rcu: Preemptible hierarchical RCU implementation.
Jun 21 04:39:58.807217 kernel: rcu: RCU event tracing is enabled.
Jun 21 04:39:58.807225 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Jun 21 04:39:58.807233 kernel: Trampoline variant of Tasks RCU enabled.
Jun 21 04:39:58.807241 kernel: Rude variant of Tasks RCU enabled.
Jun 21 04:39:58.807248 kernel: Tracing variant of Tasks RCU enabled.
Jun 21 04:39:58.807258 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jun 21 04:39:58.807266 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Jun 21 04:39:58.807274 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jun 21 04:39:58.807282 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jun 21 04:39:58.807289 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Jun 21 04:39:58.807297 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
Jun 21 04:39:58.807305 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jun 21 04:39:58.807313 kernel: Console: colour dummy device 80x25
Jun 21 04:39:58.807320 kernel: printk: legacy console [ttyS0] enabled
Jun 21 04:39:58.807330 kernel: ACPI: Core revision 20240827
Jun 21 04:39:58.807338 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Jun 21 04:39:58.807346 kernel: APIC: Switch to symmetric I/O mode setup
Jun 21 04:39:58.807354 kernel: x2apic enabled
Jun 21 04:39:58.807361 kernel: APIC: Switched APIC routing to: physical x2apic
Jun 21 04:39:58.807369 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
Jun 21 04:39:58.807377 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
Jun 21 04:39:58.807384 kernel: kvm-guest: setup PV IPIs
Jun 21 04:39:58.807392 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jun 21 04:39:58.807402 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848ddd4e75, max_idle_ns: 440795346320 ns
Jun 21 04:39:58.807410 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794746)
Jun 21 04:39:58.807418 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jun 21 04:39:58.807426 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jun 21 04:39:58.807433 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jun 21 04:39:58.807441 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jun 21 04:39:58.807449 kernel: Spectre V2 : Mitigation: Retpolines
Jun 21 04:39:58.807457 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jun 21 04:39:58.807467 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jun 21 04:39:58.807475 kernel: RETBleed: Mitigation: untrained return thunk
Jun 21 04:39:58.807482 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jun 21 04:39:58.807490 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jun 21 04:39:58.807498 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jun 21 04:39:58.807528 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jun 21 04:39:58.807536 kernel: x86/bugs: return thunk changed
Jun 21 04:39:58.807545 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jun 21 04:39:58.807555 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jun 21 04:39:58.807567 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jun 21 04:39:58.807575 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jun 21 04:39:58.807583 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jun 21 04:39:58.807591 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jun 21 04:39:58.807599 kernel: Freeing SMP alternatives memory: 32K
Jun 21 04:39:58.807606 kernel: pid_max: default: 32768 minimum: 301
Jun 21 04:39:58.807614 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jun 21 04:39:58.807622 kernel: landlock: Up and running.
Jun 21 04:39:58.807629 kernel: SELinux: Initializing.
Jun 21 04:39:58.807639 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jun 21 04:39:58.807647 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jun 21 04:39:58.807655 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jun 21 04:39:58.807663 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jun 21 04:39:58.807671 kernel: ... version: 0
Jun 21 04:39:58.807678 kernel: ... bit width: 48
Jun 21 04:39:58.807686 kernel: ... generic registers: 6
Jun 21 04:39:58.807693 kernel: ... value mask: 0000ffffffffffff
Jun 21 04:39:58.807701 kernel: ... max period: 00007fffffffffff
Jun 21 04:39:58.807711 kernel: ... fixed-purpose events: 0
Jun 21 04:39:58.807718 kernel: ... event mask: 000000000000003f
Jun 21 04:39:58.807726 kernel: signal: max sigframe size: 1776
Jun 21 04:39:58.807734 kernel: rcu: Hierarchical SRCU implementation.
Jun 21 04:39:58.807742 kernel: rcu: Max phase no-delay instances is 400.
Jun 21 04:39:58.807749 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jun 21 04:39:58.807757 kernel: smp: Bringing up secondary CPUs ...
Jun 21 04:39:58.807765 kernel: smpboot: x86: Booting SMP configuration:
Jun 21 04:39:58.807772 kernel: .... node #0, CPUs: #1 #2 #3
Jun 21 04:39:58.807782 kernel: smp: Brought up 1 node, 4 CPUs
Jun 21 04:39:58.807790 kernel: smpboot: Total of 4 processors activated (22357.96 BogoMIPS)
Jun 21 04:39:58.807798 kernel: Memory: 2422668K/2565800K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54424K init, 2544K bss, 137196K reserved, 0K cma-reserved)
Jun 21 04:39:58.807805 kernel: devtmpfs: initialized
Jun 21 04:39:58.807813 kernel: x86/mm: Memory block size: 128MB
Jun 21 04:39:58.807821 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes)
Jun 21 04:39:58.807829 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes)
Jun 21 04:39:58.807837 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00811000-0x008fffff] (978944 bytes)
Jun 21 04:39:58.807844 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes)
Jun 21 04:39:58.807854 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9ce95000-0x9ce96fff] (8192 bytes)
Jun 21 04:39:58.807862 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes)
Jun 21 04:39:58.807870 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jun 21 04:39:58.807878 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Jun 21 04:39:58.807885 kernel: pinctrl core: initialized pinctrl subsystem
Jun 21 04:39:58.807893 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jun 21 04:39:58.807901 kernel: audit: initializing netlink subsys (disabled)
Jun 21 04:39:58.807908 kernel: audit: type=2000 audit(1750480797.058:1): state=initialized audit_enabled=0 res=1
Jun 21 04:39:58.807918 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jun 21 04:39:58.807926 kernel: thermal_sys: Registered thermal governor 'user_space'
Jun 21 04:39:58.807933 kernel: cpuidle: using governor menu
Jun 21 04:39:58.807941 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jun 21 04:39:58.807949 kernel: dca service started, version 1.12.1
Jun 21 04:39:58.807956 kernel: PCI: ECAM [mem 0xe0000000-0xefffffff] (base 0xe0000000) for domain 0000 [bus 00-ff]
Jun 21 04:39:58.807964 kernel: PCI: Using configuration type 1 for base access
Jun 21 04:39:58.807972 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jun 21 04:39:58.807979 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jun 21 04:39:58.807989 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jun 21 04:39:58.807997 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jun 21 04:39:58.808004 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jun 21 04:39:58.808012 kernel: ACPI: Added _OSI(Module Device)
Jun 21 04:39:58.808019 kernel: ACPI: Added _OSI(Processor Device)
Jun 21 04:39:58.808027 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jun 21 04:39:58.808035 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jun 21 04:39:58.808042 kernel: ACPI: Interpreter enabled
Jun 21 04:39:58.808050 kernel: ACPI: PM: (supports S0 S3 S5)
Jun 21 04:39:58.808057 kernel: ACPI: Using IOAPIC for interrupt routing
Jun 21 04:39:58.808068 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jun 21 04:39:58.808075 kernel: PCI: Using E820 reservations for host bridge windows
Jun 21 04:39:58.808083 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jun 21 04:39:58.808090 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jun 21 04:39:58.808289 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jun 21 04:39:58.808409 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jun 21 04:39:58.808532 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jun 21 04:39:58.808557 kernel: PCI host bridge to bus 0000:00
Jun 21 04:39:58.808686 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jun 21 04:39:58.808793 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jun 21 04:39:58.808900 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jun 21 04:39:58.809003 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xdfffffff window]
Jun 21 04:39:58.809107 kernel: pci_bus 0000:00: root bus resource [mem 0xf0000000-0xfebfffff window]
Jun 21 04:39:58.809243 kernel: pci_bus 0000:00: root bus resource [mem 0x380000000000-0x3807ffffffff window]
Jun 21 04:39:58.809353 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jun 21 04:39:58.809508 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jun 21 04:39:58.809654 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint
Jun 21 04:39:58.809770 kernel: pci 0000:00:01.0: BAR 0 [mem 0xc0000000-0xc0ffffff pref]
Jun 21 04:39:58.809886 kernel: pci 0000:00:01.0: BAR 2 [mem 0xc1044000-0xc1044fff]
Jun 21 04:39:58.810000 kernel: pci 0000:00:01.0: ROM [mem 0xffff0000-0xffffffff pref]
Jun 21 04:39:58.810120 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jun 21 04:39:58.810263 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jun 21 04:39:58.810383 kernel: pci 0000:00:02.0: BAR 0 [io 0x6100-0x611f]
Jun 21 04:39:58.810498 kernel: pci 0000:00:02.0: BAR 1 [mem 0xc1043000-0xc1043fff]
Jun 21 04:39:58.810627 kernel: pci 0000:00:02.0: BAR 4 [mem 0x380000000000-0x380000003fff 64bit pref]
Jun 21 04:39:58.810752 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jun 21 04:39:58.810868 kernel: pci 0000:00:03.0: BAR 0 [io 0x6000-0x607f]
Jun 21 04:39:58.810989 kernel: pci 0000:00:03.0: BAR 1 [mem 0xc1042000-0xc1042fff]
Jun 21 04:39:58.811103 kernel: pci 0000:00:03.0: BAR 4 [mem 0x380000004000-0x380000007fff 64bit pref]
Jun 21 04:39:58.811303 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jun 21 04:39:58.811422 kernel: pci 0000:00:04.0: BAR 0 [io 0x60e0-0x60ff]
Jun 21 04:39:58.811546 kernel: pci 0000:00:04.0: BAR 1 [mem 0xc1041000-0xc1041fff]
Jun 21 04:39:58.811664 kernel: pci 0000:00:04.0: BAR 4 [mem 0x380000008000-0x38000000bfff 64bit pref]
Jun 21 04:39:58.811778 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]
Jun 21 04:39:58.811907 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jun 21 04:39:58.812023 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jun 21 04:39:58.812160 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jun 21 04:39:58.812293 kernel: pci 0000:00:1f.2: BAR 4 [io 0x60c0-0x60df]
Jun 21 04:39:58.812407 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xc1040000-0xc1040fff]
Jun 21 04:39:58.812539 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jun 21 04:39:58.812661 kernel: pci 0000:00:1f.3: BAR 4 [io 0x6080-0x60bf]
Jun 21 04:39:58.812672 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jun 21 04:39:58.812680 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jun 21 04:39:58.812688 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jun 21 04:39:58.812696 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jun 21 04:39:58.812704 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jun 21 04:39:58.812711 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jun 21 04:39:58.812719 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jun 21 04:39:58.812727 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jun 21 04:39:58.812737 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jun 21 04:39:58.812745 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jun 21 04:39:58.812753 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jun 21 04:39:58.812761 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jun 21 04:39:58.812768 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jun 21 04:39:58.812776 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jun 21 04:39:58.812784 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jun 21 04:39:58.812792 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jun 21 04:39:58.812800 kernel: iommu: Default domain type: Translated
Jun 21 04:39:58.812810 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jun 21 04:39:58.812817 kernel: efivars: Registered efivars operations
Jun 21 04:39:58.812825 kernel: PCI: Using ACPI for IRQ routing
Jun 21 04:39:58.812833 kernel: PCI: pci_cache_line_size set to 64 bytes
Jun 21 04:39:58.812841 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff]
Jun 21 04:39:58.812848 kernel: e820: reserve RAM buffer [mem 0x00811000-0x008fffff]
Jun 21 04:39:58.812856 kernel: e820: reserve RAM buffer [mem 0x9b2e3018-0x9bffffff]
Jun 21 04:39:58.812864 kernel: e820: reserve RAM buffer [mem 0x9b320018-0x9bffffff]
Jun 21 04:39:58.812871 kernel: e820: reserve RAM buffer [mem 0x9bd3f000-0x9bffffff]
Jun 21 04:39:58.812881 kernel: e820: reserve RAM buffer [mem 0x9c8ed000-0x9fffffff]
Jun 21 04:39:58.812889 kernel: e820: reserve RAM buffer [mem 0x9ce91000-0x9fffffff]
Jun 21 04:39:58.812896 kernel: e820: reserve RAM buffer [mem 0x9cedc000-0x9fffffff]
Jun 21 04:39:58.813012 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jun 21 04:39:58.813126 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jun 21 04:39:58.813261 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jun 21 04:39:58.813272 kernel: vgaarb: loaded
Jun 21 04:39:58.813280 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Jun 21 04:39:58.813292 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Jun 21 04:39:58.813300 kernel: clocksource: Switched to clocksource kvm-clock
Jun 21 04:39:58.813308 kernel: VFS: Disk quotas dquot_6.6.0
Jun 21 04:39:58.813316 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jun 21 04:39:58.813323 kernel: pnp: PnP ACPI init
Jun 21 04:39:58.813446 kernel: system 00:05: [mem 0xe0000000-0xefffffff window] has been reserved
Jun 21 04:39:58.813474 kernel: pnp: PnP ACPI: found 6 devices
Jun 21 04:39:58.813484 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jun 21 04:39:58.813494 kernel: NET: Registered PF_INET protocol family
Jun 21 04:39:58.813502 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jun 21 04:39:58.813510 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Jun 21 04:39:58.813518 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jun 21 04:39:58.813534 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jun 21 04:39:58.813542 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Jun 21 04:39:58.813550 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Jun 21 04:39:58.813559 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jun 21 04:39:58.813567 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Jun 21 04:39:58.813577 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jun 21 04:39:58.813586 kernel: NET: Registered PF_XDP protocol family
Jun 21 04:39:58.813705 kernel: pci 0000:00:04.0: ROM [mem 0xfffc0000-0xffffffff pref]: can't claim; no compatible bridge window
Jun 21 04:39:58.813823 kernel: pci 0000:00:04.0: ROM [mem 0x9d000000-0x9d03ffff pref]: assigned
Jun 21 04:39:58.813929 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jun 21 04:39:58.814033 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jun 21 04:39:58.814137 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jun 21 04:39:58.814258 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xdfffffff window]
Jun 21 04:39:58.814367 kernel: pci_bus 0000:00: resource 8 [mem 0xf0000000-0xfebfffff window]
Jun 21 04:39:58.814471 kernel: pci_bus 0000:00: resource 9 [mem 0x380000000000-0x3807ffffffff window]
Jun 21 04:39:58.814481 kernel: PCI: CLS 0 bytes, default 64
Jun 21 04:39:58.814490 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848ddd4e75, max_idle_ns: 440795346320 ns
Jun 21 04:39:58.814498 kernel: Initialise system trusted keyrings
Jun 21 04:39:58.814506 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Jun 21 04:39:58.814514 kernel: Key type asymmetric registered
Jun 21 04:39:58.814530 kernel: Asymmetric key parser 'x509' registered
Jun 21 04:39:58.814541 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Jun 21 04:39:58.814550 kernel: io scheduler mq-deadline registered
Jun 21 04:39:58.814562 kernel: io scheduler kyber registered
Jun 21 04:39:58.814570 kernel: io scheduler bfq registered
Jun 21 04:39:58.814580 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Jun 21 04:39:58.814589 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Jun 21 04:39:58.814599 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Jun 21 04:39:58.814607 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
Jun 21 04:39:58.814615 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jun 21 04:39:58.814623 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jun 21 04:39:58.814631 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jun 21 04:39:58.814639 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jun 21 04:39:58.814647 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jun 21 04:39:58.814767 kernel: rtc_cmos 00:04: RTC can wake from S4
Jun 21 04:39:58.814781 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
Jun 21 04:39:58.814891 kernel: rtc_cmos 00:04: registered as rtc0
Jun 21 04:39:58.814999 kernel: rtc_cmos 00:04: setting system clock to 2025-06-21T04:39:58 UTC (1750480798)
Jun 21 04:39:58.815106 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jun 21 04:39:58.815116 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jun 21 04:39:58.815124 kernel: efifb: probing for efifb
Jun 21 04:39:58.815132 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k
Jun 21 04:39:58.815141 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1
Jun 21 04:39:58.815162 kernel: efifb: scrolling: redraw
Jun 21 04:39:58.815174 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
Jun 21 04:39:58.815182 kernel: Console: switching to colour frame buffer device 160x50
Jun 21 04:39:58.815190 kernel: fb0: EFI VGA frame buffer device
Jun 21 04:39:58.815198 kernel: pstore: Using crash dump compression: deflate
Jun 21 04:39:58.815206 kernel: pstore: Registered efi_pstore as persistent store backend
Jun 21 04:39:58.815215 kernel: NET: Registered PF_INET6 protocol family
Jun 21 04:39:58.815222 kernel: Segment Routing with IPv6
Jun 21 04:39:58.815231 kernel: In-situ OAM (IOAM) with IPv6
Jun 21 04:39:58.815239 kernel: NET: Registered PF_PACKET protocol family
Jun 21 04:39:58.815249 kernel: Key type dns_resolver registered
Jun 21 04:39:58.815257 kernel: IPI shorthand broadcast: enabled
Jun 21 04:39:58.815265 kernel: sched_clock: Marking stable (2842002817, 161848181)->(3028268278, -24417280)
Jun 21 04:39:58.815273 kernel: registered taskstats version 1
Jun 21 04:39:58.815281 kernel: Loading compiled-in X.509 certificates
Jun 21 04:39:58.815289 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.34-flatcar: ec4617d162e00e1890f71f252cdf44036a7b66f7'
Jun 21 04:39:58.815297 kernel: Demotion targets for Node 0: null
Jun 21 04:39:58.815305 kernel: Key type .fscrypt registered
Jun 21
04:39:58.815313 kernel: Key type fscrypt-provisioning registered Jun 21 04:39:58.815323 kernel: ima: No TPM chip found, activating TPM-bypass! Jun 21 04:39:58.815331 kernel: ima: Allocated hash algorithm: sha1 Jun 21 04:39:58.815339 kernel: ima: No architecture policies found Jun 21 04:39:58.815347 kernel: clk: Disabling unused clocks Jun 21 04:39:58.815355 kernel: Warning: unable to open an initial console. Jun 21 04:39:58.815363 kernel: Freeing unused kernel image (initmem) memory: 54424K Jun 21 04:39:58.815371 kernel: Write protecting the kernel read-only data: 24576k Jun 21 04:39:58.815379 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jun 21 04:39:58.815389 kernel: Run /init as init process Jun 21 04:39:58.815397 kernel: with arguments: Jun 21 04:39:58.815404 kernel: /init Jun 21 04:39:58.815412 kernel: with environment: Jun 21 04:39:58.815420 kernel: HOME=/ Jun 21 04:39:58.815428 kernel: TERM=linux Jun 21 04:39:58.815436 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jun 21 04:39:58.815445 systemd[1]: Successfully made /usr/ read-only. Jun 21 04:39:58.815456 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jun 21 04:39:58.815467 systemd[1]: Detected virtualization kvm. Jun 21 04:39:58.815476 systemd[1]: Detected architecture x86-64. Jun 21 04:39:58.815484 systemd[1]: Running in initrd. Jun 21 04:39:58.815492 systemd[1]: No hostname configured, using default hostname. Jun 21 04:39:58.815501 systemd[1]: Hostname set to . Jun 21 04:39:58.815509 systemd[1]: Initializing machine ID from VM UUID. Jun 21 04:39:58.815518 systemd[1]: Queued start job for default target initrd.target. 
Jun 21 04:39:58.815533 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jun 21 04:39:58.815543 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jun 21 04:39:58.815553 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jun 21 04:39:58.815564 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jun 21 04:39:58.815573 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jun 21 04:39:58.815585 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jun 21 04:39:58.815594 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jun 21 04:39:58.815605 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jun 21 04:39:58.815613 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jun 21 04:39:58.815622 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jun 21 04:39:58.815630 systemd[1]: Reached target paths.target - Path Units. Jun 21 04:39:58.815639 systemd[1]: Reached target slices.target - Slice Units. Jun 21 04:39:58.815647 systemd[1]: Reached target swap.target - Swaps. Jun 21 04:39:58.815656 systemd[1]: Reached target timers.target - Timer Units. Jun 21 04:39:58.815664 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jun 21 04:39:58.815673 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jun 21 04:39:58.815683 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jun 21 04:39:58.815692 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. 
Jun 21 04:39:58.815701 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jun 21 04:39:58.815709 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jun 21 04:39:58.815718 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jun 21 04:39:58.815726 systemd[1]: Reached target sockets.target - Socket Units. Jun 21 04:39:58.815735 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jun 21 04:39:58.815743 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jun 21 04:39:58.815754 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jun 21 04:39:58.815763 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jun 21 04:39:58.815771 systemd[1]: Starting systemd-fsck-usr.service... Jun 21 04:39:58.815780 systemd[1]: Starting systemd-journald.service - Journal Service... Jun 21 04:39:58.815788 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jun 21 04:39:58.815797 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jun 21 04:39:58.815805 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jun 21 04:39:58.815816 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jun 21 04:39:58.815825 systemd[1]: Finished systemd-fsck-usr.service. Jun 21 04:39:58.815853 systemd-journald[220]: Collecting audit messages is disabled. Jun 21 04:39:58.815875 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jun 21 04:39:58.815884 systemd-journald[220]: Journal started Jun 21 04:39:58.815902 systemd-journald[220]: Runtime Journal (/run/log/journal/7f3a8ad8216d432f9423836d2f3e4e19) is 6M, max 48.5M, 42.4M free. 
Jun 21 04:39:58.805189 systemd-modules-load[221]: Inserted module 'overlay' Jun 21 04:39:58.821179 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jun 21 04:39:58.824686 systemd[1]: Started systemd-journald.service - Journal Service. Jun 21 04:39:58.827290 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jun 21 04:39:58.830484 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jun 21 04:39:58.831821 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jun 21 04:39:58.835786 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jun 21 04:39:58.839175 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jun 21 04:39:58.842817 systemd-modules-load[221]: Inserted module 'br_netfilter' Jun 21 04:39:58.843836 kernel: Bridge firewalling registered Jun 21 04:39:58.844082 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jun 21 04:39:58.848300 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jun 21 04:39:58.852297 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jun 21 04:39:58.853127 systemd-tmpfiles[239]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jun 21 04:39:58.855656 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jun 21 04:39:58.864310 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jun 21 04:39:58.866993 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jun 21 04:39:58.875413 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. 
Jun 21 04:39:58.878263 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jun 21 04:39:58.886499 dracut-cmdline[260]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=d3c0be6f64121476b0313f5d7d7bbd73e21bc1a219aacd38b8006b291898eca1 Jun 21 04:39:58.924823 systemd-resolved[268]: Positive Trust Anchors: Jun 21 04:39:58.924839 systemd-resolved[268]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jun 21 04:39:58.924869 systemd-resolved[268]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jun 21 04:39:58.927252 systemd-resolved[268]: Defaulting to hostname 'linux'. Jun 21 04:39:58.928304 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jun 21 04:39:58.934952 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jun 21 04:39:58.992188 kernel: SCSI subsystem initialized Jun 21 04:39:59.001173 kernel: Loading iSCSI transport class v2.0-870. 
Jun 21 04:39:59.011177 kernel: iscsi: registered transport (tcp) Jun 21 04:39:59.032177 kernel: iscsi: registered transport (qla4xxx) Jun 21 04:39:59.032197 kernel: QLogic iSCSI HBA Driver Jun 21 04:39:59.052374 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jun 21 04:39:59.077456 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jun 21 04:39:59.081129 systemd[1]: Reached target network-pre.target - Preparation for Network. Jun 21 04:39:59.135320 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jun 21 04:39:59.138725 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jun 21 04:39:59.197171 kernel: raid6: avx2x4 gen() 29863 MB/s Jun 21 04:39:59.214167 kernel: raid6: avx2x2 gen() 30366 MB/s Jun 21 04:39:59.231255 kernel: raid6: avx2x1 gen() 25580 MB/s Jun 21 04:39:59.231274 kernel: raid6: using algorithm avx2x2 gen() 30366 MB/s Jun 21 04:39:59.249263 kernel: raid6: .... xor() 19885 MB/s, rmw enabled Jun 21 04:39:59.249278 kernel: raid6: using avx2x2 recovery algorithm Jun 21 04:39:59.270178 kernel: xor: automatically using best checksumming function avx Jun 21 04:39:59.432206 kernel: Btrfs loaded, zoned=no, fsverity=no Jun 21 04:39:59.440850 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jun 21 04:39:59.442765 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jun 21 04:39:59.474274 systemd-udevd[472]: Using default interface naming scheme 'v255'. Jun 21 04:39:59.479479 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jun 21 04:39:59.483395 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jun 21 04:39:59.509039 dracut-pre-trigger[479]: rd.md=0: removing MD RAID activation Jun 21 04:39:59.537257 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Jun 21 04:39:59.540670 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jun 21 04:39:59.624130 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jun 21 04:39:59.627473 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jun 21 04:39:59.663181 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Jun 21 04:39:59.665772 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Jun 21 04:39:59.673970 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jun 21 04:39:59.674069 kernel: GPT:9289727 != 19775487 Jun 21 04:39:59.674110 kernel: GPT:Alternate GPT header not at the end of the disk. Jun 21 04:39:59.674164 kernel: GPT:9289727 != 19775487 Jun 21 04:39:59.674211 kernel: GPT: Use GNU Parted to correct GPT errors. Jun 21 04:39:59.674245 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jun 21 04:39:59.678203 kernel: cryptd: max_cpu_qlen set to 1000 Jun 21 04:39:59.684193 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 Jun 21 04:39:59.699175 kernel: libata version 3.00 loaded. Jun 21 04:39:59.699518 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jun 21 04:39:59.699692 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jun 21 04:39:59.703664 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jun 21 04:39:59.707131 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jun 21 04:39:59.713180 kernel: AES CTR mode by8 optimization enabled Jun 21 04:39:59.713220 kernel: ahci 0000:00:1f.2: version 3.0 Jun 21 04:39:59.713457 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Jun 21 04:39:59.715217 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode Jun 21 04:39:59.717030 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) Jun 21 04:39:59.717245 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Jun 21 04:39:59.724198 kernel: scsi host0: ahci Jun 21 04:39:59.726053 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jun 21 04:39:59.727731 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jun 21 04:39:59.740672 kernel: scsi host1: ahci Jun 21 04:39:59.740909 kernel: scsi host2: ahci Jun 21 04:39:59.744175 kernel: scsi host3: ahci Jun 21 04:39:59.745167 kernel: scsi host4: ahci Jun 21 04:39:59.746971 kernel: scsi host5: ahci Jun 21 04:39:59.747213 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 lpm-pol 0 Jun 21 04:39:59.747225 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 lpm-pol 0 Jun 21 04:39:59.747687 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 lpm-pol 0 Jun 21 04:39:59.748597 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 lpm-pol 0 Jun 21 04:39:59.749808 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jun 21 04:39:59.754768 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 lpm-pol 0 Jun 21 04:39:59.754786 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 lpm-pol 0 Jun 21 04:39:59.777080 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jun 21 04:39:59.786021 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Jun 21 04:39:59.798374 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Jun 21 04:39:59.799676 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jun 21 04:39:59.803223 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jun 21 04:39:59.804440 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jun 21 04:39:59.807957 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jun 21 04:39:59.826287 disk-uuid[632]: Primary Header is updated. Jun 21 04:39:59.826287 disk-uuid[632]: Secondary Entries is updated. Jun 21 04:39:59.826287 disk-uuid[632]: Secondary Header is updated. Jun 21 04:39:59.830169 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jun 21 04:39:59.832105 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jun 21 04:40:00.057197 kernel: ata4: SATA link down (SStatus 0 SControl 300) Jun 21 04:40:00.057274 kernel: ata5: SATA link down (SStatus 0 SControl 300) Jun 21 04:40:00.058190 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Jun 21 04:40:00.059773 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Jun 21 04:40:00.059800 kernel: ata3.00: applying bridge limits Jun 21 04:40:00.060182 kernel: ata3.00: configured for UDMA/100 Jun 21 04:40:00.066188 kernel: ata6: SATA link down (SStatus 0 SControl 300) Jun 21 04:40:00.066220 kernel: ata2: SATA link down (SStatus 0 SControl 300) Jun 21 04:40:00.067187 kernel: ata1: SATA link down (SStatus 0 SControl 300) Jun 21 04:40:00.068181 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Jun 21 04:40:00.110177 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Jun 21 04:40:00.110396 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jun 21 04:40:00.125179 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jun 21 04:40:00.458454 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jun 21 04:40:00.460060 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jun 21 04:40:00.461854 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jun 21 04:40:00.463050 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jun 21 04:40:00.466276 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jun 21 04:40:00.492039 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jun 21 04:40:00.841184 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jun 21 04:40:00.842016 disk-uuid[637]: The operation has completed successfully. Jun 21 04:40:00.872654 systemd[1]: disk-uuid.service: Deactivated successfully. Jun 21 04:40:00.872781 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. 
Jun 21 04:40:00.910436 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jun 21 04:40:00.938631 sh[666]: Success Jun 21 04:40:00.958176 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jun 21 04:40:00.958206 kernel: device-mapper: uevent: version 1.0.3 Jun 21 04:40:00.960174 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jun 21 04:40:00.969183 kernel: device-mapper: verity: sha256 using shash "sha256-ni" Jun 21 04:40:00.999910 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jun 21 04:40:01.004003 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jun 21 04:40:01.028345 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jun 21 04:40:01.034182 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jun 21 04:40:01.037146 kernel: BTRFS: device fsid bfb8168c-5be0-428c-83e7-820ccaf1f8e9 devid 1 transid 41 /dev/mapper/usr (253:0) scanned by mount (679) Jun 21 04:40:01.037176 kernel: BTRFS info (device dm-0): first mount of filesystem bfb8168c-5be0-428c-83e7-820ccaf1f8e9 Jun 21 04:40:01.037187 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jun 21 04:40:01.038786 kernel: BTRFS info (device dm-0): using free-space-tree Jun 21 04:40:01.043384 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jun 21 04:40:01.045639 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jun 21 04:40:01.047990 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jun 21 04:40:01.050741 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jun 21 04:40:01.053514 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jun 21 04:40:01.077194 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (712) Jun 21 04:40:01.079637 kernel: BTRFS info (device vda6): first mount of filesystem 57d2b200-37a8-4067-8765-910d3ed0182c Jun 21 04:40:01.079685 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jun 21 04:40:01.079697 kernel: BTRFS info (device vda6): using free-space-tree Jun 21 04:40:01.087179 kernel: BTRFS info (device vda6): last unmount of filesystem 57d2b200-37a8-4067-8765-910d3ed0182c Jun 21 04:40:01.087867 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jun 21 04:40:01.089649 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jun 21 04:40:01.179983 ignition[752]: Ignition 2.21.0 Jun 21 04:40:01.180700 ignition[752]: Stage: fetch-offline Jun 21 04:40:01.180757 ignition[752]: no configs at "/usr/lib/ignition/base.d" Jun 21 04:40:01.181693 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jun 21 04:40:01.180769 ignition[752]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jun 21 04:40:01.184354 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Jun 21 04:40:01.180857 ignition[752]: parsed url from cmdline: "" Jun 21 04:40:01.180860 ignition[752]: no config URL provided Jun 21 04:40:01.180865 ignition[752]: reading system config file "/usr/lib/ignition/user.ign" Jun 21 04:40:01.180874 ignition[752]: no config at "/usr/lib/ignition/user.ign" Jun 21 04:40:01.180907 ignition[752]: op(1): [started] loading QEMU firmware config module Jun 21 04:40:01.180913 ignition[752]: op(1): executing: "modprobe" "qemu_fw_cfg" Jun 21 04:40:01.196693 ignition[752]: op(1): [finished] loading QEMU firmware config module Jun 21 04:40:01.229142 systemd-networkd[856]: lo: Link UP Jun 21 04:40:01.229172 systemd-networkd[856]: lo: Gained carrier Jun 21 04:40:01.230622 systemd-networkd[856]: Enumeration completed Jun 21 04:40:01.230956 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jun 21 04:40:01.230960 systemd-networkd[856]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jun 21 04:40:01.231146 systemd[1]: Started systemd-networkd.service - Network Configuration. Jun 21 04:40:01.232142 systemd-networkd[856]: eth0: Link UP Jun 21 04:40:01.232156 systemd-networkd[856]: eth0: Gained carrier Jun 21 04:40:01.232164 systemd-networkd[856]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jun 21 04:40:01.233405 systemd[1]: Reached target network.target - Network. 
Jun 21 04:40:01.249403 ignition[752]: parsing config with SHA512: 8bfba786a324be4df503d0783b1702faa0b01a34be477b3a4b533ee39c6c04423f50495c3b9b0ab8e9a58886f9902fa9f56cad6236a85ab98681622e88ab8de2 Jun 21 04:40:01.252984 unknown[752]: fetched base config from "system" Jun 21 04:40:01.252997 unknown[752]: fetched user config from "qemu" Jun 21 04:40:01.253395 ignition[752]: fetch-offline: fetch-offline passed Jun 21 04:40:01.253460 ignition[752]: Ignition finished successfully Jun 21 04:40:01.255233 systemd-networkd[856]: eth0: DHCPv4 address 10.0.0.63/16, gateway 10.0.0.1 acquired from 10.0.0.1 Jun 21 04:40:01.258676 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jun 21 04:40:01.261112 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jun 21 04:40:01.261927 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jun 21 04:40:01.301474 ignition[861]: Ignition 2.21.0 Jun 21 04:40:01.301486 ignition[861]: Stage: kargs Jun 21 04:40:01.301612 ignition[861]: no configs at "/usr/lib/ignition/base.d" Jun 21 04:40:01.301622 ignition[861]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jun 21 04:40:01.303003 ignition[861]: kargs: kargs passed Jun 21 04:40:01.303135 ignition[861]: Ignition finished successfully Jun 21 04:40:01.309685 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jun 21 04:40:01.311763 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jun 21 04:40:01.350543 ignition[869]: Ignition 2.21.0 Jun 21 04:40:01.350557 ignition[869]: Stage: disks Jun 21 04:40:01.350688 ignition[869]: no configs at "/usr/lib/ignition/base.d" Jun 21 04:40:01.350698 ignition[869]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jun 21 04:40:01.351444 ignition[869]: disks: disks passed Jun 21 04:40:01.354386 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Jun 21 04:40:01.351499 ignition[869]: Ignition finished successfully Jun 21 04:40:01.357273 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jun 21 04:40:01.357925 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jun 21 04:40:01.358482 systemd[1]: Reached target local-fs.target - Local File Systems. Jun 21 04:40:01.358848 systemd[1]: Reached target sysinit.target - System Initialization. Jun 21 04:40:01.359391 systemd[1]: Reached target basic.target - Basic System. Jun 21 04:40:01.368431 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jun 21 04:40:01.395962 systemd-fsck[879]: ROOT: clean, 15/553520 files, 52789/553472 blocks Jun 21 04:40:01.403966 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jun 21 04:40:01.407599 systemd[1]: Mounting sysroot.mount - /sysroot... Jun 21 04:40:01.514183 kernel: EXT4-fs (vda9): mounted filesystem 6d18c974-0fd6-4e4a-98cf-62524fcf9e99 r/w with ordered data mode. Quota mode: none. Jun 21 04:40:01.515008 systemd[1]: Mounted sysroot.mount - /sysroot. Jun 21 04:40:01.517325 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jun 21 04:40:01.520725 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jun 21 04:40:01.521736 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jun 21 04:40:01.522857 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jun 21 04:40:01.522900 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jun 21 04:40:01.522925 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jun 21 04:40:01.543280 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Jun 21 04:40:01.544731 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jun 21 04:40:01.550178 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (887) Jun 21 04:40:01.550205 kernel: BTRFS info (device vda6): first mount of filesystem 57d2b200-37a8-4067-8765-910d3ed0182c Jun 21 04:40:01.552174 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jun 21 04:40:01.553168 kernel: BTRFS info (device vda6): using free-space-tree Jun 21 04:40:01.557411 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jun 21 04:40:01.582325 initrd-setup-root[911]: cut: /sysroot/etc/passwd: No such file or directory Jun 21 04:40:01.586290 initrd-setup-root[918]: cut: /sysroot/etc/group: No such file or directory Jun 21 04:40:01.590895 initrd-setup-root[925]: cut: /sysroot/etc/shadow: No such file or directory Jun 21 04:40:01.595432 initrd-setup-root[932]: cut: /sysroot/etc/gshadow: No such file or directory Jun 21 04:40:01.682512 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jun 21 04:40:01.683560 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jun 21 04:40:01.686769 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jun 21 04:40:01.711171 kernel: BTRFS info (device vda6): last unmount of filesystem 57d2b200-37a8-4067-8765-910d3ed0182c Jun 21 04:40:01.723271 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jun 21 04:40:01.737541 ignition[1001]: INFO : Ignition 2.21.0 Jun 21 04:40:01.737541 ignition[1001]: INFO : Stage: mount Jun 21 04:40:01.739250 ignition[1001]: INFO : no configs at "/usr/lib/ignition/base.d" Jun 21 04:40:01.739250 ignition[1001]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jun 21 04:40:01.741513 ignition[1001]: INFO : mount: mount passed Jun 21 04:40:01.741513 ignition[1001]: INFO : Ignition finished successfully Jun 21 04:40:01.743419 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jun 21 04:40:01.746575 systemd[1]: Starting ignition-files.service - Ignition (files)... Jun 21 04:40:02.034854 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jun 21 04:40:02.036632 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jun 21 04:40:02.063741 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (1013) Jun 21 04:40:02.063769 kernel: BTRFS info (device vda6): first mount of filesystem 57d2b200-37a8-4067-8765-910d3ed0182c Jun 21 04:40:02.063780 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Jun 21 04:40:02.064613 kernel: BTRFS info (device vda6): using free-space-tree Jun 21 04:40:02.068898 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jun 21 04:40:02.098799 ignition[1030]: INFO : Ignition 2.21.0
Jun 21 04:40:02.098799 ignition[1030]: INFO : Stage: files
Jun 21 04:40:02.100696 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d"
Jun 21 04:40:02.100696 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Jun 21 04:40:02.103847 ignition[1030]: DEBUG : files: compiled without relabeling support, skipping
Jun 21 04:40:02.105209 ignition[1030]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jun 21 04:40:02.105209 ignition[1030]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jun 21 04:40:02.109985 ignition[1030]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jun 21 04:40:02.111585 ignition[1030]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jun 21 04:40:02.113440 unknown[1030]: wrote ssh authorized keys file for user: core
Jun 21 04:40:02.114658 ignition[1030]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jun 21 04:40:02.116297 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Jun 21 04:40:02.116297 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1
Jun 21 04:40:02.168258 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jun 21 04:40:02.239310 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jun 21 04:40:02.241462 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jun 21 04:40:02.256075 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jun 21 04:40:02.256075 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jun 21 04:40:02.256075 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Jun 21 04:40:02.256075 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Jun 21 04:40:02.256075 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Jun 21 04:40:02.256075 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1
Jun 21 04:40:02.842955 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jun 21 04:40:03.098327 systemd-networkd[856]: eth0: Gained IPv6LL
Jun 21 04:40:03.180212 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw"
Jun 21 04:40:03.180212 ignition[1030]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jun 21 04:40:03.184170 ignition[1030]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jun 21 04:40:03.190424 ignition[1030]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jun 21 04:40:03.190424 ignition[1030]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jun 21 04:40:03.193744 ignition[1030]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Jun 21 04:40:03.195018 ignition[1030]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jun 21 04:40:03.196967 ignition[1030]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jun 21 04:40:03.196967 ignition[1030]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Jun 21 04:40:03.196967 ignition[1030]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Jun 21 04:40:03.219441 ignition[1030]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jun 21 04:40:03.225601 ignition[1030]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jun 21 04:40:03.227573 ignition[1030]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Jun 21 04:40:03.227573 ignition[1030]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Jun 21 04:40:03.232241 ignition[1030]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Jun 21 04:40:03.232241 ignition[1030]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Jun 21 04:40:03.232241 ignition[1030]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jun 21 04:40:03.232241 ignition[1030]: INFO : files: files passed
Jun 21 04:40:03.232241 ignition[1030]: INFO : Ignition finished successfully
Jun 21 04:40:03.238567 systemd[1]: Finished ignition-files.service - Ignition (files).
Jun 21 04:40:03.241709 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jun 21 04:40:03.242595 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jun 21 04:40:03.261016 systemd[1]: ignition-quench.service: Deactivated successfully.
Jun 21 04:40:03.261243 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jun 21 04:40:03.264828 initrd-setup-root-after-ignition[1059]: grep: /sysroot/oem/oem-release: No such file or directory
Jun 21 04:40:03.268049 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jun 21 04:40:03.268049 initrd-setup-root-after-ignition[1061]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jun 21 04:40:03.271686 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jun 21 04:40:03.272064 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jun 21 04:40:03.276698 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jun 21 04:40:03.280189 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jun 21 04:40:03.330091 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jun 21 04:40:03.330266 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jun 21 04:40:03.331568 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jun 21 04:40:03.334002 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jun 21 04:40:03.336169 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jun 21 04:40:03.338378 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jun 21 04:40:03.375723 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jun 21 04:40:03.380118 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jun 21 04:40:03.405113 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jun 21 04:40:03.405391 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jun 21 04:40:03.409235 systemd[1]: Stopped target timers.target - Timer Units.
Jun 21 04:40:03.412368 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jun 21 04:40:03.412545 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jun 21 04:40:03.416167 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jun 21 04:40:03.416323 systemd[1]: Stopped target basic.target - Basic System.
Jun 21 04:40:03.419416 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jun 21 04:40:03.420460 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jun 21 04:40:03.420817 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jun 21 04:40:03.421250 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jun 21 04:40:03.421760 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jun 21 04:40:03.422128 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jun 21 04:40:03.422719 systemd[1]: Stopped target sysinit.target - System Initialization.
Jun 21 04:40:03.423076 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jun 21 04:40:03.423652 systemd[1]: Stopped target swap.target - Swaps.
Jun 21 04:40:03.423982 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jun 21 04:40:03.424092 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jun 21 04:40:03.424939 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jun 21 04:40:03.425532 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jun 21 04:40:03.425855 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jun 21 04:40:03.445808 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jun 21 04:40:03.446815 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jun 21 04:40:03.446919 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jun 21 04:40:03.449630 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jun 21 04:40:03.449737 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jun 21 04:40:03.452240 systemd[1]: Stopped target paths.target - Path Units.
Jun 21 04:40:03.452640 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jun 21 04:40:03.459218 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jun 21 04:40:03.459378 systemd[1]: Stopped target slices.target - Slice Units.
Jun 21 04:40:03.461993 systemd[1]: Stopped target sockets.target - Socket Units.
Jun 21 04:40:03.463708 systemd[1]: iscsid.socket: Deactivated successfully.
Jun 21 04:40:03.463799 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jun 21 04:40:03.465466 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jun 21 04:40:03.465551 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jun 21 04:40:03.467255 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jun 21 04:40:03.467371 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jun 21 04:40:03.469075 systemd[1]: ignition-files.service: Deactivated successfully.
Jun 21 04:40:03.469192 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jun 21 04:40:03.475311 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jun 21 04:40:03.477021 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jun 21 04:40:03.477134 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jun 21 04:40:03.479979 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jun 21 04:40:03.481096 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jun 21 04:40:03.481271 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jun 21 04:40:03.483323 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jun 21 04:40:03.483469 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jun 21 04:40:03.490821 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jun 21 04:40:03.490939 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jun 21 04:40:03.502485 ignition[1085]: INFO : Ignition 2.21.0
Jun 21 04:40:03.502485 ignition[1085]: INFO : Stage: umount
Jun 21 04:40:03.504301 ignition[1085]: INFO : no configs at "/usr/lib/ignition/base.d"
Jun 21 04:40:03.504301 ignition[1085]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Jun 21 04:40:03.504301 ignition[1085]: INFO : umount: umount passed
Jun 21 04:40:03.504301 ignition[1085]: INFO : Ignition finished successfully
Jun 21 04:40:03.510288 systemd[1]: ignition-mount.service: Deactivated successfully.
Jun 21 04:40:03.510444 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jun 21 04:40:03.511881 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jun 21 04:40:03.512989 systemd[1]: Stopped target network.target - Network.
Jun 21 04:40:03.513330 systemd[1]: ignition-disks.service: Deactivated successfully.
Jun 21 04:40:03.513382 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jun 21 04:40:03.513695 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jun 21 04:40:03.513738 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jun 21 04:40:03.514020 systemd[1]: ignition-setup.service: Deactivated successfully.
Jun 21 04:40:03.514064 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jun 21 04:40:03.514368 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jun 21 04:40:03.514418 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jun 21 04:40:03.514915 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jun 21 04:40:03.524787 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jun 21 04:40:03.532871 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jun 21 04:40:03.533014 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jun 21 04:40:03.536733 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Jun 21 04:40:03.537068 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jun 21 04:40:03.537116 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jun 21 04:40:03.541269 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jun 21 04:40:03.541513 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jun 21 04:40:03.541627 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jun 21 04:40:03.545140 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Jun 21 04:40:03.545653 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jun 21 04:40:03.547905 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jun 21 04:40:03.547960 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jun 21 04:40:03.550481 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jun 21 04:40:03.552467 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jun 21 04:40:03.552523 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jun 21 04:40:03.552875 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jun 21 04:40:03.552919 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jun 21 04:40:03.558958 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jun 21 04:40:03.559006 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jun 21 04:40:03.561491 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jun 21 04:40:03.565571 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jun 21 04:40:03.582042 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jun 21 04:40:03.582245 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jun 21 04:40:03.584816 systemd[1]: network-cleanup.service: Deactivated successfully.
Jun 21 04:40:03.584923 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jun 21 04:40:03.587556 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jun 21 04:40:03.587626 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jun 21 04:40:03.589281 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jun 21 04:40:03.589326 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jun 21 04:40:03.591387 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jun 21 04:40:03.591448 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jun 21 04:40:03.594353 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jun 21 04:40:03.594415 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jun 21 04:40:03.596036 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jun 21 04:40:03.596094 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jun 21 04:40:03.604932 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jun 21 04:40:03.606213 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jun 21 04:40:03.606273 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jun 21 04:40:03.610073 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jun 21 04:40:03.610123 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jun 21 04:40:03.614116 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jun 21 04:40:03.614184 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jun 21 04:40:03.629656 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jun 21 04:40:03.629776 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jun 21 04:40:03.700286 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jun 21 04:40:03.700433 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jun 21 04:40:03.701640 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jun 21 04:40:03.703138 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jun 21 04:40:03.703222 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jun 21 04:40:03.708062 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jun 21 04:40:03.737976 systemd[1]: Switching root.
Jun 21 04:40:03.784688 systemd-journald[220]: Journal stopped
Jun 21 04:40:04.978007 systemd-journald[220]: Received SIGTERM from PID 1 (systemd).
Jun 21 04:40:04.978076 kernel: SELinux: policy capability network_peer_controls=1
Jun 21 04:40:04.978090 kernel: SELinux: policy capability open_perms=1
Jun 21 04:40:04.978102 kernel: SELinux: policy capability extended_socket_class=1
Jun 21 04:40:04.978119 kernel: SELinux: policy capability always_check_network=0
Jun 21 04:40:04.978129 kernel: SELinux: policy capability cgroup_seclabel=1
Jun 21 04:40:04.978144 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jun 21 04:40:04.978208 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jun 21 04:40:04.978222 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jun 21 04:40:04.978237 kernel: SELinux: policy capability userspace_initial_context=0
Jun 21 04:40:04.978250 kernel: audit: type=1403 audit(1750480804.195:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jun 21 04:40:04.978263 systemd[1]: Successfully loaded SELinux policy in 50.388ms.
Jun 21 04:40:04.978292 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.328ms.
Jun 21 04:40:04.978305 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jun 21 04:40:04.978317 systemd[1]: Detected virtualization kvm.
Jun 21 04:40:04.978331 systemd[1]: Detected architecture x86-64.
Jun 21 04:40:04.978343 systemd[1]: Detected first boot.
Jun 21 04:40:04.978364 systemd[1]: Initializing machine ID from VM UUID.
Jun 21 04:40:04.978376 zram_generator::config[1130]: No configuration found.
Jun 21 04:40:04.978389 kernel: Guest personality initialized and is inactive
Jun 21 04:40:04.978400 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jun 21 04:40:04.978411 kernel: Initialized host personality
Jun 21 04:40:04.978422 kernel: NET: Registered PF_VSOCK protocol family
Jun 21 04:40:04.978433 systemd[1]: Populated /etc with preset unit settings.
Jun 21 04:40:04.978448 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Jun 21 04:40:04.978460 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jun 21 04:40:04.978472 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jun 21 04:40:04.978484 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jun 21 04:40:04.978496 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jun 21 04:40:04.978508 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jun 21 04:40:04.978520 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jun 21 04:40:04.978531 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jun 21 04:40:04.978546 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jun 21 04:40:04.978558 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jun 21 04:40:04.978571 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jun 21 04:40:04.978583 systemd[1]: Created slice user.slice - User and Session Slice.
Jun 21 04:40:04.978600 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jun 21 04:40:04.978612 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jun 21 04:40:04.978624 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jun 21 04:40:04.978636 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jun 21 04:40:04.978648 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jun 21 04:40:04.978662 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jun 21 04:40:04.978674 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jun 21 04:40:04.978685 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jun 21 04:40:04.978697 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jun 21 04:40:04.978709 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jun 21 04:40:04.978720 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jun 21 04:40:04.978732 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jun 21 04:40:04.978746 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jun 21 04:40:04.978758 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jun 21 04:40:04.978769 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jun 21 04:40:04.978781 systemd[1]: Reached target slices.target - Slice Units.
Jun 21 04:40:04.978793 systemd[1]: Reached target swap.target - Swaps.
Jun 21 04:40:04.978805 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jun 21 04:40:04.978816 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jun 21 04:40:04.978828 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jun 21 04:40:04.978839 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jun 21 04:40:04.978851 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jun 21 04:40:04.978865 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jun 21 04:40:04.978877 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jun 21 04:40:04.978889 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jun 21 04:40:04.978901 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jun 21 04:40:04.978913 systemd[1]: Mounting media.mount - External Media Directory...
Jun 21 04:40:04.978925 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jun 21 04:40:04.978937 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jun 21 04:40:04.978949 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jun 21 04:40:04.978962 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jun 21 04:40:04.978974 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jun 21 04:40:04.978986 systemd[1]: Reached target machines.target - Containers.
Jun 21 04:40:04.978998 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jun 21 04:40:04.979010 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jun 21 04:40:04.979024 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jun 21 04:40:04.979036 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jun 21 04:40:04.979048 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jun 21 04:40:04.979059 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jun 21 04:40:04.979073 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jun 21 04:40:04.979084 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jun 21 04:40:04.979096 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jun 21 04:40:04.979108 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jun 21 04:40:04.979120 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jun 21 04:40:04.979132 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jun 21 04:40:04.979143 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jun 21 04:40:04.979170 systemd[1]: Stopped systemd-fsck-usr.service.
Jun 21 04:40:04.979185 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jun 21 04:40:04.979197 systemd[1]: Starting systemd-journald.service - Journal Service...
Jun 21 04:40:04.979208 kernel: fuse: init (API version 7.41)
Jun 21 04:40:04.979220 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jun 21 04:40:04.979231 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jun 21 04:40:04.979243 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jun 21 04:40:04.979255 kernel: loop: module loaded
Jun 21 04:40:04.979266 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jun 21 04:40:04.979278 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jun 21 04:40:04.979293 systemd[1]: verity-setup.service: Deactivated successfully.
Jun 21 04:40:04.979305 systemd[1]: Stopped verity-setup.service.
Jun 21 04:40:04.979316 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jun 21 04:40:04.979328 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jun 21 04:40:04.979340 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jun 21 04:40:04.979353 systemd[1]: Mounted media.mount - External Media Directory.
Jun 21 04:40:04.979374 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jun 21 04:40:04.979385 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jun 21 04:40:04.979398 kernel: ACPI: bus type drm_connector registered
Jun 21 04:40:04.979409 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jun 21 04:40:04.979423 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jun 21 04:40:04.979435 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jun 21 04:40:04.979466 systemd-journald[1205]: Collecting audit messages is disabled.
Jun 21 04:40:04.979488 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jun 21 04:40:04.979500 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jun 21 04:40:04.979511 systemd-journald[1205]: Journal started
Jun 21 04:40:04.979536 systemd-journald[1205]: Runtime Journal (/run/log/journal/7f3a8ad8216d432f9423836d2f3e4e19) is 6M, max 48.5M, 42.4M free.
Jun 21 04:40:04.741849 systemd[1]: Queued start job for default target multi-user.target.
Jun 21 04:40:04.754485 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Jun 21 04:40:04.754987 systemd[1]: systemd-journald.service: Deactivated successfully.
Jun 21 04:40:04.984242 systemd[1]: Started systemd-journald.service - Journal Service.
Jun 21 04:40:04.985811 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jun 21 04:40:04.986034 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jun 21 04:40:04.987496 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jun 21 04:40:04.987715 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jun 21 04:40:04.989069 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jun 21 04:40:04.989301 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jun 21 04:40:04.990854 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jun 21 04:40:04.991074 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jun 21 04:40:04.992462 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jun 21 04:40:04.992670 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jun 21 04:40:04.994085 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jun 21 04:40:04.995580 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jun 21 04:40:04.997456 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jun 21 04:40:04.999047 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jun 21 04:40:05.016385 systemd[1]: Reached target network-pre.target - Preparation for Network. Jun 21 04:40:05.019219 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jun 21 04:40:05.021589 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jun 21 04:40:05.023007 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jun 21 04:40:05.023036 systemd[1]: Reached target local-fs.target - Local File Systems. Jun 21 04:40:05.025231 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jun 21 04:40:05.030180 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jun 21 04:40:05.031623 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jun 21 04:40:05.033184 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jun 21 04:40:05.038395 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jun 21 04:40:05.039635 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jun 21 04:40:05.042259 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jun 21 04:40:05.043462 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jun 21 04:40:05.045366 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jun 21 04:40:05.047899 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jun 21 04:40:05.057509 systemd[1]: Starting systemd-sysusers.service - Create System Users... 
Jun 21 04:40:05.061903 systemd-journald[1205]: Time spent on flushing to /var/log/journal/7f3a8ad8216d432f9423836d2f3e4e19 is 19.244ms for 1061 entries. Jun 21 04:40:05.061903 systemd-journald[1205]: System Journal (/var/log/journal/7f3a8ad8216d432f9423836d2f3e4e19) is 8M, max 195.6M, 187.6M free. Jun 21 04:40:05.104698 systemd-journald[1205]: Received client request to flush runtime journal. Jun 21 04:40:05.104794 kernel: loop0: detected capacity change from 0 to 113872 Jun 21 04:40:05.104826 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jun 21 04:40:05.061781 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jun 21 04:40:05.064259 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jun 21 04:40:05.065610 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jun 21 04:40:05.073690 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jun 21 04:40:05.077476 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jun 21 04:40:05.081372 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jun 21 04:40:05.096994 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jun 21 04:40:05.106100 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jun 21 04:40:05.116170 kernel: loop1: detected capacity change from 0 to 146240 Jun 21 04:40:05.117416 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jun 21 04:40:05.124614 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jun 21 04:40:05.128404 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jun 21 04:40:05.159166 kernel: loop2: detected capacity change from 0 to 229808 Jun 21 04:40:05.169143 systemd-tmpfiles[1268]: ACLs are not supported, ignoring. 
Jun 21 04:40:05.169176 systemd-tmpfiles[1268]: ACLs are not supported, ignoring. Jun 21 04:40:05.175453 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jun 21 04:40:05.185191 kernel: loop3: detected capacity change from 0 to 113872 Jun 21 04:40:05.197183 kernel: loop4: detected capacity change from 0 to 146240 Jun 21 04:40:05.207182 kernel: loop5: detected capacity change from 0 to 229808 Jun 21 04:40:05.215821 (sd-merge)[1272]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Jun 21 04:40:05.216900 (sd-merge)[1272]: Merged extensions into '/usr'. Jun 21 04:40:05.221730 systemd[1]: Reload requested from client PID 1249 ('systemd-sysext') (unit systemd-sysext.service)... Jun 21 04:40:05.221750 systemd[1]: Reloading... Jun 21 04:40:05.291211 zram_generator::config[1297]: No configuration found. Jun 21 04:40:05.384113 ldconfig[1244]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jun 21 04:40:05.392493 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jun 21 04:40:05.471608 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jun 21 04:40:05.471901 systemd[1]: Reloading finished in 249 ms. Jun 21 04:40:05.498606 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jun 21 04:40:05.500165 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jun 21 04:40:05.532736 systemd[1]: Starting ensure-sysext.service... Jun 21 04:40:05.534695 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jun 21 04:40:05.553836 systemd[1]: Reload requested from client PID 1335 ('systemctl') (unit ensure-sysext.service)... Jun 21 04:40:05.553858 systemd[1]: Reloading... 
Jun 21 04:40:05.562753 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jun 21 04:40:05.562800 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jun 21 04:40:05.563094 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jun 21 04:40:05.563375 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jun 21 04:40:05.564234 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jun 21 04:40:05.564650 systemd-tmpfiles[1336]: ACLs are not supported, ignoring. Jun 21 04:40:05.564723 systemd-tmpfiles[1336]: ACLs are not supported, ignoring. Jun 21 04:40:05.568733 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot. Jun 21 04:40:05.568746 systemd-tmpfiles[1336]: Skipping /boot Jun 21 04:40:05.581203 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot. Jun 21 04:40:05.581215 systemd-tmpfiles[1336]: Skipping /boot Jun 21 04:40:05.608186 zram_generator::config[1363]: No configuration found. Jun 21 04:40:05.702397 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jun 21 04:40:05.787676 systemd[1]: Reloading finished in 233 ms. Jun 21 04:40:05.815694 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jun 21 04:40:05.841711 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jun 21 04:40:05.850698 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Jun 21 04:40:05.853053 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jun 21 04:40:05.855473 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jun 21 04:40:05.866029 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jun 21 04:40:05.868986 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jun 21 04:40:05.872618 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jun 21 04:40:05.878273 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jun 21 04:40:05.878516 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jun 21 04:40:05.882254 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jun 21 04:40:05.885460 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jun 21 04:40:05.890371 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jun 21 04:40:05.891587 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jun 21 04:40:05.891684 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jun 21 04:40:05.893932 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jun 21 04:40:05.895141 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jun 21 04:40:05.896782 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Jun 21 04:40:05.900811 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jun 21 04:40:05.901486 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jun 21 04:40:05.904692 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jun 21 04:40:05.905469 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jun 21 04:40:05.907493 systemd[1]: modprobe@loop.service: Deactivated successfully. Jun 21 04:40:05.907755 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jun 21 04:40:05.917277 augenrules[1432]: No rules Jun 21 04:40:05.918710 systemd[1]: audit-rules.service: Deactivated successfully. Jun 21 04:40:05.919051 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jun 21 04:40:05.926299 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jun 21 04:40:05.926456 systemd-udevd[1406]: Using default interface naming scheme 'v255'. Jun 21 04:40:05.926548 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jun 21 04:40:05.928578 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jun 21 04:40:05.932320 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jun 21 04:40:05.937414 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jun 21 04:40:05.938639 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jun 21 04:40:05.938750 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jun 21 04:40:05.940477 systemd[1]: Starting systemd-update-done.service - Update is Completed... 
Jun 21 04:40:05.942198 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jun 21 04:40:05.943628 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jun 21 04:40:05.945435 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jun 21 04:40:05.945996 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jun 21 04:40:05.947633 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jun 21 04:40:05.949390 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jun 21 04:40:05.951386 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jun 21 04:40:05.952272 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jun 21 04:40:05.954099 systemd[1]: modprobe@loop.service: Deactivated successfully. Jun 21 04:40:05.954486 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jun 21 04:40:05.961435 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jun 21 04:40:05.965046 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jun 21 04:40:05.974685 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Jun 21 04:40:05.977373 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jun 21 04:40:05.978595 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jun 21 04:40:05.985305 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jun 21 04:40:05.992218 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jun 21 04:40:05.996366 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... 
Jun 21 04:40:06.000238 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jun 21 04:40:06.001548 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jun 21 04:40:06.001669 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jun 21 04:40:06.009832 augenrules[1477]: /sbin/augenrules: No change Jun 21 04:40:06.010721 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jun 21 04:40:06.012188 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jun 21 04:40:06.012346 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Jun 21 04:40:06.014525 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jun 21 04:40:06.014809 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jun 21 04:40:06.018972 systemd[1]: modprobe@drm.service: Deactivated successfully. Jun 21 04:40:06.019216 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jun 21 04:40:06.020752 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jun 21 04:40:06.020960 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jun 21 04:40:06.021882 augenrules[1505]: No rules Jun 21 04:40:06.022753 systemd[1]: audit-rules.service: Deactivated successfully. Jun 21 04:40:06.022988 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jun 21 04:40:06.024443 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Jun 21 04:40:06.024650 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jun 21 04:40:06.030999 systemd[1]: Finished ensure-sysext.service. Jun 21 04:40:06.042368 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jun 21 04:40:06.042438 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jun 21 04:40:06.046579 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jun 21 04:40:06.069679 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Jun 21 04:40:06.099901 systemd-resolved[1405]: Positive Trust Anchors: Jun 21 04:40:06.100347 systemd-resolved[1405]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jun 21 04:40:06.100453 systemd-resolved[1405]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jun 21 04:40:06.105201 systemd-resolved[1405]: Defaulting to hostname 'linux'. Jun 21 04:40:06.109005 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jun 21 04:40:06.116275 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jun 21 04:40:06.125979 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. 
Jun 21 04:40:06.130089 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jun 21 04:40:06.135246 kernel: mousedev: PS/2 mouse device common for all mice Jun 21 04:40:06.142626 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3 Jun 21 04:40:06.153169 kernel: ACPI: button: Power Button [PWRF] Jun 21 04:40:06.154067 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jun 21 04:40:06.159232 systemd-networkd[1495]: lo: Link UP Jun 21 04:40:06.159243 systemd-networkd[1495]: lo: Gained carrier Jun 21 04:40:06.161604 systemd-networkd[1495]: Enumeration completed Jun 21 04:40:06.161971 systemd-networkd[1495]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jun 21 04:40:06.161981 systemd-networkd[1495]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jun 21 04:40:06.162715 systemd[1]: Started systemd-networkd.service - Network Configuration. Jun 21 04:40:06.163259 systemd-networkd[1495]: eth0: Link UP Jun 21 04:40:06.163430 systemd-networkd[1495]: eth0: Gained carrier Jun 21 04:40:06.163450 systemd-networkd[1495]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jun 21 04:40:06.164099 systemd[1]: Reached target network.target - Network. Jun 21 04:40:06.168303 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jun 21 04:40:06.171054 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Jun 21 04:40:06.171398 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Jun 21 04:40:06.171576 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Jun 21 04:40:06.172384 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... 
Jun 21 04:40:06.175214 systemd-networkd[1495]: eth0: DHCPv4 address 10.0.0.63/16, gateway 10.0.0.1 acquired from 10.0.0.1 Jun 21 04:40:06.217398 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jun 21 04:40:06.231279 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jun 21 04:40:06.232725 systemd[1]: Reached target sysinit.target - System Initialization. Jun 21 04:40:07.681879 systemd-timesyncd[1516]: Contacted time server 10.0.0.1:123 (10.0.0.1). Jun 21 04:40:07.681935 systemd-timesyncd[1516]: Initial clock synchronization to Sat 2025-06-21 04:40:07.681799 UTC. Jun 21 04:40:07.681963 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jun 21 04:40:07.682048 systemd-resolved[1405]: Clock change detected. Flushing caches. Jun 21 04:40:07.683386 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jun 21 04:40:07.684824 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. Jun 21 04:40:07.686100 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jun 21 04:40:07.687505 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jun 21 04:40:07.687541 systemd[1]: Reached target paths.target - Path Units. Jun 21 04:40:07.689797 systemd[1]: Reached target time-set.target - System Time Set. Jun 21 04:40:07.691057 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jun 21 04:40:07.692405 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jun 21 04:40:07.693686 systemd[1]: Reached target timers.target - Timer Units. Jun 21 04:40:07.696061 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. 
Jun 21 04:40:07.699119 systemd[1]: Starting docker.socket - Docker Socket for the API... Jun 21 04:40:07.706781 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jun 21 04:40:07.708297 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jun 21 04:40:07.709602 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jun 21 04:40:07.719472 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jun 21 04:40:07.721107 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jun 21 04:40:07.723089 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jun 21 04:40:07.729304 systemd[1]: Reached target sockets.target - Socket Units. Jun 21 04:40:07.730634 systemd[1]: Reached target basic.target - Basic System. Jun 21 04:40:07.731964 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jun 21 04:40:07.732283 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jun 21 04:40:07.734897 systemd[1]: Starting containerd.service - containerd container runtime... Jun 21 04:40:07.738067 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jun 21 04:40:07.744159 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jun 21 04:40:07.749013 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jun 21 04:40:07.752114 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jun 21 04:40:07.753159 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jun 21 04:40:07.754740 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... 
Jun 21 04:40:07.756118 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jun 21 04:40:07.760490 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jun 21 04:40:07.765770 jq[1556]: false Jun 21 04:40:07.765669 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jun 21 04:40:07.770073 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jun 21 04:40:07.776671 systemd[1]: Starting systemd-logind.service - User Login Management... Jun 21 04:40:07.780028 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jun 21 04:40:07.782132 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jun 21 04:40:07.783525 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jun 21 04:40:07.786116 google_oslogin_nss_cache[1558]: oslogin_cache_refresh[1558]: Refreshing passwd entry cache Jun 21 04:40:07.787492 extend-filesystems[1557]: Found /dev/vda6 Jun 21 04:40:07.787351 oslogin_cache_refresh[1558]: Refreshing passwd entry cache Jun 21 04:40:07.793962 systemd[1]: Starting update-engine.service - Update Engine... Jun 21 04:40:07.797135 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jun 21 04:40:07.801336 google_oslogin_nss_cache[1558]: oslogin_cache_refresh[1558]: Failure getting users, quitting Jun 21 04:40:07.801336 google_oslogin_nss_cache[1558]: oslogin_cache_refresh[1558]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. 
Jun 21 04:40:07.801336 google_oslogin_nss_cache[1558]: oslogin_cache_refresh[1558]: Refreshing group entry cache Jun 21 04:40:07.800821 oslogin_cache_refresh[1558]: Failure getting users, quitting Jun 21 04:40:07.800844 oslogin_cache_refresh[1558]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jun 21 04:40:07.800910 oslogin_cache_refresh[1558]: Refreshing group entry cache Jun 21 04:40:07.805436 extend-filesystems[1557]: Found /dev/vda9 Jun 21 04:40:07.807317 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jun 21 04:40:07.809847 extend-filesystems[1557]: Checking size of /dev/vda9 Jun 21 04:40:07.807548 oslogin_cache_refresh[1558]: Failure getting groups, quitting Jun 21 04:40:07.811005 google_oslogin_nss_cache[1558]: oslogin_cache_refresh[1558]: Failure getting groups, quitting Jun 21 04:40:07.811005 google_oslogin_nss_cache[1558]: oslogin_cache_refresh[1558]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jun 21 04:40:07.810075 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jun 21 04:40:07.807563 oslogin_cache_refresh[1558]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jun 21 04:40:07.810435 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jun 21 04:40:07.811171 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jun 21 04:40:07.811497 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jun 21 04:40:07.815170 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jun 21 04:40:07.815470 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jun 21 04:40:07.826589 systemd[1]: motdgen.service: Deactivated successfully. Jun 21 04:40:07.828006 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Jun 21 04:40:07.837633 extend-filesystems[1557]: Resized partition /dev/vda9 Jun 21 04:40:07.840401 extend-filesystems[1598]: resize2fs 1.47.2 (1-Jan-2025) Jun 21 04:40:07.848019 update_engine[1570]: I20250621 04:40:07.847754 1570 main.cc:92] Flatcar Update Engine starting Jun 21 04:40:07.848830 (ntainerd)[1593]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jun 21 04:40:07.855412 kernel: kvm_amd: TSC scaling supported Jun 21 04:40:07.855453 kernel: kvm_amd: Nested Virtualization enabled Jun 21 04:40:07.855471 kernel: kvm_amd: Nested Paging enabled Jun 21 04:40:07.855497 kernel: kvm_amd: LBR virtualization supported Jun 21 04:40:07.855518 tar[1583]: linux-amd64/LICENSE Jun 21 04:40:07.855518 tar[1583]: linux-amd64/helm Jun 21 04:40:07.855803 jq[1577]: true Jun 21 04:40:07.857970 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jun 21 04:40:07.858333 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jun 21 04:40:07.863566 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported Jun 21 04:40:07.863615 kernel: kvm_amd: Virtual GIF supported Jun 21 04:40:07.869772 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Jun 21 04:40:07.874433 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jun 21 04:40:07.899782 dbus-daemon[1554]: [system] SELinux support is enabled Jun 21 04:40:07.900291 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jun 21 04:40:07.903938 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Jun 21 04:40:07.907398 jq[1600]: true Jun 21 04:40:07.908530 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). 
Jun 21 04:40:07.908596 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jun 21 04:40:07.930351 update_engine[1570]: I20250621 04:40:07.922175 1570 update_check_scheduler.cc:74] Next update check in 10m33s Jun 21 04:40:07.910911 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jun 21 04:40:07.910933 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jun 21 04:40:07.922408 systemd[1]: Started update-engine.service - Update Engine. Jun 21 04:40:07.929119 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jun 21 04:40:07.933802 sshd_keygen[1582]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jun 21 04:40:07.935203 extend-filesystems[1598]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jun 21 04:40:07.935203 extend-filesystems[1598]: old_desc_blocks = 1, new_desc_blocks = 1 Jun 21 04:40:07.935203 extend-filesystems[1598]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Jun 21 04:40:07.941127 extend-filesystems[1557]: Resized filesystem in /dev/vda9 Jun 21 04:40:07.943191 systemd[1]: extend-filesystems.service: Deactivated successfully. Jun 21 04:40:07.944357 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jun 21 04:40:07.979833 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jun 21 04:40:07.993921 systemd[1]: Starting issuegen.service - Generate /run/issue... Jun 21 04:40:08.013210 systemd[1]: issuegen.service: Deactivated successfully. Jun 21 04:40:08.013497 systemd[1]: Finished issuegen.service - Generate /run/issue. Jun 21 04:40:08.017964 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jun 21 04:40:08.044773 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Jun 21 04:40:08.046975 locksmithd[1610]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jun 21 04:40:08.066956 kernel: EDAC MC: Ver: 3.0.0 Jun 21 04:40:08.066358 systemd[1]: Started getty@tty1.service - Getty on tty1. Jun 21 04:40:08.076103 systemd-logind[1566]: Watching system buttons on /dev/input/event2 (Power Button) Jun 21 04:40:08.076128 systemd-logind[1566]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jun 21 04:40:08.077914 systemd-logind[1566]: New seat seat0. Jun 21 04:40:08.079992 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jun 21 04:40:08.094431 systemd[1]: Reached target getty.target - Login Prompts. Jun 21 04:40:08.095591 systemd[1]: Started systemd-logind.service - User Login Management. Jun 21 04:40:08.106361 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jun 21 04:40:08.150555 bash[1632]: Updated "/home/core/.ssh/authorized_keys" Jun 21 04:40:08.152781 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jun 21 04:40:08.154974 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Jun 21 04:40:08.178975 containerd[1593]: time="2025-06-21T04:40:08Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jun 21 04:40:08.181973 containerd[1593]: time="2025-06-21T04:40:08.181887743Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Jun 21 04:40:08.191053 containerd[1593]: time="2025-06-21T04:40:08.191006371Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.069µs" Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191162353Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191191277Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191377527Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191392475Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191415508Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191473016Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191482604Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jun 21 
04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191769392Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191780803Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191790612Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191798948Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192616 containerd[1593]: time="2025-06-21T04:40:08.191888696Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192984 containerd[1593]: time="2025-06-21T04:40:08.192113678Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192984 containerd[1593]: time="2025-06-21T04:40:08.192142121Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jun 21 04:40:08.192984 containerd[1593]: time="2025-06-21T04:40:08.192152180Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jun 21 04:40:08.192984 containerd[1593]: time="2025-06-21T04:40:08.192189500Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jun 21 04:40:08.192984 
containerd[1593]: time="2025-06-21T04:40:08.192377643Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jun 21 04:40:08.192984 containerd[1593]: time="2025-06-21T04:40:08.192445410Z" level=info msg="metadata content store policy set" policy=shared Jun 21 04:40:08.198403 containerd[1593]: time="2025-06-21T04:40:08.198359526Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jun 21 04:40:08.198448 containerd[1593]: time="2025-06-21T04:40:08.198425710Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jun 21 04:40:08.198448 containerd[1593]: time="2025-06-21T04:40:08.198440968Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jun 21 04:40:08.198484 containerd[1593]: time="2025-06-21T04:40:08.198453843Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jun 21 04:40:08.198484 containerd[1593]: time="2025-06-21T04:40:08.198467288Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jun 21 04:40:08.198484 containerd[1593]: time="2025-06-21T04:40:08.198478138Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jun 21 04:40:08.198534 containerd[1593]: time="2025-06-21T04:40:08.198491513Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jun 21 04:40:08.198534 containerd[1593]: time="2025-06-21T04:40:08.198505109Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jun 21 04:40:08.198534 containerd[1593]: time="2025-06-21T04:40:08.198519225Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jun 21 04:40:08.198534 containerd[1593]: 
time="2025-06-21T04:40:08.198529695Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jun 21 04:40:08.198614 containerd[1593]: time="2025-06-21T04:40:08.198539012Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jun 21 04:40:08.198614 containerd[1593]: time="2025-06-21T04:40:08.198555854Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jun 21 04:40:08.198743 containerd[1593]: time="2025-06-21T04:40:08.198705014Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jun 21 04:40:08.198773 containerd[1593]: time="2025-06-21T04:40:08.198764876Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jun 21 04:40:08.198794 containerd[1593]: time="2025-06-21T04:40:08.198779253Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jun 21 04:40:08.198794 containerd[1593]: time="2025-06-21T04:40:08.198790063Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jun 21 04:40:08.198828 containerd[1593]: time="2025-06-21T04:40:08.198811694Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jun 21 04:40:08.198828 containerd[1593]: time="2025-06-21T04:40:08.198822765Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jun 21 04:40:08.198868 containerd[1593]: time="2025-06-21T04:40:08.198835218Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jun 21 04:40:08.198868 containerd[1593]: time="2025-06-21T04:40:08.198850036Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jun 21 04:40:08.198868 containerd[1593]: time="2025-06-21T04:40:08.198861157Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jun 21 04:40:08.198938 containerd[1593]: time="2025-06-21T04:40:08.198884961Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jun 21 04:40:08.198938 containerd[1593]: time="2025-06-21T04:40:08.198896794Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jun 21 04:40:08.198973 containerd[1593]: time="2025-06-21T04:40:08.198959080Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jun 21 04:40:08.198973 containerd[1593]: time="2025-06-21T04:40:08.198971554Z" level=info msg="Start snapshots syncer" Jun 21 04:40:08.199016 containerd[1593]: time="2025-06-21T04:40:08.198995439Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jun 21 04:40:08.199272 containerd[1593]: time="2025-06-21T04:40:08.199232844Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jun 21 04:40:08.199272 containerd[1593]: time="2025-06-21T04:40:08.199282327Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jun 21 04:40:08.200026 containerd[1593]: time="2025-06-21T04:40:08.199980326Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jun 21 04:40:08.200116 containerd[1593]: time="2025-06-21T04:40:08.200092597Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jun 21 04:40:08.200152 containerd[1593]: time="2025-06-21T04:40:08.200120680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jun 21 04:40:08.200152 containerd[1593]: time="2025-06-21T04:40:08.200133634Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jun 21 04:40:08.200152 containerd[1593]: time="2025-06-21T04:40:08.200145166Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jun 21 04:40:08.200152 containerd[1593]: time="2025-06-21T04:40:08.200157719Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jun 21 04:40:08.200152 containerd[1593]: time="2025-06-21T04:40:08.200168479Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jun 21 04:40:08.200323 containerd[1593]: time="2025-06-21T04:40:08.200178959Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jun 21 04:40:08.200323 containerd[1593]: time="2025-06-21T04:40:08.200201642Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jun 21 04:40:08.200323 containerd[1593]: time="2025-06-21T04:40:08.200212722Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jun 21 04:40:08.200323 containerd[1593]: time="2025-06-21T04:40:08.200224805Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jun 21 04:40:08.200323 containerd[1593]: time="2025-06-21T04:40:08.200288765Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jun 21 04:40:08.200323 containerd[1593]: time="2025-06-21T04:40:08.200306088Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jun 21 04:40:08.200323 containerd[1593]: time="2025-06-21T04:40:08.200315655Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200385266Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200396617Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200407508Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200417236Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200435340Z" level=info msg="runtime interface created" Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200440690Z" level=info msg="created NRI interface" Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200453704Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200480916Z" level=info msg="Connect containerd service" Jun 21 04:40:08.200568 containerd[1593]: time="2025-06-21T04:40:08.200506013Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jun 21 04:40:08.201461 
containerd[1593]: time="2025-06-21T04:40:08.201420248Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jun 21 04:40:08.292966 containerd[1593]: time="2025-06-21T04:40:08.292880123Z" level=info msg="Start subscribing containerd event" Jun 21 04:40:08.293093 containerd[1593]: time="2025-06-21T04:40:08.292972807Z" level=info msg="Start recovering state" Jun 21 04:40:08.293121 containerd[1593]: time="2025-06-21T04:40:08.293086009Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jun 21 04:40:08.293147 containerd[1593]: time="2025-06-21T04:40:08.293108421Z" level=info msg="Start event monitor" Jun 21 04:40:08.293175 containerd[1593]: time="2025-06-21T04:40:08.293154758Z" level=info msg="Start cni network conf syncer for default" Jun 21 04:40:08.293175 containerd[1593]: time="2025-06-21T04:40:08.293165428Z" level=info msg="Start streaming server" Jun 21 04:40:08.293224 containerd[1593]: time="2025-06-21T04:40:08.293173543Z" level=info msg=serving... address=/run/containerd/containerd.sock Jun 21 04:40:08.293224 containerd[1593]: time="2025-06-21T04:40:08.293201516Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jun 21 04:40:08.293270 containerd[1593]: time="2025-06-21T04:40:08.293210793Z" level=info msg="runtime interface starting up..." Jun 21 04:40:08.293270 containerd[1593]: time="2025-06-21T04:40:08.293233426Z" level=info msg="starting plugins..." Jun 21 04:40:08.293270 containerd[1593]: time="2025-06-21T04:40:08.293248865Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jun 21 04:40:08.293541 containerd[1593]: time="2025-06-21T04:40:08.293518140Z" level=info msg="containerd successfully booted in 0.115074s" Jun 21 04:40:08.293662 systemd[1]: Started containerd.service - containerd container runtime. 
Jun 21 04:40:08.387193 tar[1583]: linux-amd64/README.md Jun 21 04:40:08.413194 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jun 21 04:40:08.641965 systemd-networkd[1495]: eth0: Gained IPv6LL Jun 21 04:40:08.645501 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jun 21 04:40:08.647314 systemd[1]: Reached target network-online.target - Network is Online. Jun 21 04:40:08.649993 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Jun 21 04:40:08.652417 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jun 21 04:40:08.654513 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jun 21 04:40:08.679621 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jun 21 04:40:08.692320 systemd[1]: coreos-metadata.service: Deactivated successfully. Jun 21 04:40:08.692588 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Jun 21 04:40:08.694157 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jun 21 04:40:09.355829 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jun 21 04:40:09.357591 systemd[1]: Reached target multi-user.target - Multi-User System. Jun 21 04:40:09.359027 systemd[1]: Startup finished in 2.899s (kernel) + 5.568s (initrd) + 3.765s (userspace) = 12.234s. 
Jun 21 04:40:09.362081 (kubelet)[1701]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jun 21 04:40:09.782299 kubelet[1701]: E0621 04:40:09.782143 1701 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jun 21 04:40:09.786629 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jun 21 04:40:09.786870 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jun 21 04:40:09.787263 systemd[1]: kubelet.service: Consumed 980ms CPU time, 266.4M memory peak. Jun 21 04:40:13.594154 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jun 21 04:40:13.595443 systemd[1]: Started sshd@0-10.0.0.63:22-10.0.0.1:60850.service - OpenSSH per-connection server daemon (10.0.0.1:60850). Jun 21 04:40:13.829472 sshd[1714]: Accepted publickey for core from 10.0.0.1 port 60850 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:40:13.831521 sshd-session[1714]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:40:13.838334 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jun 21 04:40:13.839584 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jun 21 04:40:13.846053 systemd-logind[1566]: New session 1 of user core. Jun 21 04:40:13.868289 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jun 21 04:40:13.871778 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Jun 21 04:40:13.891041 (systemd)[1718]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jun 21 04:40:13.893560 systemd-logind[1566]: New session c1 of user core. Jun 21 04:40:14.034398 systemd[1718]: Queued start job for default target default.target. Jun 21 04:40:14.048947 systemd[1718]: Created slice app.slice - User Application Slice. Jun 21 04:40:14.048971 systemd[1718]: Reached target paths.target - Paths. Jun 21 04:40:14.049015 systemd[1718]: Reached target timers.target - Timers. Jun 21 04:40:14.050477 systemd[1718]: Starting dbus.socket - D-Bus User Message Bus Socket... Jun 21 04:40:14.061588 systemd[1718]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jun 21 04:40:14.061762 systemd[1718]: Reached target sockets.target - Sockets. Jun 21 04:40:14.061807 systemd[1718]: Reached target basic.target - Basic System. Jun 21 04:40:14.061851 systemd[1718]: Reached target default.target - Main User Target. Jun 21 04:40:14.061887 systemd[1718]: Startup finished in 161ms. Jun 21 04:40:14.062263 systemd[1]: Started user@500.service - User Manager for UID 500. Jun 21 04:40:14.072868 systemd[1]: Started session-1.scope - Session 1 of User core. Jun 21 04:40:14.138635 systemd[1]: Started sshd@1-10.0.0.63:22-10.0.0.1:60852.service - OpenSSH per-connection server daemon (10.0.0.1:60852). Jun 21 04:40:14.182042 sshd[1729]: Accepted publickey for core from 10.0.0.1 port 60852 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:40:14.183601 sshd-session[1729]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:40:14.188347 systemd-logind[1566]: New session 2 of user core. Jun 21 04:40:14.197867 systemd[1]: Started session-2.scope - Session 2 of User core. 
Jun 21 04:40:14.251557 sshd[1731]: Connection closed by 10.0.0.1 port 60852 Jun 21 04:40:14.251966 sshd-session[1729]: pam_unix(sshd:session): session closed for user core Jun 21 04:40:14.271213 systemd[1]: sshd@1-10.0.0.63:22-10.0.0.1:60852.service: Deactivated successfully. Jun 21 04:40:14.272907 systemd[1]: session-2.scope: Deactivated successfully. Jun 21 04:40:14.273606 systemd-logind[1566]: Session 2 logged out. Waiting for processes to exit. Jun 21 04:40:14.276486 systemd[1]: Started sshd@2-10.0.0.63:22-10.0.0.1:60856.service - OpenSSH per-connection server daemon (10.0.0.1:60856). Jun 21 04:40:14.277037 systemd-logind[1566]: Removed session 2. Jun 21 04:40:14.325372 sshd[1737]: Accepted publickey for core from 10.0.0.1 port 60856 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:40:14.326854 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:40:14.332114 systemd-logind[1566]: New session 3 of user core. Jun 21 04:40:14.343830 systemd[1]: Started session-3.scope - Session 3 of User core. Jun 21 04:40:14.392792 sshd[1740]: Connection closed by 10.0.0.1 port 60856 Jun 21 04:40:14.392994 sshd-session[1737]: pam_unix(sshd:session): session closed for user core Jun 21 04:40:14.407437 systemd[1]: sshd@2-10.0.0.63:22-10.0.0.1:60856.service: Deactivated successfully. Jun 21 04:40:14.408958 systemd[1]: session-3.scope: Deactivated successfully. Jun 21 04:40:14.409592 systemd-logind[1566]: Session 3 logged out. Waiting for processes to exit. Jun 21 04:40:14.411964 systemd[1]: Started sshd@3-10.0.0.63:22-10.0.0.1:60872.service - OpenSSH per-connection server daemon (10.0.0.1:60872). Jun 21 04:40:14.412596 systemd-logind[1566]: Removed session 3. 
Jun 21 04:40:14.456214 sshd[1746]: Accepted publickey for core from 10.0.0.1 port 60872 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:40:14.457557 sshd-session[1746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:40:14.461850 systemd-logind[1566]: New session 4 of user core. Jun 21 04:40:14.476833 systemd[1]: Started session-4.scope - Session 4 of User core. Jun 21 04:40:14.529234 sshd[1749]: Connection closed by 10.0.0.1 port 60872 Jun 21 04:40:14.529621 sshd-session[1746]: pam_unix(sshd:session): session closed for user core Jun 21 04:40:14.541297 systemd[1]: sshd@3-10.0.0.63:22-10.0.0.1:60872.service: Deactivated successfully. Jun 21 04:40:14.542910 systemd[1]: session-4.scope: Deactivated successfully. Jun 21 04:40:14.543636 systemd-logind[1566]: Session 4 logged out. Waiting for processes to exit. Jun 21 04:40:14.546449 systemd[1]: Started sshd@4-10.0.0.63:22-10.0.0.1:60888.service - OpenSSH per-connection server daemon (10.0.0.1:60888). Jun 21 04:40:14.547187 systemd-logind[1566]: Removed session 4. Jun 21 04:40:14.591329 sshd[1755]: Accepted publickey for core from 10.0.0.1 port 60888 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:40:14.592530 sshd-session[1755]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:40:14.596559 systemd-logind[1566]: New session 5 of user core. Jun 21 04:40:14.610828 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jun 21 04:40:14.667787 sudo[1759]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jun 21 04:40:14.668074 sudo[1759]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jun 21 04:40:14.683788 sudo[1759]: pam_unix(sudo:session): session closed for user root Jun 21 04:40:14.685177 sshd[1758]: Connection closed by 10.0.0.1 port 60888 Jun 21 04:40:14.685480 sshd-session[1755]: pam_unix(sshd:session): session closed for user core Jun 21 04:40:14.695283 systemd[1]: sshd@4-10.0.0.63:22-10.0.0.1:60888.service: Deactivated successfully. Jun 21 04:40:14.697172 systemd[1]: session-5.scope: Deactivated successfully. Jun 21 04:40:14.697893 systemd-logind[1566]: Session 5 logged out. Waiting for processes to exit. Jun 21 04:40:14.700748 systemd[1]: Started sshd@5-10.0.0.63:22-10.0.0.1:60904.service - OpenSSH per-connection server daemon (10.0.0.1:60904). Jun 21 04:40:14.701431 systemd-logind[1566]: Removed session 5. Jun 21 04:40:14.754798 sshd[1765]: Accepted publickey for core from 10.0.0.1 port 60904 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:40:14.756316 sshd-session[1765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:40:14.760677 systemd-logind[1566]: New session 6 of user core. Jun 21 04:40:14.775843 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jun 21 04:40:14.827301 sudo[1769]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jun 21 04:40:14.827585 sudo[1769]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jun 21 04:40:15.057227 sudo[1769]: pam_unix(sudo:session): session closed for user root Jun 21 04:40:15.063668 sudo[1768]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jun 21 04:40:15.064060 sudo[1768]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jun 21 04:40:15.074147 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jun 21 04:40:15.124490 augenrules[1791]: No rules Jun 21 04:40:15.126292 systemd[1]: audit-rules.service: Deactivated successfully. Jun 21 04:40:15.126568 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jun 21 04:40:15.127846 sudo[1768]: pam_unix(sudo:session): session closed for user root Jun 21 04:40:15.129310 sshd[1767]: Connection closed by 10.0.0.1 port 60904 Jun 21 04:40:15.129572 sshd-session[1765]: pam_unix(sshd:session): session closed for user core Jun 21 04:40:15.140218 systemd[1]: sshd@5-10.0.0.63:22-10.0.0.1:60904.service: Deactivated successfully. Jun 21 04:40:15.141801 systemd[1]: session-6.scope: Deactivated successfully. Jun 21 04:40:15.142441 systemd-logind[1566]: Session 6 logged out. Waiting for processes to exit. Jun 21 04:40:15.144980 systemd[1]: Started sshd@6-10.0.0.63:22-10.0.0.1:60908.service - OpenSSH per-connection server daemon (10.0.0.1:60908). Jun 21 04:40:15.145706 systemd-logind[1566]: Removed session 6. Jun 21 04:40:15.189517 sshd[1800]: Accepted publickey for core from 10.0.0.1 port 60908 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:40:15.190886 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:40:15.195463 systemd-logind[1566]: New session 7 of user core. 
Jun 21 04:40:15.208848 systemd[1]: Started session-7.scope - Session 7 of User core. Jun 21 04:40:15.263209 sudo[1803]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jun 21 04:40:15.263583 sudo[1803]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jun 21 04:40:15.562442 systemd[1]: Starting docker.service - Docker Application Container Engine... Jun 21 04:40:15.580077 (dockerd)[1823]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jun 21 04:40:15.814729 dockerd[1823]: time="2025-06-21T04:40:15.814563949Z" level=info msg="Starting up" Jun 21 04:40:15.816300 dockerd[1823]: time="2025-06-21T04:40:15.816267705Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jun 21 04:40:15.984533 dockerd[1823]: time="2025-06-21T04:40:15.984471738Z" level=info msg="Loading containers: start." Jun 21 04:40:15.994744 kernel: Initializing XFRM netlink socket Jun 21 04:40:16.355809 systemd-networkd[1495]: docker0: Link UP Jun 21 04:40:16.460562 dockerd[1823]: time="2025-06-21T04:40:16.460513156Z" level=info msg="Loading containers: done." 
Jun 21 04:40:16.479641 dockerd[1823]: time="2025-06-21T04:40:16.479597629Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jun 21 04:40:16.479789 dockerd[1823]: time="2025-06-21T04:40:16.479678261Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Jun 21 04:40:16.479823 dockerd[1823]: time="2025-06-21T04:40:16.479811451Z" level=info msg="Initializing buildkit" Jun 21 04:40:16.507330 dockerd[1823]: time="2025-06-21T04:40:16.507287207Z" level=info msg="Completed buildkit initialization" Jun 21 04:40:16.513255 dockerd[1823]: time="2025-06-21T04:40:16.513215168Z" level=info msg="Daemon has completed initialization" Jun 21 04:40:16.513334 dockerd[1823]: time="2025-06-21T04:40:16.513271684Z" level=info msg="API listen on /run/docker.sock" Jun 21 04:40:16.513429 systemd[1]: Started docker.service - Docker Application Container Engine. Jun 21 04:40:17.134848 containerd[1593]: time="2025-06-21T04:40:17.134804023Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.2\"" Jun 21 04:40:17.782100 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2879586127.mount: Deactivated successfully. 
Jun 21 04:40:18.958191 containerd[1593]: time="2025-06-21T04:40:18.958120285Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:18.959016 containerd[1593]: time="2025-06-21T04:40:18.958950903Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.2: active requests=0, bytes read=30079099" Jun 21 04:40:18.960163 containerd[1593]: time="2025-06-21T04:40:18.960095010Z" level=info msg="ImageCreate event name:\"sha256:ee794efa53d856b7e291320be3cd6390fa2e113c3f258a21290bc27fc214233e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:18.962770 containerd[1593]: time="2025-06-21T04:40:18.962732868Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:e8ae58675899e946fabe38425f2b3bfd33120b7930d05b5898de97c81a7f6137\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:18.963529 containerd[1593]: time="2025-06-21T04:40:18.963501711Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.2\" with image id \"sha256:ee794efa53d856b7e291320be3cd6390fa2e113c3f258a21290bc27fc214233e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:e8ae58675899e946fabe38425f2b3bfd33120b7930d05b5898de97c81a7f6137\", size \"30075899\" in 1.828653074s" Jun 21 04:40:18.963577 containerd[1593]: time="2025-06-21T04:40:18.963532689Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.2\" returns image reference \"sha256:ee794efa53d856b7e291320be3cd6390fa2e113c3f258a21290bc27fc214233e\"" Jun 21 04:40:18.964208 containerd[1593]: time="2025-06-21T04:40:18.964035973Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.2\"" Jun 21 04:40:20.037243 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Jun 21 04:40:20.039131 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jun 21 04:40:20.266262 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jun 21 04:40:20.270212 (kubelet)[2098]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jun 21 04:40:20.377825 kubelet[2098]: E0621 04:40:20.377685 2098 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jun 21 04:40:20.385348 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jun 21 04:40:20.385560 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jun 21 04:40:20.386204 systemd[1]: kubelet.service: Consumed 314ms CPU time, 112.9M memory peak. 
Jun 21 04:40:20.831686 containerd[1593]: time="2025-06-21T04:40:20.831555243Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:20.832371 containerd[1593]: time="2025-06-21T04:40:20.832344924Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.2: active requests=0, bytes read=26018946" Jun 21 04:40:20.833734 containerd[1593]: time="2025-06-21T04:40:20.833701249Z" level=info msg="ImageCreate event name:\"sha256:ff4f56c76b82d6cda0555115a0fe479d5dd612264b85efb9cc14b1b4b937bdf2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:20.836156 containerd[1593]: time="2025-06-21T04:40:20.836126489Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:2236e72a4be5dcc9c04600353ff8849db1557f5364947c520ff05471ae719081\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:20.836962 containerd[1593]: time="2025-06-21T04:40:20.836928173Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.2\" with image id \"sha256:ff4f56c76b82d6cda0555115a0fe479d5dd612264b85efb9cc14b1b4b937bdf2\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:2236e72a4be5dcc9c04600353ff8849db1557f5364947c520ff05471ae719081\", size \"27646507\" in 1.872856963s" Jun 21 04:40:20.837008 containerd[1593]: time="2025-06-21T04:40:20.836962798Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.2\" returns image reference \"sha256:ff4f56c76b82d6cda0555115a0fe479d5dd612264b85efb9cc14b1b4b937bdf2\"" Jun 21 04:40:20.837492 containerd[1593]: time="2025-06-21T04:40:20.837460672Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.2\"" Jun 21 04:40:22.345667 containerd[1593]: time="2025-06-21T04:40:22.345606110Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:22.346634 containerd[1593]: time="2025-06-21T04:40:22.346603131Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.2: active requests=0, bytes read=20155055" Jun 21 04:40:22.347917 containerd[1593]: time="2025-06-21T04:40:22.347890185Z" level=info msg="ImageCreate event name:\"sha256:cfed1ff7489289d4e8d796b0d95fd251990403510563cf843912f42ab9718a7b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:22.350700 containerd[1593]: time="2025-06-21T04:40:22.350637048Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:304c28303133be7d927973bc9bd6c83945b3735c59d283c25b63d5b9ed53bca3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:22.351726 containerd[1593]: time="2025-06-21T04:40:22.351682219Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.2\" with image id \"sha256:cfed1ff7489289d4e8d796b0d95fd251990403510563cf843912f42ab9718a7b\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:304c28303133be7d927973bc9bd6c83945b3735c59d283c25b63d5b9ed53bca3\", size \"21782634\" in 1.514184789s" Jun 21 04:40:22.351758 containerd[1593]: time="2025-06-21T04:40:22.351737323Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.2\" returns image reference \"sha256:cfed1ff7489289d4e8d796b0d95fd251990403510563cf843912f42ab9718a7b\"" Jun 21 04:40:22.352282 containerd[1593]: time="2025-06-21T04:40:22.352216842Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.2\"" Jun 21 04:40:23.260576 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2944329789.mount: Deactivated successfully. 
Jun 21 04:40:23.553423 containerd[1593]: time="2025-06-21T04:40:23.553303497Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:23.554135 containerd[1593]: time="2025-06-21T04:40:23.554109258Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.2: active requests=0, bytes read=31892746" Jun 21 04:40:23.555644 containerd[1593]: time="2025-06-21T04:40:23.555572574Z" level=info msg="ImageCreate event name:\"sha256:661d404f36f01cd854403fd3540f18dcf0342d22bd9c6516bb9de234ac183b19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:23.558730 containerd[1593]: time="2025-06-21T04:40:23.558671367Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.2\" with image id \"sha256:661d404f36f01cd854403fd3540f18dcf0342d22bd9c6516bb9de234ac183b19\", repo tag \"registry.k8s.io/kube-proxy:v1.33.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:4796ef3e43efa5ed2a5b015c18f81d3c2fe3aea36f555ea643cc01827eb65e51\", size \"31891765\" in 1.206417094s" Jun 21 04:40:23.558730 containerd[1593]: time="2025-06-21T04:40:23.558728615Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.2\" returns image reference \"sha256:661d404f36f01cd854403fd3540f18dcf0342d22bd9c6516bb9de234ac183b19\"" Jun 21 04:40:23.558870 containerd[1593]: time="2025-06-21T04:40:23.558772847Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:4796ef3e43efa5ed2a5b015c18f81d3c2fe3aea36f555ea643cc01827eb65e51\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:23.559300 containerd[1593]: time="2025-06-21T04:40:23.559277033Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jun 21 04:40:24.103432 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3161123335.mount: Deactivated successfully. 
Jun 21 04:40:25.232381 containerd[1593]: time="2025-06-21T04:40:25.232304422Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:25.233323 containerd[1593]: time="2025-06-21T04:40:25.233244666Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238" Jun 21 04:40:25.234768 containerd[1593]: time="2025-06-21T04:40:25.234683916Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:25.237529 containerd[1593]: time="2025-06-21T04:40:25.237474041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:25.238585 containerd[1593]: time="2025-06-21T04:40:25.238531124Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.679221349s" Jun 21 04:40:25.238585 containerd[1593]: time="2025-06-21T04:40:25.238569887Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" Jun 21 04:40:25.239100 containerd[1593]: time="2025-06-21T04:40:25.239061168Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jun 21 04:40:25.810927 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3138695674.mount: Deactivated successfully. 
Jun 21 04:40:25.818338 containerd[1593]: time="2025-06-21T04:40:25.818274875Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jun 21 04:40:25.821432 containerd[1593]: time="2025-06-21T04:40:25.821387855Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Jun 21 04:40:25.822690 containerd[1593]: time="2025-06-21T04:40:25.822649412Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jun 21 04:40:25.825172 containerd[1593]: time="2025-06-21T04:40:25.825115819Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jun 21 04:40:25.825724 containerd[1593]: time="2025-06-21T04:40:25.825684986Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 586.595826ms" Jun 21 04:40:25.825763 containerd[1593]: time="2025-06-21T04:40:25.825741723Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jun 21 04:40:25.826267 containerd[1593]: time="2025-06-21T04:40:25.826240429Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jun 21 04:40:26.261791 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2331074458.mount: 
Deactivated successfully. Jun 21 04:40:29.190363 containerd[1593]: time="2025-06-21T04:40:29.190277562Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:29.191084 containerd[1593]: time="2025-06-21T04:40:29.191048769Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58247175" Jun 21 04:40:29.192418 containerd[1593]: time="2025-06-21T04:40:29.192384104Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:29.195399 containerd[1593]: time="2025-06-21T04:40:29.195368473Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:29.196910 containerd[1593]: time="2025-06-21T04:40:29.196865923Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 3.370594035s" Jun 21 04:40:29.196910 containerd[1593]: time="2025-06-21T04:40:29.196905817Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" Jun 21 04:40:30.589286 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jun 21 04:40:30.591040 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jun 21 04:40:30.801774 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jun 21 04:40:30.822235 (kubelet)[2261]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jun 21 04:40:30.917503 kubelet[2261]: E0621 04:40:30.917337 2261 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jun 21 04:40:30.922534 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jun 21 04:40:30.922810 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jun 21 04:40:30.923449 systemd[1]: kubelet.service: Consumed 274ms CPU time, 110.4M memory peak. Jun 21 04:40:32.238359 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jun 21 04:40:32.238554 systemd[1]: kubelet.service: Consumed 274ms CPU time, 110.4M memory peak. Jun 21 04:40:32.241125 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jun 21 04:40:32.270024 systemd[1]: Reload requested from client PID 2277 ('systemctl') (unit session-7.scope)... Jun 21 04:40:32.270041 systemd[1]: Reloading... Jun 21 04:40:32.352748 zram_generator::config[2324]: No configuration found. Jun 21 04:40:32.750516 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jun 21 04:40:32.867724 systemd[1]: Reloading finished in 597 ms. Jun 21 04:40:32.930454 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jun 21 04:40:32.930549 systemd[1]: kubelet.service: Failed with result 'signal'. Jun 21 04:40:32.930855 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jun 21 04:40:32.930900 systemd[1]: kubelet.service: Consumed 158ms CPU time, 98.2M memory peak. Jun 21 04:40:32.932562 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jun 21 04:40:33.108220 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jun 21 04:40:33.123105 (kubelet)[2368]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jun 21 04:40:33.170760 kubelet[2368]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jun 21 04:40:33.170760 kubelet[2368]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jun 21 04:40:33.170760 kubelet[2368]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jun 21 04:40:33.171231 kubelet[2368]: I0621 04:40:33.170803 2368 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jun 21 04:40:33.638463 kubelet[2368]: I0621 04:40:33.638413 2368 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jun 21 04:40:33.638621 kubelet[2368]: I0621 04:40:33.638445 2368 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jun 21 04:40:33.638973 kubelet[2368]: I0621 04:40:33.638887 2368 server.go:956] "Client rotation is on, will bootstrap in background" Jun 21 04:40:33.664385 kubelet[2368]: I0621 04:40:33.664208 2368 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jun 21 04:40:33.665459 kubelet[2368]: E0621 04:40:33.665427 2368 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.63:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jun 21 04:40:33.670165 kubelet[2368]: I0621 04:40:33.670144 2368 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jun 21 04:40:33.677326 kubelet[2368]: I0621 04:40:33.677275 2368 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jun 21 04:40:33.677638 kubelet[2368]: I0621 04:40:33.677595 2368 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jun 21 04:40:33.677826 kubelet[2368]: I0621 04:40:33.677624 2368 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jun 21 04:40:33.677826 kubelet[2368]: I0621 04:40:33.677821 2368 topology_manager.go:138] "Creating topology manager with none policy" Jun 21 04:40:33.678108 
kubelet[2368]: I0621 04:40:33.677832 2368 container_manager_linux.go:303] "Creating device plugin manager" Jun 21 04:40:33.678674 kubelet[2368]: I0621 04:40:33.678640 2368 state_mem.go:36] "Initialized new in-memory state store" Jun 21 04:40:33.681062 kubelet[2368]: I0621 04:40:33.681027 2368 kubelet.go:480] "Attempting to sync node with API server" Jun 21 04:40:33.681062 kubelet[2368]: I0621 04:40:33.681055 2368 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jun 21 04:40:33.681151 kubelet[2368]: I0621 04:40:33.681091 2368 kubelet.go:386] "Adding apiserver pod source" Jun 21 04:40:33.681151 kubelet[2368]: I0621 04:40:33.681116 2368 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jun 21 04:40:33.685086 kubelet[2368]: I0621 04:40:33.685056 2368 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jun 21 04:40:33.685493 kubelet[2368]: I0621 04:40:33.685461 2368 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jun 21 04:40:33.685997 kubelet[2368]: W0621 04:40:33.685971 2368 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Jun 21 04:40:33.716735 kubelet[2368]: E0621 04:40:33.716456 2368 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.63:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jun 21 04:40:33.716735 kubelet[2368]: E0621 04:40:33.716642 2368 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jun 21 04:40:33.719129 kubelet[2368]: I0621 04:40:33.719078 2368 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jun 21 04:40:33.719295 kubelet[2368]: I0621 04:40:33.719175 2368 server.go:1289] "Started kubelet" Jun 21 04:40:33.722756 kubelet[2368]: I0621 04:40:33.721905 2368 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jun 21 04:40:33.725401 kubelet[2368]: I0621 04:40:33.725382 2368 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jun 21 04:40:33.726094 kubelet[2368]: I0621 04:40:33.726073 2368 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jun 21 04:40:33.727910 kubelet[2368]: I0621 04:40:33.727892 2368 volume_manager.go:297] "Starting Kubelet Volume Manager" Jun 21 04:40:33.728692 kubelet[2368]: I0621 04:40:33.728675 2368 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jun 21 04:40:33.729904 kubelet[2368]: I0621 04:40:33.729823 2368 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jun 21 04:40:33.730252 kubelet[2368]: I0621 04:40:33.730225 2368 server.go:255] 
"Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jun 21 04:40:33.730490 kubelet[2368]: E0621 04:40:33.730467 2368 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:33.730669 kubelet[2368]: I0621 04:40:33.730632 2368 server.go:317] "Adding debug handlers to kubelet server" Jun 21 04:40:33.731138 kubelet[2368]: E0621 04:40:33.731110 2368 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jun 21 04:40:33.731341 kubelet[2368]: E0621 04:40:33.731289 2368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.63:6443: connect: connection refused" interval="200ms" Jun 21 04:40:33.732769 kubelet[2368]: I0621 04:40:33.731758 2368 reconciler.go:26] "Reconciler: start to sync state" Jun 21 04:40:33.732769 kubelet[2368]: I0621 04:40:33.732602 2368 factory.go:223] Registration of the systemd container factory successfully Jun 21 04:40:33.732769 kubelet[2368]: I0621 04:40:33.732691 2368 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jun 21 04:40:33.732989 kubelet[2368]: E0621 04:40:33.731436 2368 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.63:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.63:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.184af50b37f4f42b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-06-21 04:40:33.719104555 +0000 UTC m=+0.591840807,LastTimestamp:2025-06-21 04:40:33.719104555 +0000 UTC m=+0.591840807,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jun 21 04:40:33.734351 kubelet[2368]: E0621 04:40:33.734312 2368 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jun 21 04:40:33.734494 kubelet[2368]: I0621 04:40:33.734421 2368 factory.go:223] Registration of the containerd container factory successfully Jun 21 04:40:33.745728 kubelet[2368]: I0621 04:40:33.745671 2368 cpu_manager.go:221] "Starting CPU manager" policy="none" Jun 21 04:40:33.745728 kubelet[2368]: I0621 04:40:33.745692 2368 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jun 21 04:40:33.745728 kubelet[2368]: I0621 04:40:33.745725 2368 state_mem.go:36] "Initialized new in-memory state store" Jun 21 04:40:33.831237 kubelet[2368]: E0621 04:40:33.831165 2368 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:33.931921 kubelet[2368]: E0621 04:40:33.931767 2368 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:33.932336 kubelet[2368]: E0621 04:40:33.932283 2368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.63:6443: connect: connection refused" interval="400ms" Jun 21 04:40:34.032224 kubelet[2368]: E0621 04:40:34.032132 2368 kubelet_node_status.go:466] 
"Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:34.133243 kubelet[2368]: E0621 04:40:34.133195 2368 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:34.234465 kubelet[2368]: E0621 04:40:34.234319 2368 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:34.271768 kubelet[2368]: I0621 04:40:34.271555 2368 policy_none.go:49] "None policy: Start" Jun 21 04:40:34.271768 kubelet[2368]: I0621 04:40:34.271610 2368 memory_manager.go:186] "Starting memorymanager" policy="None" Jun 21 04:40:34.271768 kubelet[2368]: I0621 04:40:34.271628 2368 state_mem.go:35] "Initializing new in-memory state store" Jun 21 04:40:34.275492 kubelet[2368]: I0621 04:40:34.275419 2368 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jun 21 04:40:34.276559 kubelet[2368]: I0621 04:40:34.276540 2368 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jun 21 04:40:34.276559 kubelet[2368]: I0621 04:40:34.276561 2368 status_manager.go:230] "Starting to sync pod status with apiserver" Jun 21 04:40:34.276747 kubelet[2368]: I0621 04:40:34.276583 2368 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jun 21 04:40:34.276747 kubelet[2368]: I0621 04:40:34.276589 2368 kubelet.go:2436] "Starting kubelet main sync loop" Jun 21 04:40:34.276747 kubelet[2368]: E0621 04:40:34.276626 2368 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jun 21 04:40:34.278863 kubelet[2368]: E0621 04:40:34.278742 2368 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.63:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jun 21 04:40:34.281828 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jun 21 04:40:34.301541 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jun 21 04:40:34.305018 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jun 21 04:40:34.319749 kubelet[2368]: E0621 04:40:34.319679 2368 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jun 21 04:40:34.320218 kubelet[2368]: I0621 04:40:34.319921 2368 eviction_manager.go:189] "Eviction manager: starting control loop" Jun 21 04:40:34.320218 kubelet[2368]: I0621 04:40:34.319932 2368 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jun 21 04:40:34.320308 kubelet[2368]: I0621 04:40:34.320260 2368 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jun 21 04:40:34.320920 kubelet[2368]: E0621 04:40:34.320892 2368 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jun 21 04:40:34.320976 kubelet[2368]: E0621 04:40:34.320966 2368 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jun 21 04:40:34.333186 kubelet[2368]: E0621 04:40:34.333126 2368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.63:6443: connect: connection refused" interval="800ms" Jun 21 04:40:34.389592 systemd[1]: Created slice kubepods-burstable-pod17a5c9d02a3e3707afb9d5e98d63f4b7.slice - libcontainer container kubepods-burstable-pod17a5c9d02a3e3707afb9d5e98d63f4b7.slice. Jun 21 04:40:34.410446 kubelet[2368]: E0621 04:40:34.410420 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:34.413449 systemd[1]: Created slice kubepods-burstable-pod84b858ec27c8b2738b1d9ff9927e0dcb.slice - libcontainer container kubepods-burstable-pod84b858ec27c8b2738b1d9ff9927e0dcb.slice. Jun 21 04:40:34.416458 kubelet[2368]: E0621 04:40:34.416422 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:34.418129 systemd[1]: Created slice kubepods-burstable-pod834ee54f1daa06092e339273649eb5ea.slice - libcontainer container kubepods-burstable-pod834ee54f1daa06092e339273649eb5ea.slice. 
Jun 21 04:40:34.419842 kubelet[2368]: E0621 04:40:34.419819 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:34.421896 kubelet[2368]: I0621 04:40:34.421834 2368 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jun 21 04:40:34.424091 kubelet[2368]: E0621 04:40:34.422206 2368 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.63:6443/api/v1/nodes\": dial tcp 10.0.0.63:6443: connect: connection refused" node="localhost" Jun 21 04:40:34.436350 kubelet[2368]: I0621 04:40:34.436300 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/17a5c9d02a3e3707afb9d5e98d63f4b7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"17a5c9d02a3e3707afb9d5e98d63f4b7\") " pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:34.436350 kubelet[2368]: I0621 04:40:34.436330 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/17a5c9d02a3e3707afb9d5e98d63f4b7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"17a5c9d02a3e3707afb9d5e98d63f4b7\") " pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:34.436350 kubelet[2368]: I0621 04:40:34.436350 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:34.436350 kubelet[2368]: I0621 04:40:34.436366 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/834ee54f1daa06092e339273649eb5ea-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"834ee54f1daa06092e339273649eb5ea\") " pod="kube-system/kube-scheduler-localhost" Jun 21 04:40:34.436577 kubelet[2368]: I0621 04:40:34.436382 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/17a5c9d02a3e3707afb9d5e98d63f4b7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"17a5c9d02a3e3707afb9d5e98d63f4b7\") " pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:34.436577 kubelet[2368]: I0621 04:40:34.436434 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:34.436577 kubelet[2368]: I0621 04:40:34.436467 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:34.436577 kubelet[2368]: I0621 04:40:34.436486 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:34.436577 kubelet[2368]: I0621 04:40:34.436508 2368 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:34.626069 kubelet[2368]: I0621 04:40:34.626040 2368 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jun 21 04:40:34.626430 kubelet[2368]: E0621 04:40:34.626390 2368 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.63:6443/api/v1/nodes\": dial tcp 10.0.0.63:6443: connect: connection refused" node="localhost" Jun 21 04:40:34.660346 kubelet[2368]: E0621 04:40:34.660303 2368 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.63:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jun 21 04:40:34.711891 containerd[1593]: time="2025-06-21T04:40:34.711851219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:17a5c9d02a3e3707afb9d5e98d63f4b7,Namespace:kube-system,Attempt:0,}" Jun 21 04:40:34.717772 containerd[1593]: time="2025-06-21T04:40:34.717744074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:84b858ec27c8b2738b1d9ff9927e0dcb,Namespace:kube-system,Attempt:0,}" Jun 21 04:40:34.721536 containerd[1593]: time="2025-06-21T04:40:34.721486265Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:834ee54f1daa06092e339273649eb5ea,Namespace:kube-system,Attempt:0,}" Jun 21 04:40:34.767995 containerd[1593]: time="2025-06-21T04:40:34.767943632Z" level=info msg="connecting to shim 86014004b6c07ac833112c30e5953401270a628bb9fcf2114aa3f4cd5a5918b0" 
address="unix:///run/containerd/s/730d74db2ffb6a7049791bdf1f989e8df38743fa3d3e8d105fca25695d4b9ad8" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:40:34.789896 containerd[1593]: time="2025-06-21T04:40:34.789839109Z" level=info msg="connecting to shim d133341ab4113e7ffdaabf20e6de9e56a226504b08366fd24c9a3bf10baa66a9" address="unix:///run/containerd/s/d12d339c4a661bcefc70569357c737dc0b10c74ebbce55e8af3c7b1173e6351e" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:40:34.796010 containerd[1593]: time="2025-06-21T04:40:34.795945886Z" level=info msg="connecting to shim 4cabc325d2d0e15b16d5d6a4d7d078a704105e7de9b57c444da2f0ab28bf334c" address="unix:///run/containerd/s/d94e126b91a4b9409b3627b4da6ecdd781538384a81fd33f22335f3680a0aac3" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:40:34.822121 systemd[1]: Started cri-containerd-86014004b6c07ac833112c30e5953401270a628bb9fcf2114aa3f4cd5a5918b0.scope - libcontainer container 86014004b6c07ac833112c30e5953401270a628bb9fcf2114aa3f4cd5a5918b0. Jun 21 04:40:34.835856 systemd[1]: Started cri-containerd-d133341ab4113e7ffdaabf20e6de9e56a226504b08366fd24c9a3bf10baa66a9.scope - libcontainer container d133341ab4113e7ffdaabf20e6de9e56a226504b08366fd24c9a3bf10baa66a9. Jun 21 04:40:34.840858 systemd[1]: Started cri-containerd-4cabc325d2d0e15b16d5d6a4d7d078a704105e7de9b57c444da2f0ab28bf334c.scope - libcontainer container 4cabc325d2d0e15b16d5d6a4d7d078a704105e7de9b57c444da2f0ab28bf334c. 
Jun 21 04:40:34.903887 kubelet[2368]: E0621 04:40:34.903698 2368 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.63:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jun 21 04:40:34.938370 containerd[1593]: time="2025-06-21T04:40:34.938323086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:17a5c9d02a3e3707afb9d5e98d63f4b7,Namespace:kube-system,Attempt:0,} returns sandbox id \"86014004b6c07ac833112c30e5953401270a628bb9fcf2114aa3f4cd5a5918b0\"" Jun 21 04:40:34.942089 containerd[1593]: time="2025-06-21T04:40:34.942037725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:834ee54f1daa06092e339273649eb5ea,Namespace:kube-system,Attempt:0,} returns sandbox id \"4cabc325d2d0e15b16d5d6a4d7d078a704105e7de9b57c444da2f0ab28bf334c\"" Jun 21 04:40:34.949324 containerd[1593]: time="2025-06-21T04:40:34.949260966Z" level=info msg="CreateContainer within sandbox \"86014004b6c07ac833112c30e5953401270a628bb9fcf2114aa3f4cd5a5918b0\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jun 21 04:40:34.949559 containerd[1593]: time="2025-06-21T04:40:34.949471982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:84b858ec27c8b2738b1d9ff9927e0dcb,Namespace:kube-system,Attempt:0,} returns sandbox id \"d133341ab4113e7ffdaabf20e6de9e56a226504b08366fd24c9a3bf10baa66a9\"" Jun 21 04:40:34.951189 containerd[1593]: time="2025-06-21T04:40:34.951130824Z" level=info msg="CreateContainer within sandbox \"4cabc325d2d0e15b16d5d6a4d7d078a704105e7de9b57c444da2f0ab28bf334c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jun 21 04:40:34.955752 containerd[1593]: time="2025-06-21T04:40:34.955692973Z" level=info 
msg="CreateContainer within sandbox \"d133341ab4113e7ffdaabf20e6de9e56a226504b08366fd24c9a3bf10baa66a9\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jun 21 04:40:34.961533 containerd[1593]: time="2025-06-21T04:40:34.961504185Z" level=info msg="Container 62154c5468553d75d4720913045322df616dd4929d55e5e545e314a5a98b0519: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:40:34.969135 containerd[1593]: time="2025-06-21T04:40:34.969102961Z" level=info msg="Container 8d5300cc7e0d161b7d3cd989944755d389305fec4ff75109fe61204311c07ecf: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:40:34.973824 containerd[1593]: time="2025-06-21T04:40:34.973786207Z" level=info msg="Container 829af6939d7001d57f216555c57f16b75e6c5d25b1b864a5d9ecde9c0b479161: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:40:34.974237 containerd[1593]: time="2025-06-21T04:40:34.974212237Z" level=info msg="CreateContainer within sandbox \"86014004b6c07ac833112c30e5953401270a628bb9fcf2114aa3f4cd5a5918b0\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"62154c5468553d75d4720913045322df616dd4929d55e5e545e314a5a98b0519\"" Jun 21 04:40:34.975271 containerd[1593]: time="2025-06-21T04:40:34.975237861Z" level=info msg="StartContainer for \"62154c5468553d75d4720913045322df616dd4929d55e5e545e314a5a98b0519\"" Jun 21 04:40:34.976348 containerd[1593]: time="2025-06-21T04:40:34.976307407Z" level=info msg="connecting to shim 62154c5468553d75d4720913045322df616dd4929d55e5e545e314a5a98b0519" address="unix:///run/containerd/s/730d74db2ffb6a7049791bdf1f989e8df38743fa3d3e8d105fca25695d4b9ad8" protocol=ttrpc version=3 Jun 21 04:40:34.978823 containerd[1593]: time="2025-06-21T04:40:34.978793862Z" level=info msg="CreateContainer within sandbox \"4cabc325d2d0e15b16d5d6a4d7d078a704105e7de9b57c444da2f0ab28bf334c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8d5300cc7e0d161b7d3cd989944755d389305fec4ff75109fe61204311c07ecf\"" Jun 
21 04:40:34.979442 containerd[1593]: time="2025-06-21T04:40:34.979219961Z" level=info msg="StartContainer for \"8d5300cc7e0d161b7d3cd989944755d389305fec4ff75109fe61204311c07ecf\"" Jun 21 04:40:34.980289 containerd[1593]: time="2025-06-21T04:40:34.980259812Z" level=info msg="connecting to shim 8d5300cc7e0d161b7d3cd989944755d389305fec4ff75109fe61204311c07ecf" address="unix:///run/containerd/s/d94e126b91a4b9409b3627b4da6ecdd781538384a81fd33f22335f3680a0aac3" protocol=ttrpc version=3 Jun 21 04:40:34.984190 containerd[1593]: time="2025-06-21T04:40:34.984152014Z" level=info msg="CreateContainer within sandbox \"d133341ab4113e7ffdaabf20e6de9e56a226504b08366fd24c9a3bf10baa66a9\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"829af6939d7001d57f216555c57f16b75e6c5d25b1b864a5d9ecde9c0b479161\"" Jun 21 04:40:34.984549 containerd[1593]: time="2025-06-21T04:40:34.984523872Z" level=info msg="StartContainer for \"829af6939d7001d57f216555c57f16b75e6c5d25b1b864a5d9ecde9c0b479161\"" Jun 21 04:40:34.985687 containerd[1593]: time="2025-06-21T04:40:34.985635928Z" level=info msg="connecting to shim 829af6939d7001d57f216555c57f16b75e6c5d25b1b864a5d9ecde9c0b479161" address="unix:///run/containerd/s/d12d339c4a661bcefc70569357c737dc0b10c74ebbce55e8af3c7b1173e6351e" protocol=ttrpc version=3 Jun 21 04:40:35.014950 systemd[1]: Started cri-containerd-62154c5468553d75d4720913045322df616dd4929d55e5e545e314a5a98b0519.scope - libcontainer container 62154c5468553d75d4720913045322df616dd4929d55e5e545e314a5a98b0519. Jun 21 04:40:35.018847 systemd[1]: Started cri-containerd-8d5300cc7e0d161b7d3cd989944755d389305fec4ff75109fe61204311c07ecf.scope - libcontainer container 8d5300cc7e0d161b7d3cd989944755d389305fec4ff75109fe61204311c07ecf. 
Jun 21 04:40:35.027764 kubelet[2368]: I0621 04:40:35.027733 2368 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jun 21 04:40:35.028188 kubelet[2368]: E0621 04:40:35.028145 2368 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.63:6443/api/v1/nodes\": dial tcp 10.0.0.63:6443: connect: connection refused" node="localhost" Jun 21 04:40:35.033934 systemd[1]: Started cri-containerd-829af6939d7001d57f216555c57f16b75e6c5d25b1b864a5d9ecde9c0b479161.scope - libcontainer container 829af6939d7001d57f216555c57f16b75e6c5d25b1b864a5d9ecde9c0b479161. Jun 21 04:40:35.051256 kubelet[2368]: E0621 04:40:35.051174 2368 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.63:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.63:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jun 21 04:40:35.127101 containerd[1593]: time="2025-06-21T04:40:35.126957478Z" level=info msg="StartContainer for \"62154c5468553d75d4720913045322df616dd4929d55e5e545e314a5a98b0519\" returns successfully" Jun 21 04:40:35.127500 containerd[1593]: time="2025-06-21T04:40:35.127470420Z" level=info msg="StartContainer for \"829af6939d7001d57f216555c57f16b75e6c5d25b1b864a5d9ecde9c0b479161\" returns successfully" Jun 21 04:40:35.130192 containerd[1593]: time="2025-06-21T04:40:35.130161789Z" level=info msg="StartContainer for \"8d5300cc7e0d161b7d3cd989944755d389305fec4ff75109fe61204311c07ecf\" returns successfully" Jun 21 04:40:35.134128 kubelet[2368]: E0621 04:40:35.134091 2368 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.63:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.63:6443: connect: connection refused" interval="1.6s" Jun 21 04:40:35.286697 kubelet[2368]: E0621 
04:40:35.286560 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:35.291022 kubelet[2368]: E0621 04:40:35.290990 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:35.293165 kubelet[2368]: E0621 04:40:35.293133 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:35.830244 kubelet[2368]: I0621 04:40:35.830163 2368 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jun 21 04:40:36.297293 kubelet[2368]: E0621 04:40:36.296939 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:36.299677 kubelet[2368]: E0621 04:40:36.299459 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:37.076094 kubelet[2368]: E0621 04:40:37.076043 2368 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Jun 21 04:40:37.236174 kubelet[2368]: I0621 04:40:37.235974 2368 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Jun 21 04:40:37.236174 kubelet[2368]: E0621 04:40:37.236039 2368 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Jun 21 04:40:37.299747 kubelet[2368]: E0621 04:40:37.298365 2368 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Jun 21 04:40:37.334867 kubelet[2368]: E0621 04:40:37.334739 2368 
kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:37.435771 kubelet[2368]: E0621 04:40:37.435699 2368 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:37.536677 kubelet[2368]: E0621 04:40:37.536618 2368 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found" Jun 21 04:40:37.629663 kubelet[2368]: I0621 04:40:37.629590 2368 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:37.637311 kubelet[2368]: E0621 04:40:37.637266 2368 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:37.637311 kubelet[2368]: I0621 04:40:37.637300 2368 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:37.639674 kubelet[2368]: E0621 04:40:37.639572 2368 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:37.639674 kubelet[2368]: I0621 04:40:37.639606 2368 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Jun 21 04:40:37.641497 kubelet[2368]: E0621 04:40:37.641470 2368 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Jun 21 04:40:37.718921 kubelet[2368]: I0621 04:40:37.718835 2368 apiserver.go:52] "Watching apiserver" Jun 21 04:40:37.729712 kubelet[2368]: I0621 04:40:37.729664 2368 desired_state_of_world_populator.go:158] "Finished 
populating initial desired state of world" Jun 21 04:40:39.572942 systemd[1]: Reload requested from client PID 2655 ('systemctl') (unit session-7.scope)... Jun 21 04:40:39.572960 systemd[1]: Reloading... Jun 21 04:40:39.677882 zram_generator::config[2701]: No configuration found. Jun 21 04:40:39.780978 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jun 21 04:40:39.917864 systemd[1]: Reloading finished in 344 ms. Jun 21 04:40:39.955926 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jun 21 04:40:39.975735 systemd[1]: kubelet.service: Deactivated successfully. Jun 21 04:40:39.976094 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jun 21 04:40:39.976170 systemd[1]: kubelet.service: Consumed 1.163s CPU time, 132.7M memory peak. Jun 21 04:40:39.978459 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jun 21 04:40:40.196940 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jun 21 04:40:40.216327 (kubelet)[2743]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jun 21 04:40:40.330654 kubelet[2743]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jun 21 04:40:40.330654 kubelet[2743]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jun 21 04:40:40.330654 kubelet[2743]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jun 21 04:40:40.331180 kubelet[2743]: I0621 04:40:40.330687 2743 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jun 21 04:40:40.377321 kubelet[2743]: I0621 04:40:40.377256 2743 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jun 21 04:40:40.377321 kubelet[2743]: I0621 04:40:40.377305 2743 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jun 21 04:40:40.377656 kubelet[2743]: I0621 04:40:40.377636 2743 server.go:956] "Client rotation is on, will bootstrap in background" Jun 21 04:40:40.379246 kubelet[2743]: I0621 04:40:40.379217 2743 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jun 21 04:40:40.382121 kubelet[2743]: I0621 04:40:40.382085 2743 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jun 21 04:40:40.386421 kubelet[2743]: I0621 04:40:40.386381 2743 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jun 21 04:40:40.395482 kubelet[2743]: I0621 04:40:40.395447 2743 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jun 21 04:40:40.395788 kubelet[2743]: I0621 04:40:40.395758 2743 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jun 21 04:40:40.395970 kubelet[2743]: I0621 04:40:40.395789 2743 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jun 21 04:40:40.396095 kubelet[2743]: I0621 04:40:40.395978 2743 topology_manager.go:138] "Creating topology manager with none policy" Jun 21 04:40:40.396095 
kubelet[2743]: I0621 04:40:40.395988 2743 container_manager_linux.go:303] "Creating device plugin manager" Jun 21 04:40:40.396095 kubelet[2743]: I0621 04:40:40.396058 2743 state_mem.go:36] "Initialized new in-memory state store" Jun 21 04:40:40.396327 kubelet[2743]: I0621 04:40:40.396314 2743 kubelet.go:480] "Attempting to sync node with API server" Jun 21 04:40:40.396352 kubelet[2743]: I0621 04:40:40.396328 2743 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jun 21 04:40:40.396371 kubelet[2743]: I0621 04:40:40.396353 2743 kubelet.go:386] "Adding apiserver pod source" Jun 21 04:40:40.396371 kubelet[2743]: I0621 04:40:40.396368 2743 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jun 21 04:40:40.397679 kubelet[2743]: I0621 04:40:40.397627 2743 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jun 21 04:40:40.398432 kubelet[2743]: I0621 04:40:40.398414 2743 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jun 21 04:40:40.402486 kubelet[2743]: I0621 04:40:40.402412 2743 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jun 21 04:40:40.402618 kubelet[2743]: I0621 04:40:40.402508 2743 server.go:1289] "Started kubelet" Jun 21 04:40:40.403502 kubelet[2743]: I0621 04:40:40.403450 2743 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jun 21 04:40:40.404150 kubelet[2743]: I0621 04:40:40.404116 2743 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jun 21 04:40:40.404261 kubelet[2743]: I0621 04:40:40.404216 2743 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jun 21 04:40:40.405002 kubelet[2743]: I0621 04:40:40.404975 2743 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jun 21 04:40:40.406642 
kubelet[2743]: I0621 04:40:40.406189 2743 server.go:317] "Adding debug handlers to kubelet server" Jun 21 04:40:40.411099 kubelet[2743]: I0621 04:40:40.410967 2743 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jun 21 04:40:40.414751 kubelet[2743]: E0621 04:40:40.414661 2743 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jun 21 04:40:40.415790 kubelet[2743]: I0621 04:40:40.414974 2743 volume_manager.go:297] "Starting Kubelet Volume Manager" Jun 21 04:40:40.416294 kubelet[2743]: I0621 04:40:40.416263 2743 reconciler.go:26] "Reconciler: start to sync state" Jun 21 04:40:40.417025 kubelet[2743]: I0621 04:40:40.416985 2743 factory.go:223] Registration of the systemd container factory successfully Jun 21 04:40:40.417201 kubelet[2743]: I0621 04:40:40.417149 2743 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jun 21 04:40:40.417640 kubelet[2743]: I0621 04:40:40.417611 2743 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jun 21 04:40:40.423882 kubelet[2743]: I0621 04:40:40.423839 2743 factory.go:223] Registration of the containerd container factory successfully Jun 21 04:40:40.438288 kubelet[2743]: I0621 04:40:40.438147 2743 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jun 21 04:40:40.440605 kubelet[2743]: I0621 04:40:40.439651 2743 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jun 21 04:40:40.440605 kubelet[2743]: I0621 04:40:40.439682 2743 status_manager.go:230] "Starting to sync pod status with apiserver" Jun 21 04:40:40.440605 kubelet[2743]: I0621 04:40:40.439733 2743 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jun 21 04:40:40.440605 kubelet[2743]: I0621 04:40:40.439742 2743 kubelet.go:2436] "Starting kubelet main sync loop" Jun 21 04:40:40.440605 kubelet[2743]: E0621 04:40:40.439787 2743 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jun 21 04:40:40.463414 kubelet[2743]: I0621 04:40:40.463288 2743 cpu_manager.go:221] "Starting CPU manager" policy="none" Jun 21 04:40:40.463414 kubelet[2743]: I0621 04:40:40.463305 2743 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jun 21 04:40:40.463414 kubelet[2743]: I0621 04:40:40.463322 2743 state_mem.go:36] "Initialized new in-memory state store" Jun 21 04:40:40.463592 kubelet[2743]: I0621 04:40:40.463439 2743 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jun 21 04:40:40.463592 kubelet[2743]: I0621 04:40:40.463448 2743 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jun 21 04:40:40.463592 kubelet[2743]: I0621 04:40:40.463463 2743 policy_none.go:49] "None policy: Start" Jun 21 04:40:40.463592 kubelet[2743]: I0621 04:40:40.463472 2743 memory_manager.go:186] "Starting memorymanager" policy="None" Jun 21 04:40:40.463592 kubelet[2743]: I0621 04:40:40.463481 2743 state_mem.go:35] "Initializing new in-memory state store" Jun 21 04:40:40.463592 kubelet[2743]: I0621 04:40:40.463557 2743 state_mem.go:75] "Updated machine memory state" Jun 21 04:40:40.468974 kubelet[2743]: E0621 04:40:40.468929 2743 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jun 21 04:40:40.469200 kubelet[2743]: I0621 
04:40:40.469174 2743 eviction_manager.go:189] "Eviction manager: starting control loop" Jun 21 04:40:40.469238 kubelet[2743]: I0621 04:40:40.469203 2743 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jun 21 04:40:40.469367 kubelet[2743]: I0621 04:40:40.469348 2743 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jun 21 04:40:40.470352 kubelet[2743]: E0621 04:40:40.470339 2743 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jun 21 04:40:40.540940 kubelet[2743]: I0621 04:40:40.540904 2743 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:40.541341 kubelet[2743]: I0621 04:40:40.541093 2743 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Jun 21 04:40:40.541341 kubelet[2743]: I0621 04:40:40.541222 2743 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:40.574762 kubelet[2743]: I0621 04:40:40.574661 2743 kubelet_node_status.go:75] "Attempting to register node" node="localhost" Jun 21 04:40:40.584269 kubelet[2743]: I0621 04:40:40.584223 2743 kubelet_node_status.go:124] "Node was previously registered" node="localhost" Jun 21 04:40:40.584427 kubelet[2743]: I0621 04:40:40.584327 2743 kubelet_node_status.go:78] "Successfully registered node" node="localhost" Jun 21 04:40:40.717578 kubelet[2743]: I0621 04:40:40.717446 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/17a5c9d02a3e3707afb9d5e98d63f4b7-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"17a5c9d02a3e3707afb9d5e98d63f4b7\") " pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:40.717578 kubelet[2743]: I0621 04:40:40.717484 2743 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/17a5c9d02a3e3707afb9d5e98d63f4b7-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"17a5c9d02a3e3707afb9d5e98d63f4b7\") " pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:40.717578 kubelet[2743]: I0621 04:40:40.717511 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:40.717578 kubelet[2743]: I0621 04:40:40.717528 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:40.717578 kubelet[2743]: I0621 04:40:40.717543 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:40.718218 kubelet[2743]: I0621 04:40:40.717556 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:40.718218 kubelet[2743]: I0621 
04:40:40.717568 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/17a5c9d02a3e3707afb9d5e98d63f4b7-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"17a5c9d02a3e3707afb9d5e98d63f4b7\") " pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:40.718218 kubelet[2743]: I0621 04:40:40.717581 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/84b858ec27c8b2738b1d9ff9927e0dcb-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"84b858ec27c8b2738b1d9ff9927e0dcb\") " pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:40.718218 kubelet[2743]: I0621 04:40:40.717597 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/834ee54f1daa06092e339273649eb5ea-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"834ee54f1daa06092e339273649eb5ea\") " pod="kube-system/kube-scheduler-localhost" Jun 21 04:40:41.397763 kubelet[2743]: I0621 04:40:41.397439 2743 apiserver.go:52] "Watching apiserver" Jun 21 04:40:41.418213 kubelet[2743]: I0621 04:40:41.418135 2743 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jun 21 04:40:41.435696 kubelet[2743]: I0621 04:40:41.435633 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.435415127 podStartE2EDuration="1.435415127s" podCreationTimestamp="2025-06-21 04:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-21 04:40:41.428514093 +0000 UTC m=+1.203898104" watchObservedRunningTime="2025-06-21 04:40:41.435415127 +0000 UTC m=+1.210799138" Jun 21 04:40:41.445368 kubelet[2743]: I0621 
04:40:41.445005 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.44496734 podStartE2EDuration="1.44496734s" podCreationTimestamp="2025-06-21 04:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-21 04:40:41.444417011 +0000 UTC m=+1.219801022" watchObservedRunningTime="2025-06-21 04:40:41.44496734 +0000 UTC m=+1.220351351" Jun 21 04:40:41.445368 kubelet[2743]: I0621 04:40:41.445175 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.445167776 podStartE2EDuration="1.445167776s" podCreationTimestamp="2025-06-21 04:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-21 04:40:41.435871014 +0000 UTC m=+1.211255025" watchObservedRunningTime="2025-06-21 04:40:41.445167776 +0000 UTC m=+1.220551797" Jun 21 04:40:41.456990 kubelet[2743]: I0621 04:40:41.456952 2743 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Jun 21 04:40:41.457111 kubelet[2743]: I0621 04:40:41.457024 2743 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:41.457237 kubelet[2743]: I0621 04:40:41.457209 2743 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:41.464464 kubelet[2743]: E0621 04:40:41.464402 2743 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Jun 21 04:40:41.465000 kubelet[2743]: E0621 04:40:41.464968 2743 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" 
pod="kube-system/kube-controller-manager-localhost" Jun 21 04:40:41.465168 kubelet[2743]: E0621 04:40:41.465144 2743 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Jun 21 04:40:45.402557 kubelet[2743]: I0621 04:40:45.402519 2743 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jun 21 04:40:45.403386 containerd[1593]: time="2025-06-21T04:40:45.403311739Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jun 21 04:40:45.404013 kubelet[2743]: I0621 04:40:45.403565 2743 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jun 21 04:40:46.381761 systemd[1]: Created slice kubepods-besteffort-pod35021b54_18bc_4f67_98c3_517ca39837e9.slice - libcontainer container kubepods-besteffort-pod35021b54_18bc_4f67_98c3_517ca39837e9.slice. Jun 21 04:40:46.446537 kubelet[2743]: I0621 04:40:46.446487 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/35021b54-18bc-4f67-98c3-517ca39837e9-kube-proxy\") pod \"kube-proxy-j7t7g\" (UID: \"35021b54-18bc-4f67-98c3-517ca39837e9\") " pod="kube-system/kube-proxy-j7t7g" Jun 21 04:40:46.446537 kubelet[2743]: I0621 04:40:46.446527 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/35021b54-18bc-4f67-98c3-517ca39837e9-xtables-lock\") pod \"kube-proxy-j7t7g\" (UID: \"35021b54-18bc-4f67-98c3-517ca39837e9\") " pod="kube-system/kube-proxy-j7t7g" Jun 21 04:40:46.446537 kubelet[2743]: I0621 04:40:46.446548 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35021b54-18bc-4f67-98c3-517ca39837e9-lib-modules\") 
pod \"kube-proxy-j7t7g\" (UID: \"35021b54-18bc-4f67-98c3-517ca39837e9\") " pod="kube-system/kube-proxy-j7t7g" Jun 21 04:40:46.447045 kubelet[2743]: I0621 04:40:46.446584 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btc4h\" (UniqueName: \"kubernetes.io/projected/35021b54-18bc-4f67-98c3-517ca39837e9-kube-api-access-btc4h\") pod \"kube-proxy-j7t7g\" (UID: \"35021b54-18bc-4f67-98c3-517ca39837e9\") " pod="kube-system/kube-proxy-j7t7g" Jun 21 04:40:46.647419 kubelet[2743]: I0621 04:40:46.647300 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a67f1927-6a9e-4dad-b2f7-120114a1419f-var-lib-calico\") pod \"tigera-operator-68f7c7984d-wlkjm\" (UID: \"a67f1927-6a9e-4dad-b2f7-120114a1419f\") " pod="tigera-operator/tigera-operator-68f7c7984d-wlkjm" Jun 21 04:40:46.647419 kubelet[2743]: I0621 04:40:46.647339 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb6hn\" (UniqueName: \"kubernetes.io/projected/a67f1927-6a9e-4dad-b2f7-120114a1419f-kube-api-access-lb6hn\") pod \"tigera-operator-68f7c7984d-wlkjm\" (UID: \"a67f1927-6a9e-4dad-b2f7-120114a1419f\") " pod="tigera-operator/tigera-operator-68f7c7984d-wlkjm" Jun 21 04:40:46.657579 systemd[1]: Created slice kubepods-besteffort-poda67f1927_6a9e_4dad_b2f7_120114a1419f.slice - libcontainer container kubepods-besteffort-poda67f1927_6a9e_4dad_b2f7_120114a1419f.slice. 
Jun 21 04:40:46.697077 containerd[1593]: time="2025-06-21T04:40:46.697031874Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-j7t7g,Uid:35021b54-18bc-4f67-98c3-517ca39837e9,Namespace:kube-system,Attempt:0,}" Jun 21 04:40:46.715625 containerd[1593]: time="2025-06-21T04:40:46.715566663Z" level=info msg="connecting to shim 5761d33e5c3df3788df84bf1a5b178198b180f6daa8d0d221691b75d1e66ed8b" address="unix:///run/containerd/s/f2e2a5a109ae1f8cd95ac12e63afb09c4f8ee04cfe9972b2f8e37f4dba07144c" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:40:46.741880 systemd[1]: Started cri-containerd-5761d33e5c3df3788df84bf1a5b178198b180f6daa8d0d221691b75d1e66ed8b.scope - libcontainer container 5761d33e5c3df3788df84bf1a5b178198b180f6daa8d0d221691b75d1e66ed8b. Jun 21 04:40:46.769355 containerd[1593]: time="2025-06-21T04:40:46.769309081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-j7t7g,Uid:35021b54-18bc-4f67-98c3-517ca39837e9,Namespace:kube-system,Attempt:0,} returns sandbox id \"5761d33e5c3df3788df84bf1a5b178198b180f6daa8d0d221691b75d1e66ed8b\"" Jun 21 04:40:46.776678 containerd[1593]: time="2025-06-21T04:40:46.776621690Z" level=info msg="CreateContainer within sandbox \"5761d33e5c3df3788df84bf1a5b178198b180f6daa8d0d221691b75d1e66ed8b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jun 21 04:40:46.787348 containerd[1593]: time="2025-06-21T04:40:46.786396204Z" level=info msg="Container 9e8f7ce6403240bad42a9dd492b907d993efa9ef6a0beab196574d1208de419f: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:40:46.794371 containerd[1593]: time="2025-06-21T04:40:46.794339809Z" level=info msg="CreateContainer within sandbox \"5761d33e5c3df3788df84bf1a5b178198b180f6daa8d0d221691b75d1e66ed8b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9e8f7ce6403240bad42a9dd492b907d993efa9ef6a0beab196574d1208de419f\"" Jun 21 04:40:46.795011 containerd[1593]: time="2025-06-21T04:40:46.794946688Z" level=info 
msg="StartContainer for \"9e8f7ce6403240bad42a9dd492b907d993efa9ef6a0beab196574d1208de419f\"" Jun 21 04:40:46.796312 containerd[1593]: time="2025-06-21T04:40:46.796279044Z" level=info msg="connecting to shim 9e8f7ce6403240bad42a9dd492b907d993efa9ef6a0beab196574d1208de419f" address="unix:///run/containerd/s/f2e2a5a109ae1f8cd95ac12e63afb09c4f8ee04cfe9972b2f8e37f4dba07144c" protocol=ttrpc version=3 Jun 21 04:40:46.826955 systemd[1]: Started cri-containerd-9e8f7ce6403240bad42a9dd492b907d993efa9ef6a0beab196574d1208de419f.scope - libcontainer container 9e8f7ce6403240bad42a9dd492b907d993efa9ef6a0beab196574d1208de419f. Jun 21 04:40:46.867988 containerd[1593]: time="2025-06-21T04:40:46.867945252Z" level=info msg="StartContainer for \"9e8f7ce6403240bad42a9dd492b907d993efa9ef6a0beab196574d1208de419f\" returns successfully" Jun 21 04:40:46.961781 containerd[1593]: time="2025-06-21T04:40:46.961662297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-68f7c7984d-wlkjm,Uid:a67f1927-6a9e-4dad-b2f7-120114a1419f,Namespace:tigera-operator,Attempt:0,}" Jun 21 04:40:46.980871 containerd[1593]: time="2025-06-21T04:40:46.980815386Z" level=info msg="connecting to shim e84362fcf5e41573d97092abba18126d04c7aac23c5e961651200fa46cdbfe5a" address="unix:///run/containerd/s/13a8befb25cd16e46083259687390f83d730d510f4832f7b36bc70b545ba5b72" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:40:47.006996 systemd[1]: Started cri-containerd-e84362fcf5e41573d97092abba18126d04c7aac23c5e961651200fa46cdbfe5a.scope - libcontainer container e84362fcf5e41573d97092abba18126d04c7aac23c5e961651200fa46cdbfe5a. 
Jun 21 04:40:47.055105 containerd[1593]: time="2025-06-21T04:40:47.055058031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-68f7c7984d-wlkjm,Uid:a67f1927-6a9e-4dad-b2f7-120114a1419f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e84362fcf5e41573d97092abba18126d04c7aac23c5e961651200fa46cdbfe5a\"" Jun 21 04:40:47.056631 containerd[1593]: time="2025-06-21T04:40:47.056594162Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.1\"" Jun 21 04:40:48.410513 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2508146200.mount: Deactivated successfully. Jun 21 04:40:48.741213 containerd[1593]: time="2025-06-21T04:40:48.741048979Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:48.741894 containerd[1593]: time="2025-06-21T04:40:48.741862901Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.1: active requests=0, bytes read=25059858" Jun 21 04:40:48.743011 containerd[1593]: time="2025-06-21T04:40:48.742981042Z" level=info msg="ImageCreate event name:\"sha256:9fe1a04a0e6c440395d63018f1a72bb1ed07d81ed81be41e9b8adcc35a64164c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:48.744841 containerd[1593]: time="2025-06-21T04:40:48.744810350Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a2a468d1ac1b6a7049c1c2505cd933461fcadb127b5c3f98f03bd8e402bce456\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:40:48.745373 containerd[1593]: time="2025-06-21T04:40:48.745341673Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.1\" with image id \"sha256:9fe1a04a0e6c440395d63018f1a72bb1ed07d81ed81be41e9b8adcc35a64164c\", repo tag \"quay.io/tigera/operator:v1.38.1\", repo digest \"quay.io/tigera/operator@sha256:a2a468d1ac1b6a7049c1c2505cd933461fcadb127b5c3f98f03bd8e402bce456\", size \"25055853\" in 1.688724317s" Jun 21 04:40:48.745373 
containerd[1593]: time="2025-06-21T04:40:48.745365768Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.1\" returns image reference \"sha256:9fe1a04a0e6c440395d63018f1a72bb1ed07d81ed81be41e9b8adcc35a64164c\"" Jun 21 04:40:48.750128 containerd[1593]: time="2025-06-21T04:40:48.750090414Z" level=info msg="CreateContainer within sandbox \"e84362fcf5e41573d97092abba18126d04c7aac23c5e961651200fa46cdbfe5a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jun 21 04:40:48.759034 containerd[1593]: time="2025-06-21T04:40:48.758998385Z" level=info msg="Container b05e70649b4caece4653bf2a1a2818386374591eb5226db2b6850298959b4c50: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:40:48.762507 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3982712793.mount: Deactivated successfully. Jun 21 04:40:48.764605 containerd[1593]: time="2025-06-21T04:40:48.764564455Z" level=info msg="CreateContainer within sandbox \"e84362fcf5e41573d97092abba18126d04c7aac23c5e961651200fa46cdbfe5a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"b05e70649b4caece4653bf2a1a2818386374591eb5226db2b6850298959b4c50\"" Jun 21 04:40:48.765268 containerd[1593]: time="2025-06-21T04:40:48.764975779Z" level=info msg="StartContainer for \"b05e70649b4caece4653bf2a1a2818386374591eb5226db2b6850298959b4c50\"" Jun 21 04:40:48.765822 containerd[1593]: time="2025-06-21T04:40:48.765768120Z" level=info msg="connecting to shim b05e70649b4caece4653bf2a1a2818386374591eb5226db2b6850298959b4c50" address="unix:///run/containerd/s/13a8befb25cd16e46083259687390f83d730d510f4832f7b36bc70b545ba5b72" protocol=ttrpc version=3 Jun 21 04:40:48.818875 systemd[1]: Started cri-containerd-b05e70649b4caece4653bf2a1a2818386374591eb5226db2b6850298959b4c50.scope - libcontainer container b05e70649b4caece4653bf2a1a2818386374591eb5226db2b6850298959b4c50. 
Jun 21 04:40:48.846702 containerd[1593]: time="2025-06-21T04:40:48.846664115Z" level=info msg="StartContainer for \"b05e70649b4caece4653bf2a1a2818386374591eb5226db2b6850298959b4c50\" returns successfully" Jun 21 04:40:49.481565 kubelet[2743]: I0621 04:40:49.481503 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-j7t7g" podStartSLOduration=3.4814858490000002 podStartE2EDuration="3.481485849s" podCreationTimestamp="2025-06-21 04:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-21 04:40:47.478671413 +0000 UTC m=+7.254055434" watchObservedRunningTime="2025-06-21 04:40:49.481485849 +0000 UTC m=+9.256869860" Jun 21 04:40:50.461399 kubelet[2743]: I0621 04:40:50.461215 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-68f7c7984d-wlkjm" podStartSLOduration=2.771379082 podStartE2EDuration="4.461176625s" podCreationTimestamp="2025-06-21 04:40:46 +0000 UTC" firstStartedPulling="2025-06-21 04:40:47.056248282 +0000 UTC m=+6.831632293" lastFinishedPulling="2025-06-21 04:40:48.746045825 +0000 UTC m=+8.521429836" observedRunningTime="2025-06-21 04:40:49.481778306 +0000 UTC m=+9.257162327" watchObservedRunningTime="2025-06-21 04:40:50.461176625 +0000 UTC m=+10.236560656" Jun 21 04:40:52.478305 kubelet[2743]: E0621 04:40:52.478255 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:40:53.044991 kubelet[2743]: E0621 04:40:53.044951 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:40:53.129162 update_engine[1570]: I20250621 04:40:53.129057 1570 update_attempter.cc:509] Updating boot flags... 
Jun 21 04:40:54.387279 sudo[1803]: pam_unix(sudo:session): session closed for user root Jun 21 04:40:54.389935 sshd[1802]: Connection closed by 10.0.0.1 port 60908 Jun 21 04:40:54.391091 sshd-session[1800]: pam_unix(sshd:session): session closed for user core Jun 21 04:40:54.397060 systemd-logind[1566]: Session 7 logged out. Waiting for processes to exit. Jun 21 04:40:54.397862 systemd[1]: sshd@6-10.0.0.63:22-10.0.0.1:60908.service: Deactivated successfully. Jun 21 04:40:54.400619 systemd[1]: session-7.scope: Deactivated successfully. Jun 21 04:40:54.401067 systemd[1]: session-7.scope: Consumed 5.096s CPU time, 221.5M memory peak. Jun 21 04:40:54.404424 systemd-logind[1566]: Removed session 7. Jun 21 04:40:57.221505 kubelet[2743]: I0621 04:40:57.221176 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e1e41340-aa3d-4fa4-a44a-b93a10a1ded4-typha-certs\") pod \"calico-typha-77bd46cd8c-mcrvb\" (UID: \"e1e41340-aa3d-4fa4-a44a-b93a10a1ded4\") " pod="calico-system/calico-typha-77bd46cd8c-mcrvb" Jun 21 04:40:57.223772 kubelet[2743]: I0621 04:40:57.222182 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltr5w\" (UniqueName: \"kubernetes.io/projected/e1e41340-aa3d-4fa4-a44a-b93a10a1ded4-kube-api-access-ltr5w\") pod \"calico-typha-77bd46cd8c-mcrvb\" (UID: \"e1e41340-aa3d-4fa4-a44a-b93a10a1ded4\") " pod="calico-system/calico-typha-77bd46cd8c-mcrvb" Jun 21 04:40:57.223772 kubelet[2743]: I0621 04:40:57.222227 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1e41340-aa3d-4fa4-a44a-b93a10a1ded4-tigera-ca-bundle\") pod \"calico-typha-77bd46cd8c-mcrvb\" (UID: \"e1e41340-aa3d-4fa4-a44a-b93a10a1ded4\") " pod="calico-system/calico-typha-77bd46cd8c-mcrvb" Jun 21 04:40:57.226025 systemd[1]: Created slice 
kubepods-besteffort-pode1e41340_aa3d_4fa4_a44a_b93a10a1ded4.slice - libcontainer container kubepods-besteffort-pode1e41340_aa3d_4fa4_a44a_b93a10a1ded4.slice. Jun 21 04:40:57.444194 systemd[1]: Created slice kubepods-besteffort-pod03b0e599_b154_4660_ad70_20fd12a05dd1.slice - libcontainer container kubepods-besteffort-pod03b0e599_b154_4660_ad70_20fd12a05dd1.slice. Jun 21 04:40:57.524662 kubelet[2743]: I0621 04:40:57.524514 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/03b0e599-b154-4660-ad70-20fd12a05dd1-node-certs\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524662 kubelet[2743]: I0621 04:40:57.524573 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03b0e599-b154-4660-ad70-20fd12a05dd1-tigera-ca-bundle\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524662 kubelet[2743]: I0621 04:40:57.524591 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-var-run-calico\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524662 kubelet[2743]: I0621 04:40:57.524616 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-lib-modules\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524662 kubelet[2743]: I0621 04:40:57.524637 2743 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lkr4\" (UniqueName: \"kubernetes.io/projected/03b0e599-b154-4660-ad70-20fd12a05dd1-kube-api-access-5lkr4\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524928 kubelet[2743]: I0621 04:40:57.524658 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-policysync\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524928 kubelet[2743]: I0621 04:40:57.524693 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-cni-bin-dir\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524928 kubelet[2743]: I0621 04:40:57.524737 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-cni-log-dir\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524928 kubelet[2743]: I0621 04:40:57.524761 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-cni-net-dir\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.524928 kubelet[2743]: I0621 04:40:57.524811 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-var-lib-calico\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.525066 kubelet[2743]: I0621 04:40:57.524866 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-xtables-lock\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.525066 kubelet[2743]: I0621 04:40:57.524882 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/03b0e599-b154-4660-ad70-20fd12a05dd1-flexvol-driver-host\") pod \"calico-node-n67pk\" (UID: \"03b0e599-b154-4660-ad70-20fd12a05dd1\") " pod="calico-system/calico-node-n67pk" Jun 21 04:40:57.532781 kubelet[2743]: E0621 04:40:57.532701 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:40:57.533387 containerd[1593]: time="2025-06-21T04:40:57.533343626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77bd46cd8c-mcrvb,Uid:e1e41340-aa3d-4fa4-a44a-b93a10a1ded4,Namespace:calico-system,Attempt:0,}" Jun 21 04:40:57.575530 containerd[1593]: time="2025-06-21T04:40:57.575464708Z" level=info msg="connecting to shim f176ab9a2a67852864cef0f189dd0a3593fd2349708516b5f677710930102e55" address="unix:///run/containerd/s/f67a9ea53e2efe760b03a1083d4587942902636f47c2316a7cbb78fe21487148" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:40:57.601872 systemd[1]: Started cri-containerd-f176ab9a2a67852864cef0f189dd0a3593fd2349708516b5f677710930102e55.scope - libcontainer container 
f176ab9a2a67852864cef0f189dd0a3593fd2349708516b5f677710930102e55. Jun 21 04:40:57.627594 kubelet[2743]: E0621 04:40:57.627298 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.627594 kubelet[2743]: W0621 04:40:57.627346 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.627594 kubelet[2743]: E0621 04:40:57.627368 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.635009 kubelet[2743]: E0621 04:40:57.634779 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.635009 kubelet[2743]: W0621 04:40:57.634801 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.635009 kubelet[2743]: E0621 04:40:57.634822 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.637617 kubelet[2743]: E0621 04:40:57.637595 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.637617 kubelet[2743]: W0621 04:40:57.637611 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.637730 kubelet[2743]: E0621 04:40:57.637625 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.652681 containerd[1593]: time="2025-06-21T04:40:57.652632192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77bd46cd8c-mcrvb,Uid:e1e41340-aa3d-4fa4-a44a-b93a10a1ded4,Namespace:calico-system,Attempt:0,} returns sandbox id \"f176ab9a2a67852864cef0f189dd0a3593fd2349708516b5f677710930102e55\"" Jun 21 04:40:57.653683 kubelet[2743]: E0621 04:40:57.653641 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:40:57.655191 containerd[1593]: time="2025-06-21T04:40:57.654906788Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.1\"" Jun 21 04:40:57.736290 kubelet[2743]: E0621 04:40:57.735876 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ldc5m" podUID="8757dac4-0fac-47e3-9805-744183a4690a" Jun 21 04:40:57.747834 containerd[1593]: time="2025-06-21T04:40:57.747172992Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-node-n67pk,Uid:03b0e599-b154-4660-ad70-20fd12a05dd1,Namespace:calico-system,Attempt:0,}" Jun 21 04:40:57.772532 containerd[1593]: time="2025-06-21T04:40:57.772448777Z" level=info msg="connecting to shim 704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4" address="unix:///run/containerd/s/ed01531c7e6bd8c9ea35addd8b10a9d349110d44dbfe56d4b33dcebc7cfbff48" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:40:57.799949 systemd[1]: Started cri-containerd-704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4.scope - libcontainer container 704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4. Jun 21 04:40:57.817576 kubelet[2743]: E0621 04:40:57.817514 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.817576 kubelet[2743]: W0621 04:40:57.817537 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.817576 kubelet[2743]: E0621 04:40:57.817561 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.817837 kubelet[2743]: E0621 04:40:57.817822 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.817837 kubelet[2743]: W0621 04:40:57.817835 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.817898 kubelet[2743]: E0621 04:40:57.817846 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.818094 kubelet[2743]: E0621 04:40:57.818069 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.818094 kubelet[2743]: W0621 04:40:57.818082 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.818094 kubelet[2743]: E0621 04:40:57.818093 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.818399 kubelet[2743]: E0621 04:40:57.818381 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.818399 kubelet[2743]: W0621 04:40:57.818394 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.818525 kubelet[2743]: E0621 04:40:57.818405 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.818679 kubelet[2743]: E0621 04:40:57.818662 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.818858 kubelet[2743]: W0621 04:40:57.818815 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.818858 kubelet[2743]: E0621 04:40:57.818835 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.819107 kubelet[2743]: E0621 04:40:57.819082 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.819107 kubelet[2743]: W0621 04:40:57.819095 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.819184 kubelet[2743]: E0621 04:40:57.819120 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.819360 kubelet[2743]: E0621 04:40:57.819335 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.819360 kubelet[2743]: W0621 04:40:57.819347 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.819360 kubelet[2743]: E0621 04:40:57.819357 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.819571 kubelet[2743]: E0621 04:40:57.819554 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.819571 kubelet[2743]: W0621 04:40:57.819566 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.819643 kubelet[2743]: E0621 04:40:57.819575 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.819827 kubelet[2743]: E0621 04:40:57.819811 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.819827 kubelet[2743]: W0621 04:40:57.819824 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.819905 kubelet[2743]: E0621 04:40:57.819834 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.820060 kubelet[2743]: E0621 04:40:57.820042 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.820060 kubelet[2743]: W0621 04:40:57.820055 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.820141 kubelet[2743]: E0621 04:40:57.820066 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.820310 kubelet[2743]: E0621 04:40:57.820292 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.820310 kubelet[2743]: W0621 04:40:57.820304 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.820387 kubelet[2743]: E0621 04:40:57.820313 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.820540 kubelet[2743]: E0621 04:40:57.820506 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.820540 kubelet[2743]: W0621 04:40:57.820519 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.820540 kubelet[2743]: E0621 04:40:57.820528 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.821323 kubelet[2743]: E0621 04:40:57.820841 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.821323 kubelet[2743]: W0621 04:40:57.820854 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.821323 kubelet[2743]: E0621 04:40:57.820864 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.821323 kubelet[2743]: E0621 04:40:57.821111 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.821323 kubelet[2743]: W0621 04:40:57.821121 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.821323 kubelet[2743]: E0621 04:40:57.821131 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.821768 kubelet[2743]: E0621 04:40:57.821747 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.821768 kubelet[2743]: W0621 04:40:57.821764 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.821846 kubelet[2743]: E0621 04:40:57.821777 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.822491 kubelet[2743]: E0621 04:40:57.822472 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.822491 kubelet[2743]: W0621 04:40:57.822486 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.822575 kubelet[2743]: E0621 04:40:57.822497 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.823032 kubelet[2743]: E0621 04:40:57.822980 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.823032 kubelet[2743]: W0621 04:40:57.822997 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.823032 kubelet[2743]: E0621 04:40:57.823008 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.823856 kubelet[2743]: E0621 04:40:57.823814 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.823856 kubelet[2743]: W0621 04:40:57.823842 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.823944 kubelet[2743]: E0621 04:40:57.823869 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.824139 kubelet[2743]: E0621 04:40:57.824111 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.824139 kubelet[2743]: W0621 04:40:57.824123 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.824139 kubelet[2743]: E0621 04:40:57.824133 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.824916 kubelet[2743]: E0621 04:40:57.824890 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.824916 kubelet[2743]: W0621 04:40:57.824906 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.824916 kubelet[2743]: E0621 04:40:57.824915 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.827120 kubelet[2743]: E0621 04:40:57.827084 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.827120 kubelet[2743]: W0621 04:40:57.827114 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.827209 kubelet[2743]: E0621 04:40:57.827130 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.827209 kubelet[2743]: I0621 04:40:57.827161 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8757dac4-0fac-47e3-9805-744183a4690a-kubelet-dir\") pod \"csi-node-driver-ldc5m\" (UID: \"8757dac4-0fac-47e3-9805-744183a4690a\") " pod="calico-system/csi-node-driver-ldc5m" Jun 21 04:40:57.827518 kubelet[2743]: E0621 04:40:57.827497 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.827518 kubelet[2743]: W0621 04:40:57.827513 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.827595 kubelet[2743]: E0621 04:40:57.827524 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.827595 kubelet[2743]: I0621 04:40:57.827551 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8757dac4-0fac-47e3-9805-744183a4690a-socket-dir\") pod \"csi-node-driver-ldc5m\" (UID: \"8757dac4-0fac-47e3-9805-744183a4690a\") " pod="calico-system/csi-node-driver-ldc5m" Jun 21 04:40:57.827925 kubelet[2743]: E0621 04:40:57.827891 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.827986 kubelet[2743]: W0621 04:40:57.827935 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.827986 kubelet[2743]: E0621 04:40:57.827949 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.828341 kubelet[2743]: E0621 04:40:57.828301 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.828341 kubelet[2743]: W0621 04:40:57.828320 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.828341 kubelet[2743]: E0621 04:40:57.828332 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.828706 kubelet[2743]: E0621 04:40:57.828676 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.828706 kubelet[2743]: W0621 04:40:57.828693 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.828706 kubelet[2743]: E0621 04:40:57.828704 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.828886 kubelet[2743]: I0621 04:40:57.828843 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k245\" (UniqueName: \"kubernetes.io/projected/8757dac4-0fac-47e3-9805-744183a4690a-kube-api-access-7k245\") pod \"csi-node-driver-ldc5m\" (UID: \"8757dac4-0fac-47e3-9805-744183a4690a\") " pod="calico-system/csi-node-driver-ldc5m" Jun 21 04:40:57.829237 kubelet[2743]: E0621 04:40:57.829206 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.829237 kubelet[2743]: W0621 04:40:57.829222 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.829316 kubelet[2743]: E0621 04:40:57.829244 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.829530 kubelet[2743]: E0621 04:40:57.829504 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.829530 kubelet[2743]: W0621 04:40:57.829519 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.829599 kubelet[2743]: E0621 04:40:57.829532 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.829840 kubelet[2743]: E0621 04:40:57.829815 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.829840 kubelet[2743]: W0621 04:40:57.829829 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.829840 kubelet[2743]: E0621 04:40:57.829839 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.830119 kubelet[2743]: E0621 04:40:57.830084 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.830119 kubelet[2743]: W0621 04:40:57.830108 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.830119 kubelet[2743]: E0621 04:40:57.830118 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.830363 kubelet[2743]: E0621 04:40:57.830345 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.830363 kubelet[2743]: W0621 04:40:57.830360 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.830448 kubelet[2743]: E0621 04:40:57.830370 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.830448 kubelet[2743]: I0621 04:40:57.830405 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8757dac4-0fac-47e3-9805-744183a4690a-registration-dir\") pod \"csi-node-driver-ldc5m\" (UID: \"8757dac4-0fac-47e3-9805-744183a4690a\") " pod="calico-system/csi-node-driver-ldc5m" Jun 21 04:40:57.830698 kubelet[2743]: E0621 04:40:57.830678 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.830698 kubelet[2743]: W0621 04:40:57.830692 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.830822 kubelet[2743]: E0621 04:40:57.830703 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.830822 kubelet[2743]: I0621 04:40:57.830768 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8757dac4-0fac-47e3-9805-744183a4690a-varrun\") pod \"csi-node-driver-ldc5m\" (UID: \"8757dac4-0fac-47e3-9805-744183a4690a\") " pod="calico-system/csi-node-driver-ldc5m" Jun 21 04:40:57.831093 kubelet[2743]: E0621 04:40:57.831074 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.831093 kubelet[2743]: W0621 04:40:57.831088 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.831168 kubelet[2743]: E0621 04:40:57.831109 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.832118 kubelet[2743]: E0621 04:40:57.832018 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.832118 kubelet[2743]: W0621 04:40:57.832032 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.832118 kubelet[2743]: E0621 04:40:57.832043 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.832490 kubelet[2743]: E0621 04:40:57.832421 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.832490 kubelet[2743]: W0621 04:40:57.832436 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.832490 kubelet[2743]: E0621 04:40:57.832446 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.832819 kubelet[2743]: E0621 04:40:57.832773 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.832819 kubelet[2743]: W0621 04:40:57.832785 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.832819 kubelet[2743]: E0621 04:40:57.832796 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.834917 containerd[1593]: time="2025-06-21T04:40:57.834858775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-n67pk,Uid:03b0e599-b154-4660-ad70-20fd12a05dd1,Namespace:calico-system,Attempt:0,} returns sandbox id \"704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4\"" Jun 21 04:40:57.931482 kubelet[2743]: E0621 04:40:57.931425 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.931482 kubelet[2743]: W0621 04:40:57.931447 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.931482 kubelet[2743]: E0621 04:40:57.931468 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.931753 kubelet[2743]: E0621 04:40:57.931708 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.931753 kubelet[2743]: W0621 04:40:57.931738 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.931753 kubelet[2743]: E0621 04:40:57.931748 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.932015 kubelet[2743]: E0621 04:40:57.931986 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.932015 kubelet[2743]: W0621 04:40:57.931999 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.932015 kubelet[2743]: E0621 04:40:57.932008 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.932242 kubelet[2743]: E0621 04:40:57.932213 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.932242 kubelet[2743]: W0621 04:40:57.932223 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.932242 kubelet[2743]: E0621 04:40:57.932231 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.932448 kubelet[2743]: E0621 04:40:57.932418 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.932448 kubelet[2743]: W0621 04:40:57.932429 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.932448 kubelet[2743]: E0621 04:40:57.932439 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.932691 kubelet[2743]: E0621 04:40:57.932658 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.932691 kubelet[2743]: W0621 04:40:57.932672 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.932691 kubelet[2743]: E0621 04:40:57.932681 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.932910 kubelet[2743]: E0621 04:40:57.932890 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.932910 kubelet[2743]: W0621 04:40:57.932900 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.932910 kubelet[2743]: E0621 04:40:57.932907 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.933136 kubelet[2743]: E0621 04:40:57.933116 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.933136 kubelet[2743]: W0621 04:40:57.933127 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.933136 kubelet[2743]: E0621 04:40:57.933135 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.933343 kubelet[2743]: E0621 04:40:57.933323 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.933343 kubelet[2743]: W0621 04:40:57.933332 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.933343 kubelet[2743]: E0621 04:40:57.933340 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.934431 kubelet[2743]: E0621 04:40:57.934402 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.934431 kubelet[2743]: W0621 04:40:57.934414 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.934431 kubelet[2743]: E0621 04:40:57.934423 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.934648 kubelet[2743]: E0621 04:40:57.934621 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.934648 kubelet[2743]: W0621 04:40:57.934632 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.934648 kubelet[2743]: E0621 04:40:57.934639 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.934850 kubelet[2743]: E0621 04:40:57.934821 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.934850 kubelet[2743]: W0621 04:40:57.934832 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.934850 kubelet[2743]: E0621 04:40:57.934839 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.935038 kubelet[2743]: E0621 04:40:57.935010 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.935038 kubelet[2743]: W0621 04:40:57.935020 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.935038 kubelet[2743]: E0621 04:40:57.935027 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.935244 kubelet[2743]: E0621 04:40:57.935213 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.935244 kubelet[2743]: W0621 04:40:57.935225 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.935244 kubelet[2743]: E0621 04:40:57.935232 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.935416 kubelet[2743]: E0621 04:40:57.935396 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.935416 kubelet[2743]: W0621 04:40:57.935405 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.935416 kubelet[2743]: E0621 04:40:57.935413 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.935618 kubelet[2743]: E0621 04:40:57.935593 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.935618 kubelet[2743]: W0621 04:40:57.935605 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.935618 kubelet[2743]: E0621 04:40:57.935614 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.935828 kubelet[2743]: E0621 04:40:57.935808 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.935828 kubelet[2743]: W0621 04:40:57.935818 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.935828 kubelet[2743]: E0621 04:40:57.935825 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.936108 kubelet[2743]: E0621 04:40:57.936076 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.936108 kubelet[2743]: W0621 04:40:57.936090 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.936108 kubelet[2743]: E0621 04:40:57.936107 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.936293 kubelet[2743]: E0621 04:40:57.936276 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.936293 kubelet[2743]: W0621 04:40:57.936286 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.936293 kubelet[2743]: E0621 04:40:57.936295 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.936551 kubelet[2743]: E0621 04:40:57.936531 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.936551 kubelet[2743]: W0621 04:40:57.936543 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.936551 kubelet[2743]: E0621 04:40:57.936552 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.936745 kubelet[2743]: E0621 04:40:57.936729 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.936745 kubelet[2743]: W0621 04:40:57.936740 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.936815 kubelet[2743]: E0621 04:40:57.936750 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.936940 kubelet[2743]: E0621 04:40:57.936922 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.936940 kubelet[2743]: W0621 04:40:57.936933 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.936940 kubelet[2743]: E0621 04:40:57.936942 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.937181 kubelet[2743]: E0621 04:40:57.937155 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.937181 kubelet[2743]: W0621 04:40:57.937171 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.937181 kubelet[2743]: E0621 04:40:57.937179 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.937417 kubelet[2743]: E0621 04:40:57.937398 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.937417 kubelet[2743]: W0621 04:40:57.937410 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.937417 kubelet[2743]: E0621 04:40:57.937417 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:40:57.937596 kubelet[2743]: E0621 04:40:57.937570 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.937596 kubelet[2743]: W0621 04:40:57.937581 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.937596 kubelet[2743]: E0621 04:40:57.937588 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:57.943618 kubelet[2743]: E0621 04:40:57.943588 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:40:57.943618 kubelet[2743]: W0621 04:40:57.943604 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:40:57.943618 kubelet[2743]: E0621 04:40:57.943614 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:40:59.122109 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount161832658.mount: Deactivated successfully. 
Jun 21 04:40:59.440389 kubelet[2743]: E0621 04:40:59.440260 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ldc5m" podUID="8757dac4-0fac-47e3-9805-744183a4690a" Jun 21 04:41:00.349364 containerd[1593]: time="2025-06-21T04:41:00.349288842Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:00.350344 containerd[1593]: time="2025-06-21T04:41:00.350247303Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.1: active requests=0, bytes read=35227888" Jun 21 04:41:00.351615 containerd[1593]: time="2025-06-21T04:41:00.351574651Z" level=info msg="ImageCreate event name:\"sha256:11d920cd1d8c935bdf3cb40dd9e67f22c3624df627bdd58cf6d0e503230688d7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:00.353599 containerd[1593]: time="2025-06-21T04:41:00.353561196Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f1edaa4eaa6349a958c409e0dab2d6ee7d1234e5f0eeefc9f508d0b1c9d7d0d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:00.354238 containerd[1593]: time="2025-06-21T04:41:00.354185426Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.1\" with image id \"sha256:11d920cd1d8c935bdf3cb40dd9e67f22c3624df627bdd58cf6d0e503230688d7\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f1edaa4eaa6349a958c409e0dab2d6ee7d1234e5f0eeefc9f508d0b1c9d7d0d1\", size \"35227742\" in 2.699229817s" Jun 21 04:41:00.354238 containerd[1593]: time="2025-06-21T04:41:00.354218609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.1\" returns image reference 
\"sha256:11d920cd1d8c935bdf3cb40dd9e67f22c3624df627bdd58cf6d0e503230688d7\"" Jun 21 04:41:00.356433 containerd[1593]: time="2025-06-21T04:41:00.355907421Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.1\"" Jun 21 04:41:00.366377 containerd[1593]: time="2025-06-21T04:41:00.366334778Z" level=info msg="CreateContainer within sandbox \"f176ab9a2a67852864cef0f189dd0a3593fd2349708516b5f677710930102e55\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jun 21 04:41:00.374893 containerd[1593]: time="2025-06-21T04:41:00.374844732Z" level=info msg="Container c62f61073c6d4993d84983a475d57b82d8917ad7354f83b36244c68d66b5e1cb: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:00.382454 containerd[1593]: time="2025-06-21T04:41:00.382422744Z" level=info msg="CreateContainer within sandbox \"f176ab9a2a67852864cef0f189dd0a3593fd2349708516b5f677710930102e55\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c62f61073c6d4993d84983a475d57b82d8917ad7354f83b36244c68d66b5e1cb\"" Jun 21 04:41:00.382832 containerd[1593]: time="2025-06-21T04:41:00.382788846Z" level=info msg="StartContainer for \"c62f61073c6d4993d84983a475d57b82d8917ad7354f83b36244c68d66b5e1cb\"" Jun 21 04:41:00.383699 containerd[1593]: time="2025-06-21T04:41:00.383661715Z" level=info msg="connecting to shim c62f61073c6d4993d84983a475d57b82d8917ad7354f83b36244c68d66b5e1cb" address="unix:///run/containerd/s/f67a9ea53e2efe760b03a1083d4587942902636f47c2316a7cbb78fe21487148" protocol=ttrpc version=3 Jun 21 04:41:00.402995 systemd[1]: Started cri-containerd-c62f61073c6d4993d84983a475d57b82d8917ad7354f83b36244c68d66b5e1cb.scope - libcontainer container c62f61073c6d4993d84983a475d57b82d8917ad7354f83b36244c68d66b5e1cb. 
Jun 21 04:41:00.455746 containerd[1593]: time="2025-06-21T04:41:00.455551885Z" level=info msg="StartContainer for \"c62f61073c6d4993d84983a475d57b82d8917ad7354f83b36244c68d66b5e1cb\" returns successfully" Jun 21 04:41:00.503192 kubelet[2743]: E0621 04:41:00.503144 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:00.520041 kubelet[2743]: I0621 04:41:00.519679 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-77bd46cd8c-mcrvb" podStartSLOduration=0.818629247 podStartE2EDuration="3.519662172s" podCreationTimestamp="2025-06-21 04:40:57 +0000 UTC" firstStartedPulling="2025-06-21 04:40:57.65445897 +0000 UTC m=+17.429842981" lastFinishedPulling="2025-06-21 04:41:00.355491895 +0000 UTC m=+20.130875906" observedRunningTime="2025-06-21 04:41:00.51914771 +0000 UTC m=+20.294531721" watchObservedRunningTime="2025-06-21 04:41:00.519662172 +0000 UTC m=+20.295046183" Jun 21 04:41:00.543636 kubelet[2743]: E0621 04:41:00.543510 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.543636 kubelet[2743]: W0621 04:41:00.543540 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.543636 kubelet[2743]: E0621 04:41:00.543564 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.544004 kubelet[2743]: E0621 04:41:00.543808 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.544004 kubelet[2743]: W0621 04:41:00.543815 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.544004 kubelet[2743]: E0621 04:41:00.543823 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.544004 kubelet[2743]: E0621 04:41:00.543977 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.544004 kubelet[2743]: W0621 04:41:00.543983 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.544004 kubelet[2743]: E0621 04:41:00.543991 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.544551 kubelet[2743]: E0621 04:41:00.544530 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.544551 kubelet[2743]: W0621 04:41:00.544543 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.544551 kubelet[2743]: E0621 04:41:00.544550 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.545554 kubelet[2743]: E0621 04:41:00.544780 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.545554 kubelet[2743]: W0621 04:41:00.544790 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.545554 kubelet[2743]: E0621 04:41:00.544797 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.545554 kubelet[2743]: E0621 04:41:00.544944 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.545554 kubelet[2743]: W0621 04:41:00.544951 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.545554 kubelet[2743]: E0621 04:41:00.544959 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.545554 kubelet[2743]: E0621 04:41:00.545106 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.545554 kubelet[2743]: W0621 04:41:00.545112 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.545554 kubelet[2743]: E0621 04:41:00.545119 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.545554 kubelet[2743]: E0621 04:41:00.545257 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.545826 kubelet[2743]: W0621 04:41:00.545264 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.545826 kubelet[2743]: E0621 04:41:00.545270 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.545826 kubelet[2743]: E0621 04:41:00.545413 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.545826 kubelet[2743]: W0621 04:41:00.545420 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.545826 kubelet[2743]: E0621 04:41:00.545427 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.545826 kubelet[2743]: E0621 04:41:00.545570 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.545826 kubelet[2743]: W0621 04:41:00.545577 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.545826 kubelet[2743]: E0621 04:41:00.545584 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.545826 kubelet[2743]: E0621 04:41:00.545736 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.545826 kubelet[2743]: W0621 04:41:00.545742 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.546038 kubelet[2743]: E0621 04:41:00.545749 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.546038 kubelet[2743]: E0621 04:41:00.545886 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.546038 kubelet[2743]: W0621 04:41:00.545892 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.546038 kubelet[2743]: E0621 04:41:00.545899 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.546164 kubelet[2743]: E0621 04:41:00.546077 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.546164 kubelet[2743]: W0621 04:41:00.546086 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.546164 kubelet[2743]: E0621 04:41:00.546095 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.546754 kubelet[2743]: E0621 04:41:00.546319 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.546754 kubelet[2743]: W0621 04:41:00.546329 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.546754 kubelet[2743]: E0621 04:41:00.546341 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.546850 kubelet[2743]: E0621 04:41:00.546837 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.546850 kubelet[2743]: W0621 04:41:00.546845 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.546891 kubelet[2743]: E0621 04:41:00.546854 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.552499 kubelet[2743]: E0621 04:41:00.552467 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.552499 kubelet[2743]: W0621 04:41:00.552494 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.552646 kubelet[2743]: E0621 04:41:00.552513 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.552764 kubelet[2743]: E0621 04:41:00.552751 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.552764 kubelet[2743]: W0621 04:41:00.552762 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.552830 kubelet[2743]: E0621 04:41:00.552773 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.553261 kubelet[2743]: E0621 04:41:00.553247 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.553261 kubelet[2743]: W0621 04:41:00.553259 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.553337 kubelet[2743]: E0621 04:41:00.553269 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.553528 kubelet[2743]: E0621 04:41:00.553516 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.553528 kubelet[2743]: W0621 04:41:00.553527 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.553595 kubelet[2743]: E0621 04:41:00.553535 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.553772 kubelet[2743]: E0621 04:41:00.553760 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.553772 kubelet[2743]: W0621 04:41:00.553770 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.553851 kubelet[2743]: E0621 04:41:00.553780 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.553993 kubelet[2743]: E0621 04:41:00.553981 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.553993 kubelet[2743]: W0621 04:41:00.553991 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.554051 kubelet[2743]: E0621 04:41:00.554000 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.554198 kubelet[2743]: E0621 04:41:00.554186 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.554198 kubelet[2743]: W0621 04:41:00.554197 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.554259 kubelet[2743]: E0621 04:41:00.554205 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.554423 kubelet[2743]: E0621 04:41:00.554408 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.554477 kubelet[2743]: W0621 04:41:00.554427 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.554477 kubelet[2743]: E0621 04:41:00.554436 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.555171 kubelet[2743]: E0621 04:41:00.555150 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.555226 kubelet[2743]: W0621 04:41:00.555173 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.555226 kubelet[2743]: E0621 04:41:00.555183 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.555486 kubelet[2743]: E0621 04:41:00.555462 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.555486 kubelet[2743]: W0621 04:41:00.555474 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.555486 kubelet[2743]: E0621 04:41:00.555482 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.555791 kubelet[2743]: E0621 04:41:00.555775 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.555791 kubelet[2743]: W0621 04:41:00.555788 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.555922 kubelet[2743]: E0621 04:41:00.555797 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.558794 kubelet[2743]: E0621 04:41:00.558649 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.558794 kubelet[2743]: W0621 04:41:00.558703 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.558794 kubelet[2743]: E0621 04:41:00.558778 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.561861 kubelet[2743]: E0621 04:41:00.561819 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.561861 kubelet[2743]: W0621 04:41:00.561840 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.561861 kubelet[2743]: E0621 04:41:00.561858 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.562883 kubelet[2743]: E0621 04:41:00.562317 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.562883 kubelet[2743]: W0621 04:41:00.562325 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.562883 kubelet[2743]: E0621 04:41:00.562334 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.562883 kubelet[2743]: E0621 04:41:00.562792 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.562883 kubelet[2743]: W0621 04:41:00.562799 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.562883 kubelet[2743]: E0621 04:41:00.562807 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.564632 kubelet[2743]: E0621 04:41:00.564604 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.564632 kubelet[2743]: W0621 04:41:00.564621 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.564843 kubelet[2743]: E0621 04:41:00.564635 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:00.565275 kubelet[2743]: E0621 04:41:00.565226 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.565275 kubelet[2743]: W0621 04:41:00.565240 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.565275 kubelet[2743]: E0621 04:41:00.565253 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:00.565912 kubelet[2743]: E0621 04:41:00.565891 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:00.565912 kubelet[2743]: W0621 04:41:00.565907 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:00.566028 kubelet[2743]: E0621 04:41:00.565920 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.440671 kubelet[2743]: E0621 04:41:01.440614 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ldc5m" podUID="8757dac4-0fac-47e3-9805-744183a4690a" Jun 21 04:41:01.504986 kubelet[2743]: E0621 04:41:01.504954 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:01.556069 kubelet[2743]: E0621 04:41:01.555814 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.556069 kubelet[2743]: W0621 04:41:01.555845 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.556069 kubelet[2743]: E0621 04:41:01.555873 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.556432 kubelet[2743]: E0621 04:41:01.556403 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.556432 kubelet[2743]: W0621 04:41:01.556418 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.556432 kubelet[2743]: E0621 04:41:01.556431 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.556701 kubelet[2743]: E0621 04:41:01.556675 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.556852 kubelet[2743]: W0621 04:41:01.556703 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.556852 kubelet[2743]: E0621 04:41:01.556735 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.557077 kubelet[2743]: E0621 04:41:01.557025 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.557077 kubelet[2743]: W0621 04:41:01.557039 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.557077 kubelet[2743]: E0621 04:41:01.557049 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.557291 kubelet[2743]: E0621 04:41:01.557276 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.557291 kubelet[2743]: W0621 04:41:01.557286 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.557368 kubelet[2743]: E0621 04:41:01.557294 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.557470 kubelet[2743]: E0621 04:41:01.557437 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.557470 kubelet[2743]: W0621 04:41:01.557449 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.557470 kubelet[2743]: E0621 04:41:01.557458 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.557697 kubelet[2743]: E0621 04:41:01.557664 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.557697 kubelet[2743]: W0621 04:41:01.557682 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.557814 kubelet[2743]: E0621 04:41:01.557704 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.557976 kubelet[2743]: E0621 04:41:01.557946 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.557976 kubelet[2743]: W0621 04:41:01.557956 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.557976 kubelet[2743]: E0621 04:41:01.557964 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.558190 kubelet[2743]: E0621 04:41:01.558160 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.558190 kubelet[2743]: W0621 04:41:01.558172 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.558190 kubelet[2743]: E0621 04:41:01.558179 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.558347 kubelet[2743]: E0621 04:41:01.558326 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.558347 kubelet[2743]: W0621 04:41:01.558337 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.558347 kubelet[2743]: E0621 04:41:01.558344 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.558512 kubelet[2743]: E0621 04:41:01.558492 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.558512 kubelet[2743]: W0621 04:41:01.558501 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.558512 kubelet[2743]: E0621 04:41:01.558508 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.558673 kubelet[2743]: E0621 04:41:01.558653 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.558673 kubelet[2743]: W0621 04:41:01.558663 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.558673 kubelet[2743]: E0621 04:41:01.558670 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.558872 kubelet[2743]: E0621 04:41:01.558853 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.558872 kubelet[2743]: W0621 04:41:01.558865 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.558872 kubelet[2743]: E0621 04:41:01.558874 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.559063 kubelet[2743]: E0621 04:41:01.559029 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.559063 kubelet[2743]: W0621 04:41:01.559043 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.559063 kubelet[2743]: E0621 04:41:01.559060 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.559250 kubelet[2743]: E0621 04:41:01.559224 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.559250 kubelet[2743]: W0621 04:41:01.559240 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.559250 kubelet[2743]: E0621 04:41:01.559250 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.562560 kubelet[2743]: E0621 04:41:01.562519 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.562560 kubelet[2743]: W0621 04:41:01.562532 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.562560 kubelet[2743]: E0621 04:41:01.562541 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.562759 kubelet[2743]: E0621 04:41:01.562737 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.562759 kubelet[2743]: W0621 04:41:01.562746 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.562759 kubelet[2743]: E0621 04:41:01.562754 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.563018 kubelet[2743]: E0621 04:41:01.562990 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.563018 kubelet[2743]: W0621 04:41:01.563005 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.563018 kubelet[2743]: E0621 04:41:01.563014 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.563227 kubelet[2743]: E0621 04:41:01.563205 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.563227 kubelet[2743]: W0621 04:41:01.563215 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.563227 kubelet[2743]: E0621 04:41:01.563223 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.563414 kubelet[2743]: E0621 04:41:01.563391 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.563414 kubelet[2743]: W0621 04:41:01.563402 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.563414 kubelet[2743]: E0621 04:41:01.563408 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.563694 kubelet[2743]: E0621 04:41:01.563677 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.563694 kubelet[2743]: W0621 04:41:01.563689 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.563789 kubelet[2743]: E0621 04:41:01.563697 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.563992 kubelet[2743]: E0621 04:41:01.563971 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.563992 kubelet[2743]: W0621 04:41:01.563986 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.563992 kubelet[2743]: E0621 04:41:01.563995 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.564222 kubelet[2743]: E0621 04:41:01.564205 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.564222 kubelet[2743]: W0621 04:41:01.564219 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.564283 kubelet[2743]: E0621 04:41:01.564227 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.564426 kubelet[2743]: E0621 04:41:01.564412 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.564426 kubelet[2743]: W0621 04:41:01.564422 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.564476 kubelet[2743]: E0621 04:41:01.564428 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.564613 kubelet[2743]: E0621 04:41:01.564600 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.564613 kubelet[2743]: W0621 04:41:01.564609 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.564663 kubelet[2743]: E0621 04:41:01.564616 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.564841 kubelet[2743]: E0621 04:41:01.564824 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.564841 kubelet[2743]: W0621 04:41:01.564836 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.564905 kubelet[2743]: E0621 04:41:01.564846 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.565280 kubelet[2743]: E0621 04:41:01.565249 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.565327 kubelet[2743]: W0621 04:41:01.565280 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.565327 kubelet[2743]: E0621 04:41:01.565307 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.565565 kubelet[2743]: E0621 04:41:01.565549 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.565565 kubelet[2743]: W0621 04:41:01.565563 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.565619 kubelet[2743]: E0621 04:41:01.565573 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.565818 kubelet[2743]: E0621 04:41:01.565796 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.565818 kubelet[2743]: W0621 04:41:01.565807 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.565818 kubelet[2743]: E0621 04:41:01.565816 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.566003 kubelet[2743]: E0621 04:41:01.565992 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.566003 kubelet[2743]: W0621 04:41:01.566000 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.566062 kubelet[2743]: E0621 04:41:01.566008 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.566229 kubelet[2743]: E0621 04:41:01.566217 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.566229 kubelet[2743]: W0621 04:41:01.566226 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.566292 kubelet[2743]: E0621 04:41:01.566233 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:01.566407 kubelet[2743]: E0621 04:41:01.566397 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.566407 kubelet[2743]: W0621 04:41:01.566405 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.566455 kubelet[2743]: E0621 04:41:01.566412 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:01.566687 kubelet[2743]: E0621 04:41:01.566671 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:01.566687 kubelet[2743]: W0621 04:41:01.566684 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:01.566781 kubelet[2743]: E0621 04:41:01.566693 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:02.506663 kubelet[2743]: E0621 04:41:02.506631 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:02.565113 kubelet[2743]: E0621 04:41:02.565083 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.565113 kubelet[2743]: W0621 04:41:02.565105 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.565286 kubelet[2743]: E0621 04:41:02.565128 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:02.565330 kubelet[2743]: E0621 04:41:02.565317 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.565330 kubelet[2743]: W0621 04:41:02.565328 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.565385 kubelet[2743]: E0621 04:41:02.565337 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:02.565515 kubelet[2743]: E0621 04:41:02.565504 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.565560 kubelet[2743]: W0621 04:41:02.565515 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.565560 kubelet[2743]: E0621 04:41:02.565525 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:02.565712 kubelet[2743]: E0621 04:41:02.565698 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.565752 kubelet[2743]: W0621 04:41:02.565710 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.565752 kubelet[2743]: E0621 04:41:02.565739 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:02.565919 kubelet[2743]: E0621 04:41:02.565905 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.565919 kubelet[2743]: W0621 04:41:02.565915 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.565980 kubelet[2743]: E0621 04:41:02.565924 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:02.566166 kubelet[2743]: E0621 04:41:02.566149 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.566166 kubelet[2743]: W0621 04:41:02.566161 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.566249 kubelet[2743]: E0621 04:41:02.566176 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:02.566399 kubelet[2743]: E0621 04:41:02.566374 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.566399 kubelet[2743]: W0621 04:41:02.566387 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.566399 kubelet[2743]: E0621 04:41:02.566397 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:02.566576 kubelet[2743]: E0621 04:41:02.566560 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.566607 kubelet[2743]: W0621 04:41:02.566577 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.566607 kubelet[2743]: E0621 04:41:02.566585 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jun 21 04:41:02.566798 kubelet[2743]: E0621 04:41:02.566782 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jun 21 04:41:02.566798 kubelet[2743]: W0621 04:41:02.566793 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jun 21 04:41:02.566893 kubelet[2743]: E0621 04:41:02.566803 2743 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jun 21 04:41:02.718507 containerd[1593]: time="2025-06-21T04:41:02.718444244Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:02.719367 containerd[1593]: time="2025-06-21T04:41:02.719340065Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.1: active requests=0, bytes read=4441627" Jun 21 04:41:02.720409 containerd[1593]: time="2025-06-21T04:41:02.720375040Z" level=info msg="ImageCreate event name:\"sha256:2eb0d46821080fd806e1b7f8ca42889800fcb3f0af912b6fbb09a13b21454d48\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:02.722267 containerd[1593]: time="2025-06-21T04:41:02.722228439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:b9246fe925ee5b8a5c7dfe1d1c3c29063cbfd512663088b135a015828c20401e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:02.722827 containerd[1593]: time="2025-06-21T04:41:02.722786052Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.1\" with image id \"sha256:2eb0d46821080fd806e1b7f8ca42889800fcb3f0af912b6fbb09a13b21454d48\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:b9246fe925ee5b8a5c7dfe1d1c3c29063cbfd512663088b135a015828c20401e\", size \"5934290\" in 2.36684072s" Jun 21 04:41:02.722827 containerd[1593]: time="2025-06-21T04:41:02.722813845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.1\" returns image reference \"sha256:2eb0d46821080fd806e1b7f8ca42889800fcb3f0af912b6fbb09a13b21454d48\"" Jun 21 04:41:02.726989 containerd[1593]: time="2025-06-21T04:41:02.726949644Z" level=info msg="CreateContainer within sandbox \"704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jun 21 04:41:02.737977 containerd[1593]: time="2025-06-21T04:41:02.737946901Z" level=info msg="Container 78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:02.749300 containerd[1593]: time="2025-06-21T04:41:02.749255965Z" level=info msg="CreateContainer within sandbox \"704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771\"" Jun 21 04:41:02.749858 containerd[1593]: time="2025-06-21T04:41:02.749734248Z" level=info msg="StartContainer for \"78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771\"" Jun 21 04:41:02.751174 containerd[1593]: time="2025-06-21T04:41:02.751149170Z" level=info msg="connecting to shim 78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771" address="unix:///run/containerd/s/ed01531c7e6bd8c9ea35addd8b10a9d349110d44dbfe56d4b33dcebc7cfbff48" protocol=ttrpc version=3 Jun 21 04:41:02.785874 systemd[1]: Started cri-containerd-78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771.scope - libcontainer container 78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771. Jun 21 04:41:02.852465 systemd[1]: cri-containerd-78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771.scope: Deactivated successfully. Jun 21 04:41:02.852865 systemd[1]: cri-containerd-78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771.scope: Consumed 39ms CPU time, 6.4M memory peak, 4.6M written to disk. 
Jun 21 04:41:02.855074 containerd[1593]: time="2025-06-21T04:41:02.855030919Z" level=info msg="TaskExit event in podsandbox handler container_id:\"78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771\" id:\"78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771\" pid:3526 exited_at:{seconds:1750480862 nanos:854344973}" Jun 21 04:41:03.105753 containerd[1593]: time="2025-06-21T04:41:03.105594283Z" level=info msg="received exit event container_id:\"78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771\" id:\"78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771\" pid:3526 exited_at:{seconds:1750480862 nanos:854344973}" Jun 21 04:41:03.116144 containerd[1593]: time="2025-06-21T04:41:03.116104542Z" level=info msg="StartContainer for \"78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771\" returns successfully" Jun 21 04:41:03.131248 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-78008227bbf729460c706c57e6ff03483672853e5d80f6c80fb4c72ee9a48771-rootfs.mount: Deactivated successfully. 
Jun 21 04:41:03.440919 kubelet[2743]: E0621 04:41:03.440771 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ldc5m" podUID="8757dac4-0fac-47e3-9805-744183a4690a" Jun 21 04:41:03.510973 containerd[1593]: time="2025-06-21T04:41:03.510934703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.1\"" Jun 21 04:41:05.441053 kubelet[2743]: E0621 04:41:05.440975 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ldc5m" podUID="8757dac4-0fac-47e3-9805-744183a4690a" Jun 21 04:41:07.441088 kubelet[2743]: E0621 04:41:07.441014 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ldc5m" podUID="8757dac4-0fac-47e3-9805-744183a4690a" Jun 21 04:41:07.530974 containerd[1593]: time="2025-06-21T04:41:07.530923061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:07.531664 containerd[1593]: time="2025-06-21T04:41:07.531629853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.1: active requests=0, bytes read=70405879" Jun 21 04:41:07.532805 containerd[1593]: time="2025-06-21T04:41:07.532775362Z" level=info msg="ImageCreate event name:\"sha256:0d2cd976ff6ee711927e02b1c2ba0b532275ff85d5dc05fc413cc660d5bec68e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:07.534735 containerd[1593]: 
time="2025-06-21T04:41:07.534695361Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:930b33311eec7523e36d95977281681d74d33efff937302b26516b2bc03a5fe9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:07.535266 containerd[1593]: time="2025-06-21T04:41:07.535239066Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.1\" with image id \"sha256:0d2cd976ff6ee711927e02b1c2ba0b532275ff85d5dc05fc413cc660d5bec68e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:930b33311eec7523e36d95977281681d74d33efff937302b26516b2bc03a5fe9\", size \"71898582\" in 4.024269718s" Jun 21 04:41:07.535266 containerd[1593]: time="2025-06-21T04:41:07.535261388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.1\" returns image reference \"sha256:0d2cd976ff6ee711927e02b1c2ba0b532275ff85d5dc05fc413cc660d5bec68e\"" Jun 21 04:41:07.539906 containerd[1593]: time="2025-06-21T04:41:07.539869023Z" level=info msg="CreateContainer within sandbox \"704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jun 21 04:41:07.549419 containerd[1593]: time="2025-06-21T04:41:07.549380672Z" level=info msg="Container a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:07.558118 containerd[1593]: time="2025-06-21T04:41:07.558080932Z" level=info msg="CreateContainer within sandbox \"704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15\"" Jun 21 04:41:07.558628 containerd[1593]: time="2025-06-21T04:41:07.558586254Z" level=info msg="StartContainer for \"a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15\"" Jun 21 04:41:07.560106 containerd[1593]: time="2025-06-21T04:41:07.560065582Z" 
level=info msg="connecting to shim a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15" address="unix:///run/containerd/s/ed01531c7e6bd8c9ea35addd8b10a9d349110d44dbfe56d4b33dcebc7cfbff48" protocol=ttrpc version=3 Jun 21 04:41:07.579841 systemd[1]: Started cri-containerd-a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15.scope - libcontainer container a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15. Jun 21 04:41:07.619847 containerd[1593]: time="2025-06-21T04:41:07.619806559Z" level=info msg="StartContainer for \"a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15\" returns successfully" Jun 21 04:41:08.890392 containerd[1593]: time="2025-06-21T04:41:08.890321291Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jun 21 04:41:08.893128 systemd[1]: cri-containerd-a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15.scope: Deactivated successfully. Jun 21 04:41:08.893952 systemd[1]: cri-containerd-a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15.scope: Consumed 585ms CPU time, 177.4M memory peak, 1.9M read from disk, 171.2M written to disk. 
Jun 21 04:41:08.894452 containerd[1593]: time="2025-06-21T04:41:08.894318363Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15\" id:\"a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15\" pid:3585 exited_at:{seconds:1750480868 nanos:893922177}" Jun 21 04:41:08.894452 containerd[1593]: time="2025-06-21T04:41:08.894354702Z" level=info msg="received exit event container_id:\"a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15\" id:\"a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15\" pid:3585 exited_at:{seconds:1750480868 nanos:893922177}" Jun 21 04:41:08.912237 kubelet[2743]: I0621 04:41:08.912199 2743 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jun 21 04:41:08.919988 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a95442c61e77edd63d40950878a7753f2078cae3e136da3c2209ffb2165dcf15-rootfs.mount: Deactivated successfully. Jun 21 04:41:09.203200 systemd[1]: Created slice kubepods-besteffort-pode32c81e0_9485_4122_916a_bc8d4204e9f3.slice - libcontainer container kubepods-besteffort-pode32c81e0_9485_4122_916a_bc8d4204e9f3.slice. Jun 21 04:41:09.214829 systemd[1]: Created slice kubepods-burstable-poddecddb52_80c6_4fed_bf16_dbdc189f89be.slice - libcontainer container kubepods-burstable-poddecddb52_80c6_4fed_bf16_dbdc189f89be.slice. 
Jun 21 04:41:09.219234 kubelet[2743]: I0621 04:41:09.219176 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/08ed4df7-8d16-4be6-afa7-a63368b4f265-calico-apiserver-certs\") pod \"calico-apiserver-6db98b7fcf-rhrc4\" (UID: \"08ed4df7-8d16-4be6-afa7-a63368b4f265\") " pod="calico-apiserver/calico-apiserver-6db98b7fcf-rhrc4" Jun 21 04:41:09.219388 kubelet[2743]: I0621 04:41:09.219246 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjbk\" (UniqueName: \"kubernetes.io/projected/308176fb-c3b9-477d-ba55-c124940f5841-kube-api-access-kzjbk\") pod \"goldmane-5bd85449d4-89ct8\" (UID: \"308176fb-c3b9-477d-ba55-c124940f5841\") " pod="calico-system/goldmane-5bd85449d4-89ct8" Jun 21 04:41:09.219388 kubelet[2743]: I0621 04:41:09.219270 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e319519b-3d7c-436c-a209-e61e1b845712-tigera-ca-bundle\") pod \"calico-kube-controllers-c7b7955d6-gf7gk\" (UID: \"e319519b-3d7c-436c-a209-e61e1b845712\") " pod="calico-system/calico-kube-controllers-c7b7955d6-gf7gk" Jun 21 04:41:09.219388 kubelet[2743]: I0621 04:41:09.219293 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-backend-key-pair\") pod \"whisker-5fbdf476b5-56qh2\" (UID: \"e32c81e0-9485-4122-916a-bc8d4204e9f3\") " pod="calico-system/whisker-5fbdf476b5-56qh2" Jun 21 04:41:09.219388 kubelet[2743]: I0621 04:41:09.219313 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw5hw\" (UniqueName: \"kubernetes.io/projected/decddb52-80c6-4fed-bf16-dbdc189f89be-kube-api-access-hw5hw\") pod 
\"coredns-674b8bbfcf-5x9q2\" (UID: \"decddb52-80c6-4fed-bf16-dbdc189f89be\") " pod="kube-system/coredns-674b8bbfcf-5x9q2" Jun 21 04:41:09.219388 kubelet[2743]: I0621 04:41:09.219335 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/308176fb-c3b9-477d-ba55-c124940f5841-goldmane-key-pair\") pod \"goldmane-5bd85449d4-89ct8\" (UID: \"308176fb-c3b9-477d-ba55-c124940f5841\") " pod="calico-system/goldmane-5bd85449d4-89ct8" Jun 21 04:41:09.219507 kubelet[2743]: I0621 04:41:09.219360 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/308176fb-c3b9-477d-ba55-c124940f5841-goldmane-ca-bundle\") pod \"goldmane-5bd85449d4-89ct8\" (UID: \"308176fb-c3b9-477d-ba55-c124940f5841\") " pod="calico-system/goldmane-5bd85449d4-89ct8" Jun 21 04:41:09.219507 kubelet[2743]: I0621 04:41:09.219388 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ff624f6-6dd1-44df-847d-619cdf482fbf-config-volume\") pod \"coredns-674b8bbfcf-lchxl\" (UID: \"4ff624f6-6dd1-44df-847d-619cdf482fbf\") " pod="kube-system/coredns-674b8bbfcf-lchxl" Jun 21 04:41:09.219507 kubelet[2743]: I0621 04:41:09.219422 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-ca-bundle\") pod \"whisker-5fbdf476b5-56qh2\" (UID: \"e32c81e0-9485-4122-916a-bc8d4204e9f3\") " pod="calico-system/whisker-5fbdf476b5-56qh2" Jun 21 04:41:09.219507 kubelet[2743]: I0621 04:41:09.219441 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsxw9\" (UniqueName: 
\"kubernetes.io/projected/e32c81e0-9485-4122-916a-bc8d4204e9f3-kube-api-access-hsxw9\") pod \"whisker-5fbdf476b5-56qh2\" (UID: \"e32c81e0-9485-4122-916a-bc8d4204e9f3\") " pod="calico-system/whisker-5fbdf476b5-56qh2" Jun 21 04:41:09.219507 kubelet[2743]: I0621 04:41:09.219460 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57bl4\" (UniqueName: \"kubernetes.io/projected/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-kube-api-access-57bl4\") pod \"calico-apiserver-6ddd6f5999-lwdv6\" (UID: \"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50\") " pod="calico-apiserver/calico-apiserver-6ddd6f5999-lwdv6" Jun 21 04:41:09.219624 kubelet[2743]: I0621 04:41:09.219477 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/decddb52-80c6-4fed-bf16-dbdc189f89be-config-volume\") pod \"coredns-674b8bbfcf-5x9q2\" (UID: \"decddb52-80c6-4fed-bf16-dbdc189f89be\") " pod="kube-system/coredns-674b8bbfcf-5x9q2" Jun 21 04:41:09.219624 kubelet[2743]: I0621 04:41:09.219505 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/308176fb-c3b9-477d-ba55-c124940f5841-config\") pod \"goldmane-5bd85449d4-89ct8\" (UID: \"308176fb-c3b9-477d-ba55-c124940f5841\") " pod="calico-system/goldmane-5bd85449d4-89ct8" Jun 21 04:41:09.219624 kubelet[2743]: I0621 04:41:09.219533 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n4vb\" (UniqueName: \"kubernetes.io/projected/4ff624f6-6dd1-44df-847d-619cdf482fbf-kube-api-access-9n4vb\") pod \"coredns-674b8bbfcf-lchxl\" (UID: \"4ff624f6-6dd1-44df-847d-619cdf482fbf\") " pod="kube-system/coredns-674b8bbfcf-lchxl" Jun 21 04:41:09.219624 kubelet[2743]: I0621 04:41:09.219554 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-29qt9\" (UniqueName: \"kubernetes.io/projected/08ed4df7-8d16-4be6-afa7-a63368b4f265-kube-api-access-29qt9\") pod \"calico-apiserver-6db98b7fcf-rhrc4\" (UID: \"08ed4df7-8d16-4be6-afa7-a63368b4f265\") " pod="calico-apiserver/calico-apiserver-6db98b7fcf-rhrc4" Jun 21 04:41:09.219624 kubelet[2743]: I0621 04:41:09.219573 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-calico-apiserver-certs\") pod \"calico-apiserver-6ddd6f5999-lwdv6\" (UID: \"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50\") " pod="calico-apiserver/calico-apiserver-6ddd6f5999-lwdv6" Jun 21 04:41:09.220192 kubelet[2743]: I0621 04:41:09.219593 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjrlr\" (UniqueName: \"kubernetes.io/projected/e319519b-3d7c-436c-a209-e61e1b845712-kube-api-access-qjrlr\") pod \"calico-kube-controllers-c7b7955d6-gf7gk\" (UID: \"e319519b-3d7c-436c-a209-e61e1b845712\") " pod="calico-system/calico-kube-controllers-c7b7955d6-gf7gk" Jun 21 04:41:09.222149 systemd[1]: Created slice kubepods-besteffort-pod08ed4df7_8d16_4be6_afa7_a63368b4f265.slice - libcontainer container kubepods-besteffort-pod08ed4df7_8d16_4be6_afa7_a63368b4f265.slice. Jun 21 04:41:09.229422 systemd[1]: Created slice kubepods-besteffort-pod308176fb_c3b9_477d_ba55_c124940f5841.slice - libcontainer container kubepods-besteffort-pod308176fb_c3b9_477d_ba55_c124940f5841.slice. Jun 21 04:41:09.235696 systemd[1]: Created slice kubepods-besteffort-pode319519b_3d7c_436c_a209_e61e1b845712.slice - libcontainer container kubepods-besteffort-pode319519b_3d7c_436c_a209_e61e1b845712.slice. Jun 21 04:41:09.241349 systemd[1]: Created slice kubepods-besteffort-podba7afd7e_002b_4c26_9fd9_a17eb66b7f50.slice - libcontainer container kubepods-besteffort-podba7afd7e_002b_4c26_9fd9_a17eb66b7f50.slice. 
Jun 21 04:41:09.246010 systemd[1]: Created slice kubepods-besteffort-pod82964b29_a003_4f56_b879_9024b11edde1.slice - libcontainer container kubepods-besteffort-pod82964b29_a003_4f56_b879_9024b11edde1.slice. Jun 21 04:41:09.251825 systemd[1]: Created slice kubepods-burstable-pod4ff624f6_6dd1_44df_847d_619cdf482fbf.slice - libcontainer container kubepods-burstable-pod4ff624f6_6dd1_44df_847d_619cdf482fbf.slice. Jun 21 04:41:09.320232 kubelet[2743]: I0621 04:41:09.320172 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/82964b29-a003-4f56-b879-9024b11edde1-calico-apiserver-certs\") pod \"calico-apiserver-6ddd6f5999-tkdl5\" (UID: \"82964b29-a003-4f56-b879-9024b11edde1\") " pod="calico-apiserver/calico-apiserver-6ddd6f5999-tkdl5" Jun 21 04:41:09.320232 kubelet[2743]: I0621 04:41:09.320233 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69m64\" (UniqueName: \"kubernetes.io/projected/82964b29-a003-4f56-b879-9024b11edde1-kube-api-access-69m64\") pod \"calico-apiserver-6ddd6f5999-tkdl5\" (UID: \"82964b29-a003-4f56-b879-9024b11edde1\") " pod="calico-apiserver/calico-apiserver-6ddd6f5999-tkdl5" Jun 21 04:41:09.446447 systemd[1]: Created slice kubepods-besteffort-pod8757dac4_0fac_47e3_9805_744183a4690a.slice - libcontainer container kubepods-besteffort-pod8757dac4_0fac_47e3_9805_744183a4690a.slice. 
Jun 21 04:41:09.448816 containerd[1593]: time="2025-06-21T04:41:09.448780590Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ldc5m,Uid:8757dac4-0fac-47e3-9805-744183a4690a,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:09.508015 containerd[1593]: time="2025-06-21T04:41:09.507870085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fbdf476b5-56qh2,Uid:e32c81e0-9485-4122-916a-bc8d4204e9f3,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:09.521911 kubelet[2743]: E0621 04:41:09.521666 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:09.523402 containerd[1593]: time="2025-06-21T04:41:09.523317927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5x9q2,Uid:decddb52-80c6-4fed-bf16-dbdc189f89be,Namespace:kube-system,Attempt:0,}" Jun 21 04:41:09.527913 containerd[1593]: time="2025-06-21T04:41:09.526993891Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6db98b7fcf-rhrc4,Uid:08ed4df7-8d16-4be6-afa7-a63368b4f265,Namespace:calico-apiserver,Attempt:0,}" Jun 21 04:41:09.532608 containerd[1593]: time="2025-06-21T04:41:09.532562192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5bd85449d4-89ct8,Uid:308176fb-c3b9-477d-ba55-c124940f5841,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:09.542492 containerd[1593]: time="2025-06-21T04:41:09.542444318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c7b7955d6-gf7gk,Uid:e319519b-3d7c-436c-a209-e61e1b845712,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:09.543796 containerd[1593]: time="2025-06-21T04:41:09.542819054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.1\"" Jun 21 04:41:09.545831 containerd[1593]: time="2025-06-21T04:41:09.545793167Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-lwdv6,Uid:ba7afd7e-002b-4c26-9fd9-a17eb66b7f50,Namespace:calico-apiserver,Attempt:0,}" Jun 21 04:41:09.553201 containerd[1593]: time="2025-06-21T04:41:09.553163140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-tkdl5,Uid:82964b29-a003-4f56-b879-9024b11edde1,Namespace:calico-apiserver,Attempt:0,}" Jun 21 04:41:09.555048 kubelet[2743]: E0621 04:41:09.554840 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:09.556044 containerd[1593]: time="2025-06-21T04:41:09.555999473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-lchxl,Uid:4ff624f6-6dd1-44df-847d-619cdf482fbf,Namespace:kube-system,Attempt:0,}" Jun 21 04:41:09.627035 containerd[1593]: time="2025-06-21T04:41:09.626983807Z" level=error msg="Failed to destroy network for sandbox \"79f855e529cd10e54bde0ea05cd35f0b57fd26dbe90e7bc0ea1bb1e43d59a3f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.637782 containerd[1593]: time="2025-06-21T04:41:09.637358021Z" level=error msg="Failed to destroy network for sandbox \"211f4f3740e5fd691ef4502dea787935c97d61fd004aee6aaa21ba05d48f5dc4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.678185 containerd[1593]: time="2025-06-21T04:41:09.677963355Z" level=error msg="Failed to destroy network for sandbox \"a8b7fb80d876973cdbc5f3d4945ac3b516cb0ea6e28305f2c0407b59e66d2fe7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Jun 21 04:41:09.687804 containerd[1593]: time="2025-06-21T04:41:09.687617371Z" level=error msg="Failed to destroy network for sandbox \"c14adba3b12e0f09a0453604ac4416588583b1981525c29f31404955e23b8d18\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.693814 containerd[1593]: time="2025-06-21T04:41:09.693678430Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ldc5m,Uid:8757dac4-0fac-47e3-9805-744183a4690a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f855e529cd10e54bde0ea05cd35f0b57fd26dbe90e7bc0ea1bb1e43d59a3f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.694231 containerd[1593]: time="2025-06-21T04:41:09.694039008Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fbdf476b5-56qh2,Uid:e32c81e0-9485-4122-916a-bc8d4204e9f3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"211f4f3740e5fd691ef4502dea787935c97d61fd004aee6aaa21ba05d48f5dc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.706145 kubelet[2743]: E0621 04:41:09.705651 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f855e529cd10e54bde0ea05cd35f0b57fd26dbe90e7bc0ea1bb1e43d59a3f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Jun 21 04:41:09.706145 kubelet[2743]: E0621 04:41:09.705696 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"211f4f3740e5fd691ef4502dea787935c97d61fd004aee6aaa21ba05d48f5dc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.706145 kubelet[2743]: E0621 04:41:09.705856 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"211f4f3740e5fd691ef4502dea787935c97d61fd004aee6aaa21ba05d48f5dc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fbdf476b5-56qh2" Jun 21 04:41:09.706145 kubelet[2743]: E0621 04:41:09.705912 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f855e529cd10e54bde0ea05cd35f0b57fd26dbe90e7bc0ea1bb1e43d59a3f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ldc5m" Jun 21 04:41:09.706419 kubelet[2743]: E0621 04:41:09.705934 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79f855e529cd10e54bde0ea05cd35f0b57fd26dbe90e7bc0ea1bb1e43d59a3f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ldc5m" Jun 21 04:41:09.706419 kubelet[2743]: E0621 
04:41:09.705877 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"211f4f3740e5fd691ef4502dea787935c97d61fd004aee6aaa21ba05d48f5dc4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5fbdf476b5-56qh2" Jun 21 04:41:09.706419 kubelet[2743]: E0621 04:41:09.706087 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ldc5m_calico-system(8757dac4-0fac-47e3-9805-744183a4690a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ldc5m_calico-system(8757dac4-0fac-47e3-9805-744183a4690a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"79f855e529cd10e54bde0ea05cd35f0b57fd26dbe90e7bc0ea1bb1e43d59a3f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ldc5m" podUID="8757dac4-0fac-47e3-9805-744183a4690a" Jun 21 04:41:09.706570 kubelet[2743]: E0621 04:41:09.706240 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5fbdf476b5-56qh2_calico-system(e32c81e0-9485-4122-916a-bc8d4204e9f3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5fbdf476b5-56qh2_calico-system(e32c81e0-9485-4122-916a-bc8d4204e9f3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"211f4f3740e5fd691ef4502dea787935c97d61fd004aee6aaa21ba05d48f5dc4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/whisker-5fbdf476b5-56qh2" podUID="e32c81e0-9485-4122-916a-bc8d4204e9f3" Jun 21 04:41:09.717259 containerd[1593]: time="2025-06-21T04:41:09.717207334Z" level=error msg="Failed to destroy network for sandbox \"06d7c455c54526b4da62c8eeeaebb64c76e78ffa9f5c220b5bcfab7ba8603c5e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.724045 containerd[1593]: time="2025-06-21T04:41:09.723843126Z" level=error msg="Failed to destroy network for sandbox \"d467a18074b3d31f8be77dd677a524757142b1d0c004ce12a9135a5714f1cde1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.724341 containerd[1593]: time="2025-06-21T04:41:09.723908228Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6db98b7fcf-rhrc4,Uid:08ed4df7-8d16-4be6-afa7-a63368b4f265,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b7fb80d876973cdbc5f3d4945ac3b516cb0ea6e28305f2c0407b59e66d2fe7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.724974 kubelet[2743]: E0621 04:41:09.724739 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b7fb80d876973cdbc5f3d4945ac3b516cb0ea6e28305f2c0407b59e66d2fe7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.724974 kubelet[2743]: E0621 04:41:09.724811 2743 kuberuntime_sandbox.go:70] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b7fb80d876973cdbc5f3d4945ac3b516cb0ea6e28305f2c0407b59e66d2fe7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6db98b7fcf-rhrc4" Jun 21 04:41:09.724974 kubelet[2743]: E0621 04:41:09.724835 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a8b7fb80d876973cdbc5f3d4945ac3b516cb0ea6e28305f2c0407b59e66d2fe7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6db98b7fcf-rhrc4" Jun 21 04:41:09.725141 kubelet[2743]: E0621 04:41:09.724920 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6db98b7fcf-rhrc4_calico-apiserver(08ed4df7-8d16-4be6-afa7-a63368b4f265)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6db98b7fcf-rhrc4_calico-apiserver(08ed4df7-8d16-4be6-afa7-a63368b4f265)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a8b7fb80d876973cdbc5f3d4945ac3b516cb0ea6e28305f2c0407b59e66d2fe7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6db98b7fcf-rhrc4" podUID="08ed4df7-8d16-4be6-afa7-a63368b4f265" Jun 21 04:41:09.727134 containerd[1593]: time="2025-06-21T04:41:09.727094871Z" level=error msg="Failed to destroy network for sandbox \"f004e76fbbc739563f73f3222c3f682a1391ad86c13426c74003f5e4c7577148\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.729389 containerd[1593]: time="2025-06-21T04:41:09.729332336Z" level=error msg="Failed to destroy network for sandbox \"c3883d4f803eef13504d50e4e5616349a3404644decfb0dc47da976240b0f3e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.734490 containerd[1593]: time="2025-06-21T04:41:09.734445047Z" level=error msg="Failed to destroy network for sandbox \"c44610e869b6676490a27176a715649754dbb12f59ac599c75e59331afe41787\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.764519 containerd[1593]: time="2025-06-21T04:41:09.764404808Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-tkdl5,Uid:82964b29-a003-4f56-b879-9024b11edde1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c14adba3b12e0f09a0453604ac4416588583b1981525c29f31404955e23b8d18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.764773 kubelet[2743]: E0621 04:41:09.764731 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c14adba3b12e0f09a0453604ac4416588583b1981525c29f31404955e23b8d18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.764845 
kubelet[2743]: E0621 04:41:09.764804 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c14adba3b12e0f09a0453604ac4416588583b1981525c29f31404955e23b8d18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6ddd6f5999-tkdl5" Jun 21 04:41:09.764845 kubelet[2743]: E0621 04:41:09.764828 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c14adba3b12e0f09a0453604ac4416588583b1981525c29f31404955e23b8d18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6ddd6f5999-tkdl5" Jun 21 04:41:09.764940 kubelet[2743]: E0621 04:41:09.764888 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6ddd6f5999-tkdl5_calico-apiserver(82964b29-a003-4f56-b879-9024b11edde1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6ddd6f5999-tkdl5_calico-apiserver(82964b29-a003-4f56-b879-9024b11edde1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c14adba3b12e0f09a0453604ac4416588583b1981525c29f31404955e23b8d18\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6ddd6f5999-tkdl5" podUID="82964b29-a003-4f56-b879-9024b11edde1" Jun 21 04:41:09.805158 containerd[1593]: time="2025-06-21T04:41:09.805103798Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-5x9q2,Uid:decddb52-80c6-4fed-bf16-dbdc189f89be,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"06d7c455c54526b4da62c8eeeaebb64c76e78ffa9f5c220b5bcfab7ba8603c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.805471 kubelet[2743]: E0621 04:41:09.805396 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06d7c455c54526b4da62c8eeeaebb64c76e78ffa9f5c220b5bcfab7ba8603c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.805568 kubelet[2743]: E0621 04:41:09.805474 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06d7c455c54526b4da62c8eeeaebb64c76e78ffa9f5c220b5bcfab7ba8603c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5x9q2" Jun 21 04:41:09.805568 kubelet[2743]: E0621 04:41:09.805495 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"06d7c455c54526b4da62c8eeeaebb64c76e78ffa9f5c220b5bcfab7ba8603c5e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-5x9q2" Jun 21 04:41:09.805568 kubelet[2743]: E0621 04:41:09.805556 2743 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-5x9q2_kube-system(decddb52-80c6-4fed-bf16-dbdc189f89be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-5x9q2_kube-system(decddb52-80c6-4fed-bf16-dbdc189f89be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"06d7c455c54526b4da62c8eeeaebb64c76e78ffa9f5c220b5bcfab7ba8603c5e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-5x9q2" podUID="decddb52-80c6-4fed-bf16-dbdc189f89be" Jun 21 04:41:09.835995 containerd[1593]: time="2025-06-21T04:41:09.835941430Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5bd85449d4-89ct8,Uid:308176fb-c3b9-477d-ba55-c124940f5841,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d467a18074b3d31f8be77dd677a524757142b1d0c004ce12a9135a5714f1cde1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.836286 kubelet[2743]: E0621 04:41:09.836077 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d467a18074b3d31f8be77dd677a524757142b1d0c004ce12a9135a5714f1cde1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.836286 kubelet[2743]: E0621 04:41:09.836106 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d467a18074b3d31f8be77dd677a524757142b1d0c004ce12a9135a5714f1cde1\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5bd85449d4-89ct8" Jun 21 04:41:09.836286 kubelet[2743]: E0621 04:41:09.836121 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d467a18074b3d31f8be77dd677a524757142b1d0c004ce12a9135a5714f1cde1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-5bd85449d4-89ct8" Jun 21 04:41:09.836381 kubelet[2743]: E0621 04:41:09.836156 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-5bd85449d4-89ct8_calico-system(308176fb-c3b9-477d-ba55-c124940f5841)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-5bd85449d4-89ct8_calico-system(308176fb-c3b9-477d-ba55-c124940f5841)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d467a18074b3d31f8be77dd677a524757142b1d0c004ce12a9135a5714f1cde1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-5bd85449d4-89ct8" podUID="308176fb-c3b9-477d-ba55-c124940f5841" Jun 21 04:41:09.869451 containerd[1593]: time="2025-06-21T04:41:09.869373863Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-lchxl,Uid:4ff624f6-6dd1-44df-847d-619cdf482fbf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f004e76fbbc739563f73f3222c3f682a1391ad86c13426c74003f5e4c7577148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.870181 kubelet[2743]: E0621 04:41:09.869704 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f004e76fbbc739563f73f3222c3f682a1391ad86c13426c74003f5e4c7577148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.870181 kubelet[2743]: E0621 04:41:09.869812 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f004e76fbbc739563f73f3222c3f682a1391ad86c13426c74003f5e4c7577148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-lchxl" Jun 21 04:41:09.870181 kubelet[2743]: E0621 04:41:09.869838 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f004e76fbbc739563f73f3222c3f682a1391ad86c13426c74003f5e4c7577148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-lchxl" Jun 21 04:41:09.870294 kubelet[2743]: E0621 04:41:09.869929 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-lchxl_kube-system(4ff624f6-6dd1-44df-847d-619cdf482fbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-lchxl_kube-system(4ff624f6-6dd1-44df-847d-619cdf482fbf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"f004e76fbbc739563f73f3222c3f682a1391ad86c13426c74003f5e4c7577148\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-lchxl" podUID="4ff624f6-6dd1-44df-847d-619cdf482fbf" Jun 21 04:41:09.871138 containerd[1593]: time="2025-06-21T04:41:09.871074827Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-lwdv6,Uid:ba7afd7e-002b-4c26-9fd9-a17eb66b7f50,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3883d4f803eef13504d50e4e5616349a3404644decfb0dc47da976240b0f3e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.871823 kubelet[2743]: E0621 04:41:09.871672 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3883d4f803eef13504d50e4e5616349a3404644decfb0dc47da976240b0f3e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.871942 kubelet[2743]: E0621 04:41:09.871899 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3883d4f803eef13504d50e4e5616349a3404644decfb0dc47da976240b0f3e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6ddd6f5999-lwdv6" Jun 21 04:41:09.871942 kubelet[2743]: E0621 04:41:09.871936 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for 
pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c3883d4f803eef13504d50e4e5616349a3404644decfb0dc47da976240b0f3e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6ddd6f5999-lwdv6" Jun 21 04:41:09.872113 kubelet[2743]: E0621 04:41:09.872006 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6ddd6f5999-lwdv6_calico-apiserver(ba7afd7e-002b-4c26-9fd9-a17eb66b7f50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6ddd6f5999-lwdv6_calico-apiserver(ba7afd7e-002b-4c26-9fd9-a17eb66b7f50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c3883d4f803eef13504d50e4e5616349a3404644decfb0dc47da976240b0f3e0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6ddd6f5999-lwdv6" podUID="ba7afd7e-002b-4c26-9fd9-a17eb66b7f50" Jun 21 04:41:09.872293 containerd[1593]: time="2025-06-21T04:41:09.872247315Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c7b7955d6-gf7gk,Uid:e319519b-3d7c-436c-a209-e61e1b845712,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c44610e869b6676490a27176a715649754dbb12f59ac599c75e59331afe41787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.872469 kubelet[2743]: E0621 04:41:09.872433 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"c44610e869b6676490a27176a715649754dbb12f59ac599c75e59331afe41787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jun 21 04:41:09.872469 kubelet[2743]: E0621 04:41:09.872466 2743 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c44610e869b6676490a27176a715649754dbb12f59ac599c75e59331afe41787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c7b7955d6-gf7gk" Jun 21 04:41:09.872469 kubelet[2743]: E0621 04:41:09.872479 2743 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c44610e869b6676490a27176a715649754dbb12f59ac599c75e59331afe41787\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c7b7955d6-gf7gk" Jun 21 04:41:09.872677 kubelet[2743]: E0621 04:41:09.872522 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-c7b7955d6-gf7gk_calico-system(e319519b-3d7c-436c-a209-e61e1b845712)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-c7b7955d6-gf7gk_calico-system(e319519b-3d7c-436c-a209-e61e1b845712)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c44610e869b6676490a27176a715649754dbb12f59ac599c75e59331afe41787\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-c7b7955d6-gf7gk" podUID="e319519b-3d7c-436c-a209-e61e1b845712" Jun 21 04:41:18.333136 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2223845895.mount: Deactivated successfully. Jun 21 04:41:19.186697 containerd[1593]: time="2025-06-21T04:41:19.186626605Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:19.187678 containerd[1593]: time="2025-06-21T04:41:19.187647534Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.1: active requests=0, bytes read=156518913" Jun 21 04:41:19.189159 containerd[1593]: time="2025-06-21T04:41:19.189121055Z" level=info msg="ImageCreate event name:\"sha256:9ac26af2ca9c35e475f921a9bcf40c7c0ce106819208883b006e64c489251722\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:19.191561 containerd[1593]: time="2025-06-21T04:41:19.191487443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:8da6d025e5cf2ff5080c801ac8611bedb513e5922500fcc8161d8164e4679597\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:19.191985 containerd[1593]: time="2025-06-21T04:41:19.191946757Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.1\" with image id \"sha256:9ac26af2ca9c35e475f921a9bcf40c7c0ce106819208883b006e64c489251722\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:8da6d025e5cf2ff5080c801ac8611bedb513e5922500fcc8161d8164e4679597\", size \"156518775\" in 9.649097516s" Jun 21 04:41:19.191985 containerd[1593]: time="2025-06-21T04:41:19.191978526Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.1\" returns image reference \"sha256:9ac26af2ca9c35e475f921a9bcf40c7c0ce106819208883b006e64c489251722\"" Jun 21 04:41:19.214958 containerd[1593]: time="2025-06-21T04:41:19.214908629Z" level=info 
msg="CreateContainer within sandbox \"704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jun 21 04:41:19.224567 containerd[1593]: time="2025-06-21T04:41:19.224517444Z" level=info msg="Container 63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:19.237559 containerd[1593]: time="2025-06-21T04:41:19.237503991Z" level=info msg="CreateContainer within sandbox \"704e7548d8ffd1f161df9d3e40314a5eb2af317f05326cc8832814acfa88f6f4\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2\"" Jun 21 04:41:19.238128 containerd[1593]: time="2025-06-21T04:41:19.238099171Z" level=info msg="StartContainer for \"63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2\"" Jun 21 04:41:19.239533 containerd[1593]: time="2025-06-21T04:41:19.239486959Z" level=info msg="connecting to shim 63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2" address="unix:///run/containerd/s/ed01531c7e6bd8c9ea35addd8b10a9d349110d44dbfe56d4b33dcebc7cfbff48" protocol=ttrpc version=3 Jun 21 04:41:19.263854 systemd[1]: Started cri-containerd-63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2.scope - libcontainer container 63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2. Jun 21 04:41:19.343026 containerd[1593]: time="2025-06-21T04:41:19.342979721Z" level=info msg="StartContainer for \"63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2\" returns successfully" Jun 21 04:41:19.418615 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jun 21 04:41:19.419317 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jun 21 04:41:19.577960 kubelet[2743]: I0621 04:41:19.577798 2743 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-ca-bundle\") pod \"e32c81e0-9485-4122-916a-bc8d4204e9f3\" (UID: \"e32c81e0-9485-4122-916a-bc8d4204e9f3\") " Jun 21 04:41:19.577960 kubelet[2743]: I0621 04:41:19.577854 2743 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsxw9\" (UniqueName: \"kubernetes.io/projected/e32c81e0-9485-4122-916a-bc8d4204e9f3-kube-api-access-hsxw9\") pod \"e32c81e0-9485-4122-916a-bc8d4204e9f3\" (UID: \"e32c81e0-9485-4122-916a-bc8d4204e9f3\") " Jun 21 04:41:19.577960 kubelet[2743]: I0621 04:41:19.577890 2743 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-backend-key-pair\") pod \"e32c81e0-9485-4122-916a-bc8d4204e9f3\" (UID: \"e32c81e0-9485-4122-916a-bc8d4204e9f3\") " Jun 21 04:41:19.578978 kubelet[2743]: I0621 04:41:19.578942 2743 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e32c81e0-9485-4122-916a-bc8d4204e9f3" (UID: "e32c81e0-9485-4122-916a-bc8d4204e9f3"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jun 21 04:41:19.583453 kubelet[2743]: I0621 04:41:19.583413 2743 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e32c81e0-9485-4122-916a-bc8d4204e9f3-kube-api-access-hsxw9" (OuterVolumeSpecName: "kube-api-access-hsxw9") pod "e32c81e0-9485-4122-916a-bc8d4204e9f3" (UID: "e32c81e0-9485-4122-916a-bc8d4204e9f3"). InnerVolumeSpecName "kube-api-access-hsxw9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jun 21 04:41:19.583626 kubelet[2743]: I0621 04:41:19.583589 2743 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e32c81e0-9485-4122-916a-bc8d4204e9f3" (UID: "e32c81e0-9485-4122-916a-bc8d4204e9f3"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jun 21 04:41:19.584534 systemd[1]: var-lib-kubelet-pods-e32c81e0\x2d9485\x2d4122\x2d916a\x2dbc8d4204e9f3-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dhsxw9.mount: Deactivated successfully. Jun 21 04:41:19.584657 systemd[1]: var-lib-kubelet-pods-e32c81e0\x2d9485\x2d4122\x2d916a\x2dbc8d4204e9f3-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jun 21 04:41:19.606903 kubelet[2743]: I0621 04:41:19.606387 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-n67pk" podStartSLOduration=1.249911293 podStartE2EDuration="22.606367049s" podCreationTimestamp="2025-06-21 04:40:57 +0000 UTC" firstStartedPulling="2025-06-21 04:40:57.836211295 +0000 UTC m=+17.611595296" lastFinishedPulling="2025-06-21 04:41:19.192667041 +0000 UTC m=+38.968051052" observedRunningTime="2025-06-21 04:41:19.605247975 +0000 UTC m=+39.380632016" watchObservedRunningTime="2025-06-21 04:41:19.606367049 +0000 UTC m=+39.381751070" Jun 21 04:41:19.679190 kubelet[2743]: I0621 04:41:19.679139 2743 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Jun 21 04:41:19.679190 kubelet[2743]: I0621 04:41:19.679176 2743 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e32c81e0-9485-4122-916a-bc8d4204e9f3-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jun 21 04:41:19.679190 kubelet[2743]: I0621 04:41:19.679184 2743 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsxw9\" (UniqueName: \"kubernetes.io/projected/e32c81e0-9485-4122-916a-bc8d4204e9f3-kube-api-access-hsxw9\") on node \"localhost\" DevicePath \"\"" Jun 21 04:41:19.865672 systemd[1]: Removed slice kubepods-besteffort-pode32c81e0_9485_4122_916a_bc8d4204e9f3.slice - libcontainer container kubepods-besteffort-pode32c81e0_9485_4122_916a_bc8d4204e9f3.slice. Jun 21 04:41:19.929932 systemd[1]: Created slice kubepods-besteffort-podc2d897b7_b764_49d5_b3ba_2400c9c37ac4.slice - libcontainer container kubepods-besteffort-podc2d897b7_b764_49d5_b3ba_2400c9c37ac4.slice. Jun 21 04:41:19.981031 kubelet[2743]: I0621 04:41:19.980967 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/c2d897b7-b764-49d5-b3ba-2400c9c37ac4-whisker-backend-key-pair\") pod \"whisker-7447955458-q4xz2\" (UID: \"c2d897b7-b764-49d5-b3ba-2400c9c37ac4\") " pod="calico-system/whisker-7447955458-q4xz2" Jun 21 04:41:19.981031 kubelet[2743]: I0621 04:41:19.981020 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c2d897b7-b764-49d5-b3ba-2400c9c37ac4-whisker-ca-bundle\") pod \"whisker-7447955458-q4xz2\" (UID: \"c2d897b7-b764-49d5-b3ba-2400c9c37ac4\") " pod="calico-system/whisker-7447955458-q4xz2" Jun 21 04:41:19.981224 kubelet[2743]: I0621 04:41:19.981089 2743 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nlqv\" (UniqueName: \"kubernetes.io/projected/c2d897b7-b764-49d5-b3ba-2400c9c37ac4-kube-api-access-4nlqv\") pod \"whisker-7447955458-q4xz2\" (UID: 
\"c2d897b7-b764-49d5-b3ba-2400c9c37ac4\") " pod="calico-system/whisker-7447955458-q4xz2" Jun 21 04:41:20.234988 containerd[1593]: time="2025-06-21T04:41:20.234839130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7447955458-q4xz2,Uid:c2d897b7-b764-49d5-b3ba-2400c9c37ac4,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:20.443224 kubelet[2743]: I0621 04:41:20.443168 2743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e32c81e0-9485-4122-916a-bc8d4204e9f3" path="/var/lib/kubelet/pods/e32c81e0-9485-4122-916a-bc8d4204e9f3/volumes" Jun 21 04:41:20.443994 containerd[1593]: time="2025-06-21T04:41:20.443938655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-lwdv6,Uid:ba7afd7e-002b-4c26-9fd9-a17eb66b7f50,Namespace:calico-apiserver,Attempt:0,}" Jun 21 04:41:20.444153 kubelet[2743]: E0621 04:41:20.443980 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:20.444388 containerd[1593]: time="2025-06-21T04:41:20.444341723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5x9q2,Uid:decddb52-80c6-4fed-bf16-dbdc189f89be,Namespace:kube-system,Attempt:0,}" Jun 21 04:41:20.444430 containerd[1593]: time="2025-06-21T04:41:20.444388100Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5bd85449d4-89ct8,Uid:308176fb-c3b9-477d-ba55-c124940f5841,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:21.062434 systemd-networkd[1495]: cali2c76488dbb7: Link UP Jun 21 04:41:21.064060 systemd-networkd[1495]: cali2c76488dbb7: Gained carrier Jun 21 04:41:21.220392 containerd[1593]: 2025-06-21 04:41:20.317 [INFO][3997] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jun 21 04:41:21.220392 containerd[1593]: 2025-06-21 04:41:20.336 [INFO][3997] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--7447955458--q4xz2-eth0 whisker-7447955458- calico-system c2d897b7-b764-49d5-b3ba-2400c9c37ac4 953 0 2025-06-21 04:41:19 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7447955458 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-7447955458-q4xz2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2c76488dbb7 [] [] }} ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-" Jun 21 04:41:21.220392 containerd[1593]: 2025-06-21 04:41:20.336 [INFO][3997] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-eth0" Jun 21 04:41:21.220392 containerd[1593]: 2025-06-21 04:41:20.407 [INFO][4012] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" HandleID="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Workload="localhost-k8s-whisker--7447955458--q4xz2-eth0" Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.408 [INFO][4012] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" HandleID="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Workload="localhost-k8s-whisker--7447955458--q4xz2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039aef0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-7447955458-q4xz2", 
"timestamp":"2025-06-21 04:41:20.407828301 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.408 [INFO][4012] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.408 [INFO][4012] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.408 [INFO][4012] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.419 [INFO][4012] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" host="localhost" Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.428 [INFO][4012] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.433 [INFO][4012] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.435 [INFO][4012] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.437 [INFO][4012] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.220781 containerd[1593]: 2025-06-21 04:41:20.437 [INFO][4012] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" host="localhost" Jun 21 04:41:21.264425 containerd[1593]: 2025-06-21 04:41:20.439 [INFO][4012] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43 Jun 21 04:41:21.264425 containerd[1593]: 2025-06-21 04:41:20.455 [INFO][4012] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" host="localhost" Jun 21 04:41:21.264425 containerd[1593]: 2025-06-21 04:41:20.587 [INFO][4012] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" host="localhost" Jun 21 04:41:21.264425 containerd[1593]: 2025-06-21 04:41:20.587 [INFO][4012] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" host="localhost" Jun 21 04:41:21.264425 containerd[1593]: 2025-06-21 04:41:20.587 [INFO][4012] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jun 21 04:41:21.264425 containerd[1593]: 2025-06-21 04:41:20.587 [INFO][4012] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" HandleID="k8s-pod-network.c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Workload="localhost-k8s-whisker--7447955458--q4xz2-eth0" Jun 21 04:41:21.264580 containerd[1593]: 2025-06-21 04:41:20.590 [INFO][3997] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7447955458--q4xz2-eth0", GenerateName:"whisker-7447955458-", Namespace:"calico-system", SelfLink:"", UID:"c2d897b7-b764-49d5-b3ba-2400c9c37ac4", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 41, 19, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7447955458", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-7447955458-q4xz2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2c76488dbb7", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.264580 containerd[1593]: 2025-06-21 04:41:20.590 [INFO][3997] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-eth0" Jun 21 04:41:21.264706 containerd[1593]: 2025-06-21 04:41:20.590 [INFO][3997] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2c76488dbb7 ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-eth0" Jun 21 04:41:21.264706 containerd[1593]: 2025-06-21 04:41:21.067 [INFO][3997] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-eth0" Jun 21 04:41:21.264790 containerd[1593]: 2025-06-21 04:41:21.067 [INFO][3997] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--7447955458--q4xz2-eth0", GenerateName:"whisker-7447955458-", Namespace:"calico-system", SelfLink:"", UID:"c2d897b7-b764-49d5-b3ba-2400c9c37ac4", ResourceVersion:"953", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 41, 19, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7447955458", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43", Pod:"whisker-7447955458-q4xz2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2c76488dbb7", MAC:"7a:d3:f5:61:26:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.264870 containerd[1593]: 2025-06-21 04:41:21.215 [INFO][3997] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" Namespace="calico-system" Pod="whisker-7447955458-q4xz2" WorkloadEndpoint="localhost-k8s-whisker--7447955458--q4xz2-eth0" Jun 21 04:41:21.347121 containerd[1593]: time="2025-06-21T04:41:21.347086865Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2\" id:\"9057c5f7151f5984f87931a856a93326e84858055f0b60304f67816b93b32436\" pid:4133 exit_status:1 exited_at:{seconds:1750480881 nanos:346825594}" Jun 21 04:41:21.441705 containerd[1593]: time="2025-06-21T04:41:21.441646474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ldc5m,Uid:8757dac4-0fac-47e3-9805-744183a4690a,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:21.519801 
systemd-networkd[1495]: cali5bdbd24404b: Link UP Jun 21 04:41:21.519990 systemd-networkd[1495]: cali5bdbd24404b: Gained carrier Jun 21 04:41:21.559980 containerd[1593]: time="2025-06-21T04:41:21.559876205Z" level=info msg="connecting to shim c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43" address="unix:///run/containerd/s/c8675cb2c7f9ccfa3d61cf897e99cd6ed18efd523e67dd6ebeb0c5b5d386f2ba" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:21.588929 systemd[1]: Started cri-containerd-c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43.scope - libcontainer container c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43. Jun 21 04:41:21.613577 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:21.617294 containerd[1593]: 2025-06-21 04:41:21.383 [INFO][4181] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0 coredns-674b8bbfcf- kube-system decddb52-80c6-4fed-bf16-dbdc189f89be 875 0 2025-06-21 04:40:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-5x9q2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5bdbd24404b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-" Jun 21 04:41:21.617294 containerd[1593]: 2025-06-21 04:41:21.384 [INFO][4181] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" Jun 21 04:41:21.617294 containerd[1593]: 2025-06-21 04:41:21.444 [INFO][4209] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" HandleID="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Workload="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.445 [INFO][4209] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" HandleID="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Workload="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00050eac0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-5x9q2", "timestamp":"2025-06-21 04:41:21.444856298 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.445 [INFO][4209] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.445 [INFO][4209] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.445 [INFO][4209] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.456 [INFO][4209] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" host="localhost" Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.468 [INFO][4209] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.473 [INFO][4209] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.475 [INFO][4209] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.478 [INFO][4209] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.617445 containerd[1593]: 2025-06-21 04:41:21.478 [INFO][4209] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" host="localhost" Jun 21 04:41:21.617671 containerd[1593]: 2025-06-21 04:41:21.480 [INFO][4209] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb Jun 21 04:41:21.617671 containerd[1593]: 2025-06-21 04:41:21.495 [INFO][4209] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" host="localhost" Jun 21 04:41:21.617671 containerd[1593]: 2025-06-21 04:41:21.506 [INFO][4209] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" host="localhost" Jun 21 04:41:21.617671 containerd[1593]: 2025-06-21 04:41:21.506 [INFO][4209] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" host="localhost" Jun 21 04:41:21.617671 containerd[1593]: 2025-06-21 04:41:21.506 [INFO][4209] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jun 21 04:41:21.617671 containerd[1593]: 2025-06-21 04:41:21.507 [INFO][4209] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" HandleID="k8s-pod-network.beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Workload="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" Jun 21 04:41:21.617813 containerd[1593]: 2025-06-21 04:41:21.514 [INFO][4181] cni-plugin/k8s.go 418: Populated endpoint ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"decddb52-80c6-4fed-bf16-dbdc189f89be", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-5x9q2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5bdbd24404b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.617882 containerd[1593]: 2025-06-21 04:41:21.515 [INFO][4181] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" Jun 21 04:41:21.617882 containerd[1593]: 2025-06-21 04:41:21.515 [INFO][4181] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5bdbd24404b ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" Jun 21 04:41:21.617882 containerd[1593]: 2025-06-21 04:41:21.518 [INFO][4181] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" Jun 21 04:41:21.617949 containerd[1593]: 2025-06-21 04:41:21.518 [INFO][4181] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"decddb52-80c6-4fed-bf16-dbdc189f89be", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb", Pod:"coredns-674b8bbfcf-5x9q2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5bdbd24404b", MAC:"f2:d4:40:af:b5:bc", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.617949 containerd[1593]: 2025-06-21 04:41:21.608 [INFO][4181] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" Namespace="kube-system" Pod="coredns-674b8bbfcf-5x9q2" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--5x9q2-eth0" Jun 21 04:41:21.681755 systemd-networkd[1495]: cali7894cbf81f1: Link UP Jun 21 04:41:21.683352 systemd-networkd[1495]: cali7894cbf81f1: Gained carrier Jun 21 04:41:21.706734 containerd[1593]: time="2025-06-21T04:41:21.706670748Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2\" id:\"ac4ee9938327cb0dc4482b41b47e9967366fa5c8a7b9cdde8a01f83b98d7abaa\" pid:4308 exit_status:1 exited_at:{seconds:1750480881 nanos:706263412}" Jun 21 04:41:21.730997 systemd-networkd[1495]: vxlan.calico: Link UP Jun 21 04:41:21.731511 systemd-networkd[1495]: vxlan.calico: Gained carrier Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.457 [INFO][4195] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0 calico-apiserver-6ddd6f5999- calico-apiserver ba7afd7e-002b-4c26-9fd9-a17eb66b7f50 878 0 2025-06-21 04:40:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6ddd6f5999 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6ddd6f5999-lwdv6 eth0 calico-apiserver [] [] [kns.calico-apiserver 
ksa.calico-apiserver.calico-apiserver] cali7894cbf81f1 [] [] }} ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.457 [INFO][4195] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.499 [INFO][4234] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" HandleID="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.499 [INFO][4234] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" HandleID="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000258fe0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6ddd6f5999-lwdv6", "timestamp":"2025-06-21 04:41:21.499279955 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.499 [INFO][4234] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM 
lock. Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.507 [INFO][4234] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.507 [INFO][4234] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.604 [INFO][4234] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.616 [INFO][4234] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.625 [INFO][4234] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.632 [INFO][4234] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.637 [INFO][4234] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.637 [INFO][4234] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.641 [INFO][4234] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018 Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.650 [INFO][4234] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.663 [INFO][4234] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.664 [INFO][4234] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" host="localhost" Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.664 [INFO][4234] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jun 21 04:41:21.734122 containerd[1593]: 2025-06-21 04:41:21.664 [INFO][4234] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" HandleID="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:41:21.734652 containerd[1593]: 2025-06-21 04:41:21.672 [INFO][4195] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0", GenerateName:"calico-apiserver-6ddd6f5999-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6ddd6f5999", 
"projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6ddd6f5999-lwdv6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7894cbf81f1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.734652 containerd[1593]: 2025-06-21 04:41:21.672 [INFO][4195] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:41:21.734652 containerd[1593]: 2025-06-21 04:41:21.672 [INFO][4195] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7894cbf81f1 ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:41:21.734652 containerd[1593]: 2025-06-21 04:41:21.683 [INFO][4195] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:41:21.734652 containerd[1593]: 2025-06-21 
04:41:21.684 [INFO][4195] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0", GenerateName:"calico-apiserver-6ddd6f5999-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6ddd6f5999", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018", Pod:"calico-apiserver-6ddd6f5999-lwdv6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7894cbf81f1", MAC:"4e:77:78:bb:bf:34", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.734652 containerd[1593]: 2025-06-21 04:41:21.722 [INFO][4195] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-lwdv6" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:41:21.782788 containerd[1593]: time="2025-06-21T04:41:21.782631969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7447955458-q4xz2,Uid:c2d897b7-b764-49d5-b3ba-2400c9c37ac4,Namespace:calico-system,Attempt:0,} returns sandbox id \"c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43\"" Jun 21 04:41:21.787390 containerd[1593]: time="2025-06-21T04:41:21.787347111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.1\"" Jun 21 04:41:21.817743 containerd[1593]: time="2025-06-21T04:41:21.817608533Z" level=info msg="connecting to shim beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb" address="unix:///run/containerd/s/f22a3eef1dd0d7c142fe2fec81e2326100e65fe0872fa2f70962e615ad8f42c9" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:21.830799 systemd-networkd[1495]: calif97c5bb0370: Link UP Jun 21 04:41:21.832028 systemd-networkd[1495]: calif97c5bb0370: Gained carrier Jun 21 04:41:21.853888 systemd[1]: Started cri-containerd-beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb.scope - libcontainer container beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb. 
Jun 21 04:41:21.867674 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.459 [INFO][4214] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--5bd85449d4--89ct8-eth0 goldmane-5bd85449d4- calico-system 308176fb-c3b9-477d-ba55-c124940f5841 880 0 2025-06-21 04:40:56 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:5bd85449d4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-5bd85449d4-89ct8 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calif97c5bb0370 [] [] }} ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.460 [INFO][4214] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.514 [INFO][4240] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" HandleID="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Workload="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.515 [INFO][4240] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" 
HandleID="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Workload="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00012db80), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-5bd85449d4-89ct8", "timestamp":"2025-06-21 04:41:21.514611924 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.515 [INFO][4240] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.664 [INFO][4240] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.664 [INFO][4240] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.683 [INFO][4240] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.723 [INFO][4240] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.740 [INFO][4240] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.742 [INFO][4240] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.745 [INFO][4240] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.745 
[INFO][4240] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.746 [INFO][4240] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.787 [INFO][4240] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.815 [INFO][4240] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.815 [INFO][4240] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" host="localhost" Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.815 [INFO][4240] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jun 21 04:41:21.880495 containerd[1593]: 2025-06-21 04:41:21.815 [INFO][4240] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" HandleID="k8s-pod-network.5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Workload="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" Jun 21 04:41:21.881303 containerd[1593]: 2025-06-21 04:41:21.823 [INFO][4214] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--5bd85449d4--89ct8-eth0", GenerateName:"goldmane-5bd85449d4-", Namespace:"calico-system", SelfLink:"", UID:"308176fb-c3b9-477d-ba55-c124940f5841", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5bd85449d4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-5bd85449d4-89ct8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif97c5bb0370", MAC:"", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.881303 containerd[1593]: 2025-06-21 04:41:21.823 [INFO][4214] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" Jun 21 04:41:21.881303 containerd[1593]: 2025-06-21 04:41:21.823 [INFO][4214] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif97c5bb0370 ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" Jun 21 04:41:21.881303 containerd[1593]: 2025-06-21 04:41:21.833 [INFO][4214] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" Jun 21 04:41:21.881303 containerd[1593]: 2025-06-21 04:41:21.833 [INFO][4214] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--5bd85449d4--89ct8-eth0", GenerateName:"goldmane-5bd85449d4-", Namespace:"calico-system", SelfLink:"", UID:"308176fb-c3b9-477d-ba55-c124940f5841", ResourceVersion:"880", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 56, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"5bd85449d4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab", Pod:"goldmane-5bd85449d4-89ct8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif97c5bb0370", MAC:"aa:36:6b:6c:3a:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:21.881303 containerd[1593]: 2025-06-21 04:41:21.874 [INFO][4214] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" Namespace="calico-system" Pod="goldmane-5bd85449d4-89ct8" WorkloadEndpoint="localhost-k8s-goldmane--5bd85449d4--89ct8-eth0" Jun 21 04:41:21.989906 systemd-networkd[1495]: calic92fe7deb99: Link UP Jun 21 04:41:21.990749 systemd-networkd[1495]: calic92fe7deb99: Gained carrier Jun 21 04:41:21.999421 containerd[1593]: time="2025-06-21T04:41:21.999381403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-5x9q2,Uid:decddb52-80c6-4fed-bf16-dbdc189f89be,Namespace:kube-system,Attempt:0,} returns sandbox id \"beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb\"" Jun 21 04:41:22.000240 kubelet[2743]: E0621 04:41:22.000207 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some 
nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.610 [INFO][4247] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--ldc5m-eth0 csi-node-driver- calico-system 8757dac4-0fac-47e3-9805-744183a4690a 760 0 2025-06-21 04:40:57 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:85b8c9d4df k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-ldc5m eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic92fe7deb99 [] [] }} ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.611 [INFO][4247] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-eth0" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.668 [INFO][4341] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" HandleID="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Workload="localhost-k8s-csi--node--driver--ldc5m-eth0" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.668 [INFO][4341] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" 
HandleID="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Workload="localhost-k8s-csi--node--driver--ldc5m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001393a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-ldc5m", "timestamp":"2025-06-21 04:41:21.668344738 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.668 [INFO][4341] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.815 [INFO][4341] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.816 [INFO][4341] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.872 [INFO][4341] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.881 [INFO][4341] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.887 [INFO][4341] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.889 [INFO][4341] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.891 [INFO][4341] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.891 
[INFO][4341] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.892 [INFO][4341] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821 Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.945 [INFO][4341] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.984 [INFO][4341] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.984 [INFO][4341] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" host="localhost" Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.984 [INFO][4341] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jun 21 04:41:22.053845 containerd[1593]: 2025-06-21 04:41:21.984 [INFO][4341] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" HandleID="k8s-pod-network.5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Workload="localhost-k8s-csi--node--driver--ldc5m-eth0" Jun 21 04:41:22.054428 containerd[1593]: 2025-06-21 04:41:21.987 [INFO][4247] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ldc5m-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8757dac4-0fac-47e3-9805-744183a4690a", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"85b8c9d4df", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-ldc5m", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"calic92fe7deb99", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:22.054428 containerd[1593]: 2025-06-21 04:41:21.987 [INFO][4247] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-eth0" Jun 21 04:41:22.054428 containerd[1593]: 2025-06-21 04:41:21.987 [INFO][4247] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic92fe7deb99 ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-eth0" Jun 21 04:41:22.054428 containerd[1593]: 2025-06-21 04:41:21.990 [INFO][4247] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-eth0" Jun 21 04:41:22.054428 containerd[1593]: 2025-06-21 04:41:21.991 [INFO][4247] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ldc5m-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8757dac4-0fac-47e3-9805-744183a4690a", ResourceVersion:"760", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 57, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"85b8c9d4df", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821", Pod:"csi-node-driver-ldc5m", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic92fe7deb99", MAC:"b6:1d:5c:57:4d:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:22.054428 containerd[1593]: 2025-06-21 04:41:22.050 [INFO][4247] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" Namespace="calico-system" Pod="csi-node-driver-ldc5m" WorkloadEndpoint="localhost-k8s-csi--node--driver--ldc5m-eth0" Jun 21 04:41:22.066910 containerd[1593]: time="2025-06-21T04:41:22.066840318Z" level=info msg="CreateContainer within sandbox \"beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jun 21 04:41:22.192483 containerd[1593]: time="2025-06-21T04:41:22.191809809Z" level=info msg="connecting to shim 1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" 
address="unix:///run/containerd/s/64f8a233a2649cf2594945805faa36c0db1f2f75dee70658da5bcf7cf015648f" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:22.228858 systemd[1]: Started cri-containerd-1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018.scope - libcontainer container 1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018. Jun 21 04:41:22.242435 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:22.404333 containerd[1593]: time="2025-06-21T04:41:22.404223737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-lwdv6,Uid:ba7afd7e-002b-4c26-9fd9-a17eb66b7f50,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\"" Jun 21 04:41:22.442590 containerd[1593]: time="2025-06-21T04:41:22.442461012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c7b7955d6-gf7gk,Uid:e319519b-3d7c-436c-a209-e61e1b845712,Namespace:calico-system,Attempt:0,}" Jun 21 04:41:22.475902 containerd[1593]: time="2025-06-21T04:41:22.475846186Z" level=info msg="Container a75a7c2456c566339243cb11c7de1c4e3465f5ddac33028df4afb3cd0a28ac94: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:22.476378 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2416259351.mount: Deactivated successfully. Jun 21 04:41:22.480307 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1107196513.mount: Deactivated successfully. 
Jun 21 04:41:22.491838 containerd[1593]: time="2025-06-21T04:41:22.491750266Z" level=info msg="connecting to shim 5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab" address="unix:///run/containerd/s/f02175c75e3cead0eea55924a37c49f629393b4e11241d6910638dc6c591aad0" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:22.532886 systemd[1]: Started cri-containerd-5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab.scope - libcontainer container 5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab. Jun 21 04:41:22.544982 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:22.659445 containerd[1593]: time="2025-06-21T04:41:22.659318680Z" level=info msg="connecting to shim 5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821" address="unix:///run/containerd/s/7f1caae9bab80aa415c2def328ecdbdcbef693249a4e9a5f7b3130aea95db900" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:22.688010 systemd[1]: Started cri-containerd-5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821.scope - libcontainer container 5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821. 
Jun 21 04:41:22.702535 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:22.817906 systemd-networkd[1495]: cali2c76488dbb7: Gained IPv6LL Jun 21 04:41:22.909935 containerd[1593]: time="2025-06-21T04:41:22.909884642Z" level=info msg="CreateContainer within sandbox \"beaa005873d688668efef8cf5a72a4984807d217b661248f8c278eb3f8c830fb\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a75a7c2456c566339243cb11c7de1c4e3465f5ddac33028df4afb3cd0a28ac94\"" Jun 21 04:41:22.910076 containerd[1593]: time="2025-06-21T04:41:22.909908166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-5bd85449d4-89ct8,Uid:308176fb-c3b9-477d-ba55-c124940f5841,Namespace:calico-system,Attempt:0,} returns sandbox id \"5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab\"" Jun 21 04:41:22.910559 containerd[1593]: time="2025-06-21T04:41:22.910491532Z" level=info msg="StartContainer for \"a75a7c2456c566339243cb11c7de1c4e3465f5ddac33028df4afb3cd0a28ac94\"" Jun 21 04:41:22.911649 containerd[1593]: time="2025-06-21T04:41:22.911383248Z" level=info msg="connecting to shim a75a7c2456c566339243cb11c7de1c4e3465f5ddac33028df4afb3cd0a28ac94" address="unix:///run/containerd/s/f22a3eef1dd0d7c142fe2fec81e2326100e65fe0872fa2f70962e615ad8f42c9" protocol=ttrpc version=3 Jun 21 04:41:22.936852 systemd[1]: Started cri-containerd-a75a7c2456c566339243cb11c7de1c4e3465f5ddac33028df4afb3cd0a28ac94.scope - libcontainer container a75a7c2456c566339243cb11c7de1c4e3465f5ddac33028df4afb3cd0a28ac94. 
Jun 21 04:41:22.945921 systemd-networkd[1495]: cali5bdbd24404b: Gained IPv6LL Jun 21 04:41:23.055394 containerd[1593]: time="2025-06-21T04:41:23.055283898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ldc5m,Uid:8757dac4-0fac-47e3-9805-744183a4690a,Namespace:calico-system,Attempt:0,} returns sandbox id \"5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821\"" Jun 21 04:41:23.174281 systemd-networkd[1495]: cali24544dba475: Link UP Jun 21 04:41:23.174750 systemd-networkd[1495]: cali24544dba475: Gained carrier Jun 21 04:41:23.213563 containerd[1593]: time="2025-06-21T04:41:23.213478059Z" level=info msg="StartContainer for \"a75a7c2456c566339243cb11c7de1c4e3465f5ddac33028df4afb3cd0a28ac94\" returns successfully" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.676 [INFO][4593] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0 calico-kube-controllers-c7b7955d6- calico-system e319519b-3d7c-436c-a209-e61e1b845712 879 0 2025-06-21 04:40:57 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:c7b7955d6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-c7b7955d6-gf7gk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali24544dba475 [] [] }} ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.676 [INFO][4593] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.707 [INFO][4639] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" HandleID="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Workload="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.707 [INFO][4639] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" HandleID="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Workload="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f750), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-c7b7955d6-gf7gk", "timestamp":"2025-06-21 04:41:22.707166047 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.707 [INFO][4639] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.707 [INFO][4639] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.707 [INFO][4639] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.748 [INFO][4639] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.753 [INFO][4639] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.758 [INFO][4639] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.760 [INFO][4639] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.762 [INFO][4639] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.762 [INFO][4639] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.763 [INFO][4639] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5 Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:22.916 [INFO][4639] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:23.168 [INFO][4639] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:23.168 [INFO][4639] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" host="localhost" Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:23.168 [INFO][4639] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jun 21 04:41:23.224865 containerd[1593]: 2025-06-21 04:41:23.168 [INFO][4639] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" HandleID="k8s-pod-network.bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Workload="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" Jun 21 04:41:23.225472 containerd[1593]: 2025-06-21 04:41:23.172 [INFO][4593] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0", GenerateName:"calico-kube-controllers-c7b7955d6-", Namespace:"calico-system", SelfLink:"", UID:"e319519b-3d7c-436c-a209-e61e1b845712", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c7b7955d6", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-c7b7955d6-gf7gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali24544dba475", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:23.225472 containerd[1593]: 2025-06-21 04:41:23.172 [INFO][4593] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" Jun 21 04:41:23.225472 containerd[1593]: 2025-06-21 04:41:23.172 [INFO][4593] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24544dba475 ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" Jun 21 04:41:23.225472 containerd[1593]: 2025-06-21 04:41:23.175 [INFO][4593] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" Jun 21 04:41:23.225472 containerd[1593]: 2025-06-21 
04:41:23.175 [INFO][4593] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0", GenerateName:"calico-kube-controllers-c7b7955d6-", Namespace:"calico-system", SelfLink:"", UID:"e319519b-3d7c-436c-a209-e61e1b845712", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 57, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c7b7955d6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5", Pod:"calico-kube-controllers-c7b7955d6-gf7gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali24544dba475", MAC:"b2:8c:bb:c0:df:41", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:23.225472 containerd[1593]: 2025-06-21 
04:41:23.217 [INFO][4593] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" Namespace="calico-system" Pod="calico-kube-controllers-c7b7955d6-gf7gk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--c7b7955d6--gf7gk-eth0" Jun 21 04:41:23.253413 containerd[1593]: time="2025-06-21T04:41:23.253345569Z" level=info msg="connecting to shim bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5" address="unix:///run/containerd/s/bbee7799e6da36a08e6b6f5a28beebadf1b6f33a84e14121af017e85ac2eeaa6" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:23.288976 systemd[1]: Started cri-containerd-bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5.scope - libcontainer container bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5. Jun 21 04:41:23.303942 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:23.330390 systemd-networkd[1495]: cali7894cbf81f1: Gained IPv6LL Jun 21 04:41:23.338908 containerd[1593]: time="2025-06-21T04:41:23.338823902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c7b7955d6-gf7gk,Uid:e319519b-3d7c-436c-a209-e61e1b845712,Namespace:calico-system,Attempt:0,} returns sandbox id \"bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5\"" Jun 21 04:41:23.394926 systemd-networkd[1495]: vxlan.calico: Gained IPv6LL Jun 21 04:41:23.413847 systemd[1]: Started sshd@7-10.0.0.63:22-10.0.0.1:39452.service - OpenSSH per-connection server daemon (10.0.0.1:39452). 
Jun 21 04:41:23.441735 kubelet[2743]: E0621 04:41:23.441120 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:23.443815 containerd[1593]: time="2025-06-21T04:41:23.443702191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-lchxl,Uid:4ff624f6-6dd1-44df-847d-619cdf482fbf,Namespace:kube-system,Attempt:0,}" Jun 21 04:41:23.496802 sshd[4755]: Accepted publickey for core from 10.0.0.1 port 39452 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:23.499021 sshd-session[4755]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:23.506816 systemd-logind[1566]: New session 8 of user core. Jun 21 04:41:23.511064 systemd[1]: Started session-8.scope - Session 8 of User core. Jun 21 04:41:23.574107 kubelet[2743]: E0621 04:41:23.574052 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:23.579482 systemd-networkd[1495]: calic688da05d16: Link UP Jun 21 04:41:23.582163 systemd-networkd[1495]: calic688da05d16: Gained carrier Jun 21 04:41:23.601735 kubelet[2743]: I0621 04:41:23.600849 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-5x9q2" podStartSLOduration=37.600116298 podStartE2EDuration="37.600116298s" podCreationTimestamp="2025-06-21 04:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-21 04:41:23.596362174 +0000 UTC m=+43.371746185" watchObservedRunningTime="2025-06-21 04:41:23.600116298 +0000 UTC m=+43.375500309" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.494 [INFO][4756] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--lchxl-eth0 coredns-674b8bbfcf- kube-system 4ff624f6-6dd1-44df-847d-619cdf482fbf 877 0 2025-06-21 04:40:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-lchxl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic688da05d16 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.494 [INFO][4756] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.537 [INFO][4774] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" HandleID="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Workload="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.537 [INFO][4774] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" HandleID="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Workload="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000586e50), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-lchxl", 
"timestamp":"2025-06-21 04:41:23.537471795 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.537 [INFO][4774] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.537 [INFO][4774] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.537 [INFO][4774] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.548 [INFO][4774] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.553 [INFO][4774] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.557 [INFO][4774] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.558 [INFO][4774] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.560 [INFO][4774] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.560 [INFO][4774] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.562 [INFO][4774] ipam/ipam.go 1764: Creating new handle: 
k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806 Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.565 [INFO][4774] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.571 [INFO][4774] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.571 [INFO][4774] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" host="localhost" Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.572 [INFO][4774] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jun 21 04:41:23.605335 containerd[1593]: 2025-06-21 04:41:23.572 [INFO][4774] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" HandleID="k8s-pod-network.928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Workload="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" Jun 21 04:41:23.605881 containerd[1593]: 2025-06-21 04:41:23.575 [INFO][4756] cni-plugin/k8s.go 418: Populated endpoint ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--lchxl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4ff624f6-6dd1-44df-847d-619cdf482fbf", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-lchxl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic688da05d16", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:23.605881 containerd[1593]: 2025-06-21 04:41:23.575 [INFO][4756] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" Jun 21 04:41:23.605881 containerd[1593]: 2025-06-21 04:41:23.575 [INFO][4756] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic688da05d16 ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" Jun 21 04:41:23.605881 containerd[1593]: 2025-06-21 04:41:23.583 [INFO][4756] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" Jun 21 04:41:23.605881 containerd[1593]: 2025-06-21 04:41:23.584 [INFO][4756] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--lchxl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"4ff624f6-6dd1-44df-847d-619cdf482fbf", ResourceVersion:"877", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806", Pod:"coredns-674b8bbfcf-lchxl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic688da05d16", MAC:"06:a6:ce:84:e4:bb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:23.605881 containerd[1593]: 2025-06-21 04:41:23.597 [INFO][4756] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" Namespace="kube-system" Pod="coredns-674b8bbfcf-lchxl" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--lchxl-eth0" Jun 21 04:41:23.656328 containerd[1593]: time="2025-06-21T04:41:23.656291845Z" level=info msg="connecting to shim 928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806" address="unix:///run/containerd/s/8307689f587971e7afda028d1ff3d5f18227fb3bf02a1f9175ff4ab940c9664d" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:23.681411 sshd[4779]: Connection closed by 10.0.0.1 port 39452 Jun 21 04:41:23.682011 sshd-session[4755]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:23.689895 systemd[1]: Started cri-containerd-928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806.scope - libcontainer container 928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806. Jun 21 04:41:23.690472 systemd[1]: sshd@7-10.0.0.63:22-10.0.0.1:39452.service: Deactivated successfully. Jun 21 04:41:23.693302 systemd[1]: session-8.scope: Deactivated successfully. Jun 21 04:41:23.695894 systemd-logind[1566]: Session 8 logged out. Waiting for processes to exit. Jun 21 04:41:23.697666 systemd-logind[1566]: Removed session 8. 
Jun 21 04:41:23.706368 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:23.738846 containerd[1593]: time="2025-06-21T04:41:23.738808321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-lchxl,Uid:4ff624f6-6dd1-44df-847d-619cdf482fbf,Namespace:kube-system,Attempt:0,} returns sandbox id \"928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806\"" Jun 21 04:41:23.739502 kubelet[2743]: E0621 04:41:23.739475 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:23.745557 containerd[1593]: time="2025-06-21T04:41:23.745520184Z" level=info msg="CreateContainer within sandbox \"928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jun 21 04:41:23.752935 containerd[1593]: time="2025-06-21T04:41:23.752894732Z" level=info msg="Container 0039966092c4b414144fc02093e89fdb59ae2eebce8e28bf364c9ec47895b234: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:23.760099 containerd[1593]: time="2025-06-21T04:41:23.760064766Z" level=info msg="CreateContainer within sandbox \"928a2ec37287ea2fdadaf56d4c0558baa0a553f073ed92de1dc2d9434ed5d806\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0039966092c4b414144fc02093e89fdb59ae2eebce8e28bf364c9ec47895b234\"" Jun 21 04:41:23.760554 containerd[1593]: time="2025-06-21T04:41:23.760533498Z" level=info msg="StartContainer for \"0039966092c4b414144fc02093e89fdb59ae2eebce8e28bf364c9ec47895b234\"" Jun 21 04:41:23.761493 containerd[1593]: time="2025-06-21T04:41:23.761468585Z" level=info msg="connecting to shim 0039966092c4b414144fc02093e89fdb59ae2eebce8e28bf364c9ec47895b234" address="unix:///run/containerd/s/8307689f587971e7afda028d1ff3d5f18227fb3bf02a1f9175ff4ab940c9664d" protocol=ttrpc version=3 
Jun 21 04:41:23.777910 systemd-networkd[1495]: calif97c5bb0370: Gained IPv6LL Jun 21 04:41:23.783856 systemd[1]: Started cri-containerd-0039966092c4b414144fc02093e89fdb59ae2eebce8e28bf364c9ec47895b234.scope - libcontainer container 0039966092c4b414144fc02093e89fdb59ae2eebce8e28bf364c9ec47895b234. Jun 21 04:41:23.813602 containerd[1593]: time="2025-06-21T04:41:23.813548023Z" level=info msg="StartContainer for \"0039966092c4b414144fc02093e89fdb59ae2eebce8e28bf364c9ec47895b234\" returns successfully" Jun 21 04:41:23.844021 systemd-networkd[1495]: calic92fe7deb99: Gained IPv6LL Jun 21 04:41:24.073069 containerd[1593]: time="2025-06-21T04:41:24.073010967Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:24.073924 containerd[1593]: time="2025-06-21T04:41:24.073886452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.1: active requests=0, bytes read=4661202" Jun 21 04:41:24.075038 containerd[1593]: time="2025-06-21T04:41:24.075008791Z" level=info msg="ImageCreate event name:\"sha256:f9c2addb6553484a4cf8cf5e38959c95aff70d213991bb2626aab9eb9b0ce51c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:24.076999 containerd[1593]: time="2025-06-21T04:41:24.076964656Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:7f323954f2f741238d256690a674536bf562d4b4bd7cd6bab3c21a0a1327e1fc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:24.077498 containerd[1593]: time="2025-06-21T04:41:24.077467040Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.1\" with image id \"sha256:f9c2addb6553484a4cf8cf5e38959c95aff70d213991bb2626aab9eb9b0ce51c\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:7f323954f2f741238d256690a674536bf562d4b4bd7cd6bab3c21a0a1327e1fc\", size \"6153897\" in 2.290084041s" Jun 21 
04:41:24.077498 containerd[1593]: time="2025-06-21T04:41:24.077491496Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.1\" returns image reference \"sha256:f9c2addb6553484a4cf8cf5e38959c95aff70d213991bb2626aab9eb9b0ce51c\"" Jun 21 04:41:24.078600 containerd[1593]: time="2025-06-21T04:41:24.078382390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.1\"" Jun 21 04:41:24.082295 containerd[1593]: time="2025-06-21T04:41:24.082263743Z" level=info msg="CreateContainer within sandbox \"c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jun 21 04:41:24.090578 containerd[1593]: time="2025-06-21T04:41:24.090534775Z" level=info msg="Container 1cf3df3851ab2272ba0f72001cae63d9e0c966750c1b1b8bb0a7448d974442b7: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:24.101787 containerd[1593]: time="2025-06-21T04:41:24.101651473Z" level=info msg="CreateContainer within sandbox \"c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"1cf3df3851ab2272ba0f72001cae63d9e0c966750c1b1b8bb0a7448d974442b7\"" Jun 21 04:41:24.102356 containerd[1593]: time="2025-06-21T04:41:24.102324197Z" level=info msg="StartContainer for \"1cf3df3851ab2272ba0f72001cae63d9e0c966750c1b1b8bb0a7448d974442b7\"" Jun 21 04:41:24.103942 containerd[1593]: time="2025-06-21T04:41:24.103905760Z" level=info msg="connecting to shim 1cf3df3851ab2272ba0f72001cae63d9e0c966750c1b1b8bb0a7448d974442b7" address="unix:///run/containerd/s/c8675cb2c7f9ccfa3d61cf897e99cd6ed18efd523e67dd6ebeb0c5b5d386f2ba" protocol=ttrpc version=3 Jun 21 04:41:24.130876 systemd[1]: Started cri-containerd-1cf3df3851ab2272ba0f72001cae63d9e0c966750c1b1b8bb0a7448d974442b7.scope - libcontainer container 1cf3df3851ab2272ba0f72001cae63d9e0c966750c1b1b8bb0a7448d974442b7. 
Jun 21 04:41:24.194911 containerd[1593]: time="2025-06-21T04:41:24.194863171Z" level=info msg="StartContainer for \"1cf3df3851ab2272ba0f72001cae63d9e0c966750c1b1b8bb0a7448d974442b7\" returns successfully" Jun 21 04:41:24.441664 containerd[1593]: time="2025-06-21T04:41:24.441538509Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-tkdl5,Uid:82964b29-a003-4f56-b879-9024b11edde1,Namespace:calico-apiserver,Attempt:0,}" Jun 21 04:41:24.441664 containerd[1593]: time="2025-06-21T04:41:24.441563345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6db98b7fcf-rhrc4,Uid:08ed4df7-8d16-4be6-afa7-a63368b4f265,Namespace:calico-apiserver,Attempt:0,}" Jun 21 04:41:24.549114 systemd-networkd[1495]: cali134fad6bef5: Link UP Jun 21 04:41:24.552985 systemd-networkd[1495]: cali134fad6bef5: Gained carrier Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.481 [INFO][4928] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0 calico-apiserver-6ddd6f5999- calico-apiserver 82964b29-a003-4f56-b879-9024b11edde1 881 0 2025-06-21 04:40:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6ddd6f5999 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6ddd6f5999-tkdl5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali134fad6bef5 [] [] }} ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.483 [INFO][4928] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.512 [INFO][4960] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" HandleID="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.513 [INFO][4960] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" HandleID="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00024f060), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6ddd6f5999-tkdl5", "timestamp":"2025-06-21 04:41:24.512932911 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.513 [INFO][4960] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.513 [INFO][4960] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.513 [INFO][4960] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.520 [INFO][4960] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.527 [INFO][4960] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.531 [INFO][4960] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.532 [INFO][4960] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.534 [INFO][4960] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.534 [INFO][4960] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.535 [INFO][4960] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.538 [INFO][4960] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.543 [INFO][4960] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 
handle="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.543 [INFO][4960] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" host="localhost" Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.543 [INFO][4960] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jun 21 04:41:24.568369 containerd[1593]: 2025-06-21 04:41:24.543 [INFO][4960] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" HandleID="k8s-pod-network.490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" Jun 21 04:41:24.569694 containerd[1593]: 2025-06-21 04:41:24.546 [INFO][4928] cni-plugin/k8s.go 418: Populated endpoint ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0", GenerateName:"calico-apiserver-6ddd6f5999-", Namespace:"calico-apiserver", SelfLink:"", UID:"82964b29-a003-4f56-b879-9024b11edde1", ResourceVersion:"881", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6ddd6f5999", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6ddd6f5999-tkdl5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali134fad6bef5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:24.569694 containerd[1593]: 2025-06-21 04:41:24.546 [INFO][4928] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" Jun 21 04:41:24.569694 containerd[1593]: 2025-06-21 04:41:24.546 [INFO][4928] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali134fad6bef5 ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" Jun 21 04:41:24.569694 containerd[1593]: 2025-06-21 04:41:24.552 [INFO][4928] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" Jun 21 04:41:24.569694 containerd[1593]: 2025-06-21 04:41:24.553 [INFO][4928] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0", GenerateName:"calico-apiserver-6ddd6f5999-", Namespace:"calico-apiserver", SelfLink:"", UID:"82964b29-a003-4f56-b879-9024b11edde1", ResourceVersion:"881", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6ddd6f5999", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac", Pod:"calico-apiserver-6ddd6f5999-tkdl5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali134fad6bef5", MAC:"32:70:f1:46:6c:0f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:24.569694 containerd[1593]: 2025-06-21 04:41:24.565 [INFO][4928] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" Namespace="calico-apiserver" Pod="calico-apiserver-6ddd6f5999-tkdl5" WorkloadEndpoint="localhost-k8s-calico--apiserver--6ddd6f5999--tkdl5-eth0" Jun 21 04:41:24.585166 kubelet[2743]: E0621 04:41:24.585136 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:24.587142 kubelet[2743]: E0621 04:41:24.586948 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:24.592477 containerd[1593]: time="2025-06-21T04:41:24.592388510Z" level=info msg="connecting to shim 490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac" address="unix:///run/containerd/s/578b388b135a6c600fa5eb9ae01a485a44bde8f83e35d7dbbe98b781f07e8f85" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:24.609918 kubelet[2743]: I0621 04:41:24.609828 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-lchxl" podStartSLOduration=38.609810697 podStartE2EDuration="38.609810697s" podCreationTimestamp="2025-06-21 04:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-06-21 04:41:24.609437926 +0000 UTC m=+44.384821937" watchObservedRunningTime="2025-06-21 04:41:24.609810697 +0000 UTC m=+44.385194708" Jun 21 04:41:24.627164 systemd[1]: Started cri-containerd-490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac.scope - libcontainer container 490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac. 
Jun 21 04:41:24.655508 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:24.674558 systemd-networkd[1495]: cali891c5551885: Link UP Jun 21 04:41:24.675405 systemd-networkd[1495]: cali891c5551885: Gained carrier Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.481 [INFO][4940] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0 calico-apiserver-6db98b7fcf- calico-apiserver 08ed4df7-8d16-4be6-afa7-a63368b4f265 876 0 2025-06-21 04:40:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6db98b7fcf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-6db98b7fcf-rhrc4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali891c5551885 [] [] }} ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.481 [INFO][4940] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.512 [INFO][4957] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" HandleID="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" 
Workload="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.513 [INFO][4957] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" HandleID="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Workload="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000138e30), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-6db98b7fcf-rhrc4", "timestamp":"2025-06-21 04:41:24.512932571 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.513 [INFO][4957] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.544 [INFO][4957] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.544 [INFO][4957] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.623 [INFO][4957] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.636 [INFO][4957] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.641 [INFO][4957] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.646 [INFO][4957] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.650 [INFO][4957] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.651 [INFO][4957] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.653 [INFO][4957] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.659 [INFO][4957] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.668 [INFO][4957] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.137/26] block=192.168.88.128/26 
handle="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.668 [INFO][4957] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.137/26] handle="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" host="localhost" Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.669 [INFO][4957] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jun 21 04:41:24.690320 containerd[1593]: 2025-06-21 04:41:24.669 [INFO][4957] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.137/26] IPv6=[] ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" HandleID="k8s-pod-network.97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Workload="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" Jun 21 04:41:24.691057 containerd[1593]: 2025-06-21 04:41:24.672 [INFO][4940] cni-plugin/k8s.go 418: Populated endpoint ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0", GenerateName:"calico-apiserver-6db98b7fcf-", Namespace:"calico-apiserver", SelfLink:"", UID:"08ed4df7-8d16-4be6-afa7-a63368b4f265", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6db98b7fcf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-6db98b7fcf-rhrc4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali891c5551885", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:24.691057 containerd[1593]: 2025-06-21 04:41:24.672 [INFO][4940] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.137/32] ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" Jun 21 04:41:24.691057 containerd[1593]: 2025-06-21 04:41:24.672 [INFO][4940] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali891c5551885 ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" Jun 21 04:41:24.691057 containerd[1593]: 2025-06-21 04:41:24.675 [INFO][4940] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" Jun 21 04:41:24.691057 containerd[1593]: 2025-06-21 04:41:24.675 [INFO][4940] cni-plugin/k8s.go 446: Added Mac, interface name, and active 
container ID to endpoint ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0", GenerateName:"calico-apiserver-6db98b7fcf-", Namespace:"calico-apiserver", SelfLink:"", UID:"08ed4df7-8d16-4be6-afa7-a63368b4f265", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.June, 21, 4, 40, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6db98b7fcf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a", Pod:"calico-apiserver-6db98b7fcf-rhrc4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.137/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali891c5551885", MAC:"c6:4e:27:34:41:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jun 21 04:41:24.691057 containerd[1593]: 2025-06-21 04:41:24.686 [INFO][4940] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" Namespace="calico-apiserver" Pod="calico-apiserver-6db98b7fcf-rhrc4" WorkloadEndpoint="localhost-k8s-calico--apiserver--6db98b7fcf--rhrc4-eth0" Jun 21 04:41:24.700999 containerd[1593]: time="2025-06-21T04:41:24.700768007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6ddd6f5999-tkdl5,Uid:82964b29-a003-4f56-b879-9024b11edde1,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac\"" Jun 21 04:41:24.712206 containerd[1593]: time="2025-06-21T04:41:24.712160094Z" level=info msg="connecting to shim 97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a" address="unix:///run/containerd/s/86adc9eea415b3cea9050a1927c502f2c6f9e69616a5c727cb1afbba6bc82af2" namespace=k8s.io protocol=ttrpc version=3 Jun 21 04:41:24.736840 systemd[1]: Started cri-containerd-97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a.scope - libcontainer container 97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a. 
Jun 21 04:41:24.748600 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jun 21 04:41:24.779218 containerd[1593]: time="2025-06-21T04:41:24.779177982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6db98b7fcf-rhrc4,Uid:08ed4df7-8d16-4be6-afa7-a63368b4f265,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a\"" Jun 21 04:41:24.865869 systemd-networkd[1495]: calic688da05d16: Gained IPv6LL Jun 21 04:41:24.929853 systemd-networkd[1495]: cali24544dba475: Gained IPv6LL Jun 21 04:41:25.588801 kubelet[2743]: E0621 04:41:25.588709 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:25.589220 kubelet[2743]: E0621 04:41:25.588831 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:41:26.337957 systemd-networkd[1495]: cali134fad6bef5: Gained IPv6LL Jun 21 04:41:26.338295 systemd-networkd[1495]: cali891c5551885: Gained IPv6LL Jun 21 04:41:26.949882 containerd[1593]: time="2025-06-21T04:41:26.949820528Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:26.950563 containerd[1593]: time="2025-06-21T04:41:26.950502109Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.1: active requests=0, bytes read=47305653" Jun 21 04:41:26.951727 containerd[1593]: time="2025-06-21T04:41:26.951673539Z" level=info msg="ImageCreate event name:\"sha256:5d29e6e796e41d7383da7c5b73fc136f7e486d40c52f79a04098396b7f85106c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:26.953682 containerd[1593]: 
time="2025-06-21T04:41:26.953618002Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:f6439af8b6022a48d2c6c75d92ec31fe177e7b6a90c58c78ca3964db2b94e21b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:26.954309 containerd[1593]: time="2025-06-21T04:41:26.954258045Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.1\" with image id \"sha256:5d29e6e796e41d7383da7c5b73fc136f7e486d40c52f79a04098396b7f85106c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:f6439af8b6022a48d2c6c75d92ec31fe177e7b6a90c58c78ca3964db2b94e21b\", size \"48798372\" in 2.87584651s" Jun 21 04:41:26.954309 containerd[1593]: time="2025-06-21T04:41:26.954299763Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.1\" returns image reference \"sha256:5d29e6e796e41d7383da7c5b73fc136f7e486d40c52f79a04098396b7f85106c\"" Jun 21 04:41:26.955623 containerd[1593]: time="2025-06-21T04:41:26.955594014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.1\"" Jun 21 04:41:26.959756 containerd[1593]: time="2025-06-21T04:41:26.959708394Z" level=info msg="CreateContainer within sandbox \"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jun 21 04:41:26.967854 containerd[1593]: time="2025-06-21T04:41:26.967812038Z" level=info msg="Container 28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:26.975104 containerd[1593]: time="2025-06-21T04:41:26.975064604Z" level=info msg="CreateContainer within sandbox \"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\"" Jun 21 04:41:26.975526 containerd[1593]: 
time="2025-06-21T04:41:26.975496565Z" level=info msg="StartContainer for \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\"" Jun 21 04:41:26.976727 containerd[1593]: time="2025-06-21T04:41:26.976682262Z" level=info msg="connecting to shim 28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe" address="unix:///run/containerd/s/64f8a233a2649cf2594945805faa36c0db1f2f75dee70658da5bcf7cf015648f" protocol=ttrpc version=3 Jun 21 04:41:27.005854 systemd[1]: Started cri-containerd-28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe.scope - libcontainer container 28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe. Jun 21 04:41:27.057515 containerd[1593]: time="2025-06-21T04:41:27.057476479Z" level=info msg="StartContainer for \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" returns successfully" Jun 21 04:41:27.604803 kubelet[2743]: I0621 04:41:27.604367 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6ddd6f5999-lwdv6" podStartSLOduration=29.055528196 podStartE2EDuration="33.604350925s" podCreationTimestamp="2025-06-21 04:40:54 +0000 UTC" firstStartedPulling="2025-06-21 04:41:22.406525253 +0000 UTC m=+42.181909264" lastFinishedPulling="2025-06-21 04:41:26.955347972 +0000 UTC m=+46.730731993" observedRunningTime="2025-06-21 04:41:27.60411994 +0000 UTC m=+47.379503961" watchObservedRunningTime="2025-06-21 04:41:27.604350925 +0000 UTC m=+47.379734936" Jun 21 04:41:28.598074 kubelet[2743]: I0621 04:41:28.598028 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 21 04:41:28.699276 systemd[1]: Started sshd@8-10.0.0.63:22-10.0.0.1:46216.service - OpenSSH per-connection server daemon (10.0.0.1:46216). 
Jun 21 04:41:28.768703 sshd[5140]: Accepted publickey for core from 10.0.0.1 port 46216 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:28.769967 sshd-session[5140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:28.775671 systemd-logind[1566]: New session 9 of user core. Jun 21 04:41:28.781946 systemd[1]: Started session-9.scope - Session 9 of User core. Jun 21 04:41:28.932541 sshd[5142]: Connection closed by 10.0.0.1 port 46216 Jun 21 04:41:28.933546 sshd-session[5140]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:28.940051 systemd[1]: sshd@8-10.0.0.63:22-10.0.0.1:46216.service: Deactivated successfully. Jun 21 04:41:28.943268 systemd[1]: session-9.scope: Deactivated successfully. Jun 21 04:41:28.945625 systemd-logind[1566]: Session 9 logged out. Waiting for processes to exit. Jun 21 04:41:28.949582 systemd-logind[1566]: Removed session 9. Jun 21 04:41:30.080858 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount261446938.mount: Deactivated successfully. 
Jun 21 04:41:30.802155 containerd[1593]: time="2025-06-21T04:41:30.802096244Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:30.802834 containerd[1593]: time="2025-06-21T04:41:30.802778385Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.1: active requests=0, bytes read=66352249" Jun 21 04:41:30.804080 containerd[1593]: time="2025-06-21T04:41:30.804026690Z" level=info msg="ImageCreate event name:\"sha256:7ded2fef2b18e2077114599de13fa300df0e1437753deab5c59843a86d2dad82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:30.806026 containerd[1593]: time="2025-06-21T04:41:30.805992121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:173a10ef7a65a843f99fc366c7c860fa4068a8f52fda1b30ee589bc4ca43f45a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:30.806597 containerd[1593]: time="2025-06-21T04:41:30.806572942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.1\" with image id \"sha256:7ded2fef2b18e2077114599de13fa300df0e1437753deab5c59843a86d2dad82\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:173a10ef7a65a843f99fc366c7c860fa4068a8f52fda1b30ee589bc4ca43f45a\", size \"66352095\" in 3.850946527s" Jun 21 04:41:30.806648 containerd[1593]: time="2025-06-21T04:41:30.806599992Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.1\" returns image reference \"sha256:7ded2fef2b18e2077114599de13fa300df0e1437753deab5c59843a86d2dad82\"" Jun 21 04:41:30.809464 containerd[1593]: time="2025-06-21T04:41:30.809420289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.1\"" Jun 21 04:41:30.814575 containerd[1593]: time="2025-06-21T04:41:30.814531659Z" level=info msg="CreateContainer within sandbox \"5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab\" 
for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jun 21 04:41:30.832030 containerd[1593]: time="2025-06-21T04:41:30.832007940Z" level=info msg="Container 32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:30.840697 containerd[1593]: time="2025-06-21T04:41:30.840668245Z" level=info msg="CreateContainer within sandbox \"5adee51526c6efb8c573cc7ea1542a926d7d047fdc8b934c22f204c38cdc47ab\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641\"" Jun 21 04:41:30.841254 containerd[1593]: time="2025-06-21T04:41:30.841201456Z" level=info msg="StartContainer for \"32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641\"" Jun 21 04:41:30.842323 containerd[1593]: time="2025-06-21T04:41:30.842300331Z" level=info msg="connecting to shim 32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641" address="unix:///run/containerd/s/f02175c75e3cead0eea55924a37c49f629393b4e11241d6910638dc6c591aad0" protocol=ttrpc version=3 Jun 21 04:41:30.915845 systemd[1]: Started cri-containerd-32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641.scope - libcontainer container 32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641. 
Jun 21 04:41:30.962740 containerd[1593]: time="2025-06-21T04:41:30.962577961Z" level=info msg="StartContainer for \"32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641\" returns successfully" Jun 21 04:41:31.697637 containerd[1593]: time="2025-06-21T04:41:31.697600742Z" level=info msg="TaskExit event in podsandbox handler container_id:\"32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641\" id:\"e6b8fb455026960a48a7c16f611132c9b61a8b21da4ed250db76168ea5d5734e\" pid:5215 exited_at:{seconds:1750480891 nanos:697208576}" Jun 21 04:41:31.715669 kubelet[2743]: I0621 04:41:31.715591 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-5bd85449d4-89ct8" podStartSLOduration=27.817856675 podStartE2EDuration="35.715555551s" podCreationTimestamp="2025-06-21 04:40:56 +0000 UTC" firstStartedPulling="2025-06-21 04:41:22.911483005 +0000 UTC m=+42.686867006" lastFinishedPulling="2025-06-21 04:41:30.809181851 +0000 UTC m=+50.584565882" observedRunningTime="2025-06-21 04:41:31.653700189 +0000 UTC m=+51.429084200" watchObservedRunningTime="2025-06-21 04:41:31.715555551 +0000 UTC m=+51.490939562" Jun 21 04:41:32.693935 containerd[1593]: time="2025-06-21T04:41:32.693882899Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:32.694850 containerd[1593]: time="2025-06-21T04:41:32.694830158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.1: active requests=0, bytes read=8758389" Jun 21 04:41:32.696181 containerd[1593]: time="2025-06-21T04:41:32.696154335Z" level=info msg="ImageCreate event name:\"sha256:8a733c30ec1a8c9f3f51e2da387b425052ed4a9ca631da57c6b185183243e8e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:32.698330 containerd[1593]: time="2025-06-21T04:41:32.698290406Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:b2a5699992dd6c84cfab94ef60536b9aaf19ad8de648e8e0b92d3733f5f52d23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:32.698938 containerd[1593]: time="2025-06-21T04:41:32.698894581Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.1\" with image id \"sha256:8a733c30ec1a8c9f3f51e2da387b425052ed4a9ca631da57c6b185183243e8e9\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:b2a5699992dd6c84cfab94ef60536b9aaf19ad8de648e8e0b92d3733f5f52d23\", size \"10251092\" in 1.889434667s" Jun 21 04:41:32.698988 containerd[1593]: time="2025-06-21T04:41:32.698943021Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.1\" returns image reference \"sha256:8a733c30ec1a8c9f3f51e2da387b425052ed4a9ca631da57c6b185183243e8e9\"" Jun 21 04:41:32.699919 containerd[1593]: time="2025-06-21T04:41:32.699779873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.1\"" Jun 21 04:41:32.705199 containerd[1593]: time="2025-06-21T04:41:32.705152973Z" level=info msg="CreateContainer within sandbox \"5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jun 21 04:41:32.730161 containerd[1593]: time="2025-06-21T04:41:32.730120177Z" level=info msg="Container 656bc9f6e0aab039a99a02ab968f3fba0466c484fd99c9a09d2cfdb6a0fa160e: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:32.745653 containerd[1593]: time="2025-06-21T04:41:32.745617608Z" level=info msg="CreateContainer within sandbox \"5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"656bc9f6e0aab039a99a02ab968f3fba0466c484fd99c9a09d2cfdb6a0fa160e\"" Jun 21 04:41:32.746364 containerd[1593]: time="2025-06-21T04:41:32.746297856Z" level=info msg="StartContainer for 
\"656bc9f6e0aab039a99a02ab968f3fba0466c484fd99c9a09d2cfdb6a0fa160e\"" Jun 21 04:41:32.747965 containerd[1593]: time="2025-06-21T04:41:32.747936684Z" level=info msg="connecting to shim 656bc9f6e0aab039a99a02ab968f3fba0466c484fd99c9a09d2cfdb6a0fa160e" address="unix:///run/containerd/s/7f1caae9bab80aa415c2def328ecdbdcbef693249a4e9a5f7b3130aea95db900" protocol=ttrpc version=3 Jun 21 04:41:32.774920 systemd[1]: Started cri-containerd-656bc9f6e0aab039a99a02ab968f3fba0466c484fd99c9a09d2cfdb6a0fa160e.scope - libcontainer container 656bc9f6e0aab039a99a02ab968f3fba0466c484fd99c9a09d2cfdb6a0fa160e. Jun 21 04:41:33.246596 containerd[1593]: time="2025-06-21T04:41:33.246553726Z" level=info msg="StartContainer for \"656bc9f6e0aab039a99a02ab968f3fba0466c484fd99c9a09d2cfdb6a0fa160e\" returns successfully" Jun 21 04:41:33.947426 systemd[1]: Started sshd@9-10.0.0.63:22-10.0.0.1:46220.service - OpenSSH per-connection server daemon (10.0.0.1:46220). Jun 21 04:41:34.010774 sshd[5272]: Accepted publickey for core from 10.0.0.1 port 46220 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:34.012511 sshd-session[5272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:34.017519 systemd-logind[1566]: New session 10 of user core. Jun 21 04:41:34.023874 systemd[1]: Started session-10.scope - Session 10 of User core. Jun 21 04:41:34.169660 sshd[5275]: Connection closed by 10.0.0.1 port 46220 Jun 21 04:41:34.170025 sshd-session[5272]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:34.183416 systemd[1]: sshd@9-10.0.0.63:22-10.0.0.1:46220.service: Deactivated successfully. Jun 21 04:41:34.185208 systemd[1]: session-10.scope: Deactivated successfully. Jun 21 04:41:34.186096 systemd-logind[1566]: Session 10 logged out. Waiting for processes to exit. Jun 21 04:41:34.188814 systemd[1]: Started sshd@10-10.0.0.63:22-10.0.0.1:46236.service - OpenSSH per-connection server daemon (10.0.0.1:46236). 
Jun 21 04:41:34.189639 systemd-logind[1566]: Removed session 10. Jun 21 04:41:34.241476 sshd[5290]: Accepted publickey for core from 10.0.0.1 port 46236 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:34.243206 sshd-session[5290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:34.248345 systemd-logind[1566]: New session 11 of user core. Jun 21 04:41:34.258834 systemd[1]: Started session-11.scope - Session 11 of User core. Jun 21 04:41:34.457104 sshd[5292]: Connection closed by 10.0.0.1 port 46236 Jun 21 04:41:34.457442 sshd-session[5290]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:34.470136 systemd[1]: sshd@10-10.0.0.63:22-10.0.0.1:46236.service: Deactivated successfully. Jun 21 04:41:34.472395 systemd[1]: session-11.scope: Deactivated successfully. Jun 21 04:41:34.473315 systemd-logind[1566]: Session 11 logged out. Waiting for processes to exit. Jun 21 04:41:34.477047 systemd[1]: Started sshd@11-10.0.0.63:22-10.0.0.1:46244.service - OpenSSH per-connection server daemon (10.0.0.1:46244). Jun 21 04:41:34.477815 systemd-logind[1566]: Removed session 11. Jun 21 04:41:34.528957 sshd[5304]: Accepted publickey for core from 10.0.0.1 port 46244 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:34.531632 sshd-session[5304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:34.538498 systemd-logind[1566]: New session 12 of user core. Jun 21 04:41:34.545103 systemd[1]: Started session-12.scope - Session 12 of User core. Jun 21 04:41:34.701313 sshd[5306]: Connection closed by 10.0.0.1 port 46244 Jun 21 04:41:34.701671 sshd-session[5304]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:34.705505 systemd[1]: sshd@11-10.0.0.63:22-10.0.0.1:46244.service: Deactivated successfully. Jun 21 04:41:34.707741 systemd[1]: session-12.scope: Deactivated successfully. 
Jun 21 04:41:34.708581 systemd-logind[1566]: Session 12 logged out. Waiting for processes to exit. Jun 21 04:41:34.710047 systemd-logind[1566]: Removed session 12. Jun 21 04:41:37.883590 containerd[1593]: time="2025-06-21T04:41:37.883442102Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:37.884566 containerd[1593]: time="2025-06-21T04:41:37.884529845Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.1: active requests=0, bytes read=51246233" Jun 21 04:41:37.886141 containerd[1593]: time="2025-06-21T04:41:37.886107878Z" level=info msg="ImageCreate event name:\"sha256:6df5d7da55b19142ea456ddaa7f49909709419c92a39991e84b0f6708f953d73\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:37.899882 containerd[1593]: time="2025-06-21T04:41:37.899824929Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5a988b0c09389a083a7f37e3f14e361659f0bcf538c01d50e9f785671a7d9b20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:37.900446 containerd[1593]: time="2025-06-21T04:41:37.900387034Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.1\" with image id \"sha256:6df5d7da55b19142ea456ddaa7f49909709419c92a39991e84b0f6708f953d73\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5a988b0c09389a083a7f37e3f14e361659f0bcf538c01d50e9f785671a7d9b20\", size \"52738904\" in 5.200574439s" Jun 21 04:41:37.900446 containerd[1593]: time="2025-06-21T04:41:37.900429865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.1\" returns image reference \"sha256:6df5d7da55b19142ea456ddaa7f49909709419c92a39991e84b0f6708f953d73\"" Jun 21 04:41:37.901390 containerd[1593]: time="2025-06-21T04:41:37.901361914Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.1\"" Jun 21 04:41:37.917030 containerd[1593]: time="2025-06-21T04:41:37.916973242Z" level=info msg="CreateContainer within sandbox \"bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jun 21 04:41:37.926927 containerd[1593]: time="2025-06-21T04:41:37.926878899Z" level=info msg="Container 0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:37.939180 containerd[1593]: time="2025-06-21T04:41:37.939112094Z" level=info msg="CreateContainer within sandbox \"bef7e47ae325855c2f3508bf5aa758c4239cec229ff40b7371a59e41aecd1ba5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63\"" Jun 21 04:41:37.939851 containerd[1593]: time="2025-06-21T04:41:37.939817799Z" level=info msg="StartContainer for \"0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63\"" Jun 21 04:41:37.940993 containerd[1593]: time="2025-06-21T04:41:37.940954183Z" level=info msg="connecting to shim 0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63" address="unix:///run/containerd/s/bbee7799e6da36a08e6b6f5a28beebadf1b6f33a84e14121af017e85ac2eeaa6" protocol=ttrpc version=3 Jun 21 04:41:37.977933 systemd[1]: Started cri-containerd-0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63.scope - libcontainer container 0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63. 
Jun 21 04:41:38.030295 containerd[1593]: time="2025-06-21T04:41:38.030237487Z" level=info msg="StartContainer for \"0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63\" returns successfully" Jun 21 04:41:38.669380 containerd[1593]: time="2025-06-21T04:41:38.669192165Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63\" id:\"f64845de1ad205c079787a056667331f5aab31fa39054b7a2527e8bd489d88c8\" pid:5383 exited_at:{seconds:1750480898 nanos:668934451}" Jun 21 04:41:38.778842 kubelet[2743]: I0621 04:41:38.778730 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-c7b7955d6-gf7gk" podStartSLOduration=27.217892781 podStartE2EDuration="41.778684362s" podCreationTimestamp="2025-06-21 04:40:57 +0000 UTC" firstStartedPulling="2025-06-21 04:41:23.3402966 +0000 UTC m=+43.115680611" lastFinishedPulling="2025-06-21 04:41:37.901088181 +0000 UTC m=+57.676472192" observedRunningTime="2025-06-21 04:41:38.675381124 +0000 UTC m=+58.450765155" watchObservedRunningTime="2025-06-21 04:41:38.778684362 +0000 UTC m=+58.554068373" Jun 21 04:41:39.718585 systemd[1]: Started sshd@12-10.0.0.63:22-10.0.0.1:33112.service - OpenSSH per-connection server daemon (10.0.0.1:33112). Jun 21 04:41:39.781740 sshd[5399]: Accepted publickey for core from 10.0.0.1 port 33112 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:39.783851 sshd-session[5399]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:39.788799 systemd-logind[1566]: New session 13 of user core. Jun 21 04:41:39.798942 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jun 21 04:41:39.931690 sshd[5401]: Connection closed by 10.0.0.1 port 33112 Jun 21 04:41:39.932017 sshd-session[5399]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:39.937497 systemd[1]: sshd@12-10.0.0.63:22-10.0.0.1:33112.service: Deactivated successfully. Jun 21 04:41:39.940185 systemd[1]: session-13.scope: Deactivated successfully. Jun 21 04:41:39.941711 systemd-logind[1566]: Session 13 logged out. Waiting for processes to exit. Jun 21 04:41:39.943340 systemd-logind[1566]: Removed session 13. Jun 21 04:41:40.819509 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4239760835.mount: Deactivated successfully. Jun 21 04:41:40.946929 containerd[1593]: time="2025-06-21T04:41:40.946882949Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:40.951033 containerd[1593]: time="2025-06-21T04:41:40.950999155Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.1: active requests=0, bytes read=33086345" Jun 21 04:41:40.952236 containerd[1593]: time="2025-06-21T04:41:40.952205991Z" level=info msg="ImageCreate event name:\"sha256:a8d73c8fd22b3a7a28e9baab63169fb459bc504d71d871f96225c4f2d5e660a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:40.954239 containerd[1593]: time="2025-06-21T04:41:40.954174666Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:4b8bcb8b4fc05026ba811bf0b25b736086c1b8b26a83a9039a84dd3a06b06bd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:40.954758 containerd[1593]: time="2025-06-21T04:41:40.954706775Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.1\" with image id \"sha256:a8d73c8fd22b3a7a28e9baab63169fb459bc504d71d871f96225c4f2d5e660a5\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.1\", repo digest 
\"ghcr.io/flatcar/calico/whisker-backend@sha256:4b8bcb8b4fc05026ba811bf0b25b736086c1b8b26a83a9039a84dd3a06b06bd4\", size \"33086175\" in 3.053319393s" Jun 21 04:41:40.954813 containerd[1593]: time="2025-06-21T04:41:40.954758592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.1\" returns image reference \"sha256:a8d73c8fd22b3a7a28e9baab63169fb459bc504d71d871f96225c4f2d5e660a5\"" Jun 21 04:41:40.955802 containerd[1593]: time="2025-06-21T04:41:40.955775340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.1\"" Jun 21 04:41:40.959441 containerd[1593]: time="2025-06-21T04:41:40.959411957Z" level=info msg="CreateContainer within sandbox \"c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jun 21 04:41:40.967568 containerd[1593]: time="2025-06-21T04:41:40.967532951Z" level=info msg="Container ca80b5b55c97f451cf77d8ed67eab46a28588370343dfb77b22a74c35e9dade0: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:40.976982 containerd[1593]: time="2025-06-21T04:41:40.976941302Z" level=info msg="CreateContainer within sandbox \"c31cb3ecd7aa568f12def2b05a895c9b479cc0e3c97edc18a919901910168e43\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"ca80b5b55c97f451cf77d8ed67eab46a28588370343dfb77b22a74c35e9dade0\"" Jun 21 04:41:40.977523 containerd[1593]: time="2025-06-21T04:41:40.977438475Z" level=info msg="StartContainer for \"ca80b5b55c97f451cf77d8ed67eab46a28588370343dfb77b22a74c35e9dade0\"" Jun 21 04:41:40.979598 containerd[1593]: time="2025-06-21T04:41:40.979537875Z" level=info msg="connecting to shim ca80b5b55c97f451cf77d8ed67eab46a28588370343dfb77b22a74c35e9dade0" address="unix:///run/containerd/s/c8675cb2c7f9ccfa3d61cf897e99cd6ed18efd523e67dd6ebeb0c5b5d386f2ba" protocol=ttrpc version=3 Jun 21 04:41:41.020883 systemd[1]: Started cri-containerd-ca80b5b55c97f451cf77d8ed67eab46a28588370343dfb77b22a74c35e9dade0.scope - 
libcontainer container ca80b5b55c97f451cf77d8ed67eab46a28588370343dfb77b22a74c35e9dade0. Jun 21 04:41:41.077416 containerd[1593]: time="2025-06-21T04:41:41.076658649Z" level=info msg="StartContainer for \"ca80b5b55c97f451cf77d8ed67eab46a28588370343dfb77b22a74c35e9dade0\" returns successfully" Jun 21 04:41:41.346814 containerd[1593]: time="2025-06-21T04:41:41.346770534Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:41.347650 containerd[1593]: time="2025-06-21T04:41:41.347616813Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.1: active requests=0, bytes read=77" Jun 21 04:41:41.361209 containerd[1593]: time="2025-06-21T04:41:41.361180362Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.1\" with image id \"sha256:5d29e6e796e41d7383da7c5b73fc136f7e486d40c52f79a04098396b7f85106c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:f6439af8b6022a48d2c6c75d92ec31fe177e7b6a90c58c78ca3964db2b94e21b\", size \"48798372\" in 405.374295ms" Jun 21 04:41:41.361209 containerd[1593]: time="2025-06-21T04:41:41.361205830Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.1\" returns image reference \"sha256:5d29e6e796e41d7383da7c5b73fc136f7e486d40c52f79a04098396b7f85106c\"" Jun 21 04:41:41.361903 containerd[1593]: time="2025-06-21T04:41:41.361879665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.1\"" Jun 21 04:41:41.365556 containerd[1593]: time="2025-06-21T04:41:41.365527071Z" level=info msg="CreateContainer within sandbox \"490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jun 21 04:41:41.373342 containerd[1593]: time="2025-06-21T04:41:41.373317524Z" level=info msg="Container bb64cca574c996e040e24cd43b4fa7571c36eb1b2cd87a5fed4490e3698bf038: 
CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:41.380556 containerd[1593]: time="2025-06-21T04:41:41.380520184Z" level=info msg="CreateContainer within sandbox \"490b17c35820196578e295a9ef1143fe1ae43122135988823bab716fe75e58ac\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"bb64cca574c996e040e24cd43b4fa7571c36eb1b2cd87a5fed4490e3698bf038\"" Jun 21 04:41:41.380927 containerd[1593]: time="2025-06-21T04:41:41.380903855Z" level=info msg="StartContainer for \"bb64cca574c996e040e24cd43b4fa7571c36eb1b2cd87a5fed4490e3698bf038\"" Jun 21 04:41:41.381898 containerd[1593]: time="2025-06-21T04:41:41.381867404Z" level=info msg="connecting to shim bb64cca574c996e040e24cd43b4fa7571c36eb1b2cd87a5fed4490e3698bf038" address="unix:///run/containerd/s/578b388b135a6c600fa5eb9ae01a485a44bde8f83e35d7dbbe98b781f07e8f85" protocol=ttrpc version=3 Jun 21 04:41:41.402162 systemd[1]: Started cri-containerd-bb64cca574c996e040e24cd43b4fa7571c36eb1b2cd87a5fed4490e3698bf038.scope - libcontainer container bb64cca574c996e040e24cd43b4fa7571c36eb1b2cd87a5fed4490e3698bf038. 
Jun 21 04:41:41.448368 containerd[1593]: time="2025-06-21T04:41:41.448328755Z" level=info msg="StartContainer for \"bb64cca574c996e040e24cd43b4fa7571c36eb1b2cd87a5fed4490e3698bf038\" returns successfully" Jun 21 04:41:41.658443 kubelet[2743]: I0621 04:41:41.658149 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6ddd6f5999-tkdl5" podStartSLOduration=30.998581223 podStartE2EDuration="47.658130324s" podCreationTimestamp="2025-06-21 04:40:54 +0000 UTC" firstStartedPulling="2025-06-21 04:41:24.702250454 +0000 UTC m=+44.477634465" lastFinishedPulling="2025-06-21 04:41:41.361799535 +0000 UTC m=+61.137183566" observedRunningTime="2025-06-21 04:41:41.657577507 +0000 UTC m=+61.432961518" watchObservedRunningTime="2025-06-21 04:41:41.658130324 +0000 UTC m=+61.433514325" Jun 21 04:41:41.661577 kubelet[2743]: I0621 04:41:41.659510 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7447955458-q4xz2" podStartSLOduration=3.48892956 podStartE2EDuration="22.659501008s" podCreationTimestamp="2025-06-21 04:41:19 +0000 UTC" firstStartedPulling="2025-06-21 04:41:21.785064551 +0000 UTC m=+41.560448562" lastFinishedPulling="2025-06-21 04:41:40.955635999 +0000 UTC m=+60.731020010" observedRunningTime="2025-06-21 04:41:41.645026738 +0000 UTC m=+61.420410749" watchObservedRunningTime="2025-06-21 04:41:41.659501008 +0000 UTC m=+61.434885009" Jun 21 04:41:41.804102 containerd[1593]: time="2025-06-21T04:41:41.804048662Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:41.805731 containerd[1593]: time="2025-06-21T04:41:41.804931158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.1: active requests=0, bytes read=77" Jun 21 04:41:41.807085 containerd[1593]: time="2025-06-21T04:41:41.807041720Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.1\" with image id \"sha256:5d29e6e796e41d7383da7c5b73fc136f7e486d40c52f79a04098396b7f85106c\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:f6439af8b6022a48d2c6c75d92ec31fe177e7b6a90c58c78ca3964db2b94e21b\", size \"48798372\" in 445.139232ms" Jun 21 04:41:41.807085 containerd[1593]: time="2025-06-21T04:41:41.807075574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.1\" returns image reference \"sha256:5d29e6e796e41d7383da7c5b73fc136f7e486d40c52f79a04098396b7f85106c\"" Jun 21 04:41:41.808980 containerd[1593]: time="2025-06-21T04:41:41.808770014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.1\"" Jun 21 04:41:41.813032 containerd[1593]: time="2025-06-21T04:41:41.812992429Z" level=info msg="CreateContainer within sandbox \"97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jun 21 04:41:41.821622 containerd[1593]: time="2025-06-21T04:41:41.821511420Z" level=info msg="Container 02a2aac3913002949b2d7b28ef79106a5b9913e4c3240c96176a8a214ff98709: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:41.832022 containerd[1593]: time="2025-06-21T04:41:41.831973227Z" level=info msg="CreateContainer within sandbox \"97e2ba5f0568a342a7073cdc4712790089b2d90112ae7d98c6a8edd7b307512a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"02a2aac3913002949b2d7b28ef79106a5b9913e4c3240c96176a8a214ff98709\"" Jun 21 04:41:41.832611 containerd[1593]: time="2025-06-21T04:41:41.832571581Z" level=info msg="StartContainer for \"02a2aac3913002949b2d7b28ef79106a5b9913e4c3240c96176a8a214ff98709\"" Jun 21 04:41:41.833706 containerd[1593]: time="2025-06-21T04:41:41.833676224Z" level=info msg="connecting to shim 02a2aac3913002949b2d7b28ef79106a5b9913e4c3240c96176a8a214ff98709" 
address="unix:///run/containerd/s/86adc9eea415b3cea9050a1927c502f2c6f9e69616a5c727cb1afbba6bc82af2" protocol=ttrpc version=3 Jun 21 04:41:41.860091 systemd[1]: Started cri-containerd-02a2aac3913002949b2d7b28ef79106a5b9913e4c3240c96176a8a214ff98709.scope - libcontainer container 02a2aac3913002949b2d7b28ef79106a5b9913e4c3240c96176a8a214ff98709. Jun 21 04:41:41.917795 containerd[1593]: time="2025-06-21T04:41:41.917660408Z" level=info msg="StartContainer for \"02a2aac3913002949b2d7b28ef79106a5b9913e4c3240c96176a8a214ff98709\" returns successfully" Jun 21 04:41:42.658692 kubelet[2743]: I0621 04:41:42.658521 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6db98b7fcf-rhrc4" podStartSLOduration=31.630927997 podStartE2EDuration="48.6585027s" podCreationTimestamp="2025-06-21 04:40:54 +0000 UTC" firstStartedPulling="2025-06-21 04:41:24.7804167 +0000 UTC m=+44.555800711" lastFinishedPulling="2025-06-21 04:41:41.807991403 +0000 UTC m=+61.583375414" observedRunningTime="2025-06-21 04:41:42.658496058 +0000 UTC m=+62.433880079" watchObservedRunningTime="2025-06-21 04:41:42.6585027 +0000 UTC m=+62.433886711" Jun 21 04:41:43.641538 kubelet[2743]: I0621 04:41:43.641493 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 21 04:41:44.564823 containerd[1593]: time="2025-06-21T04:41:44.564764288Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:44.565425 containerd[1593]: time="2025-06-21T04:41:44.565392401Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.1: active requests=0, bytes read=14705633" Jun 21 04:41:44.566615 containerd[1593]: time="2025-06-21T04:41:44.566571983Z" level=info msg="ImageCreate event name:\"sha256:dfc00385e8755bddd1053a2a482a3559ad6c93bd8b882371b9ed8b5c3dfe22b5\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jun 21 04:41:44.568511 containerd[1593]: time="2025-06-21T04:41:44.568474196Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:1a882b6866dd22d783a39f1e041b87a154666ea4dd8b669fe98d0b0fac58d225\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jun 21 04:41:44.569029 containerd[1593]: time="2025-06-21T04:41:44.569005867Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.1\" with image id \"sha256:dfc00385e8755bddd1053a2a482a3559ad6c93bd8b882371b9ed8b5c3dfe22b5\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:1a882b6866dd22d783a39f1e041b87a154666ea4dd8b669fe98d0b0fac58d225\", size \"16198288\" in 2.760205968s" Jun 21 04:41:44.569082 containerd[1593]: time="2025-06-21T04:41:44.569049349Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.1\" returns image reference \"sha256:dfc00385e8755bddd1053a2a482a3559ad6c93bd8b882371b9ed8b5c3dfe22b5\"" Jun 21 04:41:44.573484 containerd[1593]: time="2025-06-21T04:41:44.573444028Z" level=info msg="CreateContainer within sandbox \"5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jun 21 04:41:44.616883 containerd[1593]: time="2025-06-21T04:41:44.616832024Z" level=info msg="Container 13c999f77a2daf56f35903429c5f2c3f297159162041b892107bd6993f64a6ce: CDI devices from CRI Config.CDIDevices: []" Jun 21 04:41:44.628063 containerd[1593]: time="2025-06-21T04:41:44.628004599Z" level=info msg="CreateContainer within sandbox \"5be0ac59eda054e44a474b341f4ef18b942353c06882071e57a9c28d892a1821\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"13c999f77a2daf56f35903429c5f2c3f297159162041b892107bd6993f64a6ce\"" Jun 21 04:41:44.628624 containerd[1593]: time="2025-06-21T04:41:44.628564574Z" 
level=info msg="StartContainer for \"13c999f77a2daf56f35903429c5f2c3f297159162041b892107bd6993f64a6ce\"" Jun 21 04:41:44.630252 containerd[1593]: time="2025-06-21T04:41:44.630207319Z" level=info msg="connecting to shim 13c999f77a2daf56f35903429c5f2c3f297159162041b892107bd6993f64a6ce" address="unix:///run/containerd/s/7f1caae9bab80aa415c2def328ecdbdcbef693249a4e9a5f7b3130aea95db900" protocol=ttrpc version=3 Jun 21 04:41:44.666861 systemd[1]: Started cri-containerd-13c999f77a2daf56f35903429c5f2c3f297159162041b892107bd6993f64a6ce.scope - libcontainer container 13c999f77a2daf56f35903429c5f2c3f297159162041b892107bd6993f64a6ce. Jun 21 04:41:44.706221 containerd[1593]: time="2025-06-21T04:41:44.706168752Z" level=info msg="StartContainer for \"13c999f77a2daf56f35903429c5f2c3f297159162041b892107bd6993f64a6ce\" returns successfully" Jun 21 04:41:44.953823 systemd[1]: Started sshd@13-10.0.0.63:22-10.0.0.1:33128.service - OpenSSH per-connection server daemon (10.0.0.1:33128). Jun 21 04:41:45.020561 sshd[5585]: Accepted publickey for core from 10.0.0.1 port 33128 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:45.022110 sshd-session[5585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:45.026396 systemd-logind[1566]: New session 14 of user core. Jun 21 04:41:45.035868 systemd[1]: Started session-14.scope - Session 14 of User core. Jun 21 04:41:45.222621 sshd[5587]: Connection closed by 10.0.0.1 port 33128 Jun 21 04:41:45.222921 sshd-session[5585]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:45.227305 systemd[1]: sshd@13-10.0.0.63:22-10.0.0.1:33128.service: Deactivated successfully. Jun 21 04:41:45.229314 systemd[1]: session-14.scope: Deactivated successfully. Jun 21 04:41:45.230241 systemd-logind[1566]: Session 14 logged out. Waiting for processes to exit. Jun 21 04:41:45.231276 systemd-logind[1566]: Removed session 14. 
Jun 21 04:41:45.502093 kubelet[2743]: I0621 04:41:45.502001 2743 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jun 21 04:41:45.503463 kubelet[2743]: I0621 04:41:45.503442 2743 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jun 21 04:41:50.240520 systemd[1]: Started sshd@14-10.0.0.63:22-10.0.0.1:41116.service - OpenSSH per-connection server daemon (10.0.0.1:41116). Jun 21 04:41:50.296695 sshd[5604]: Accepted publickey for core from 10.0.0.1 port 41116 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:50.298182 sshd-session[5604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:50.302895 systemd-logind[1566]: New session 15 of user core. Jun 21 04:41:50.311862 systemd[1]: Started session-15.scope - Session 15 of User core. Jun 21 04:41:50.480289 sshd[5606]: Connection closed by 10.0.0.1 port 41116 Jun 21 04:41:50.480609 sshd-session[5604]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:50.486399 systemd[1]: sshd@14-10.0.0.63:22-10.0.0.1:41116.service: Deactivated successfully. Jun 21 04:41:50.488973 systemd[1]: session-15.scope: Deactivated successfully. Jun 21 04:41:50.490483 systemd-logind[1566]: Session 15 logged out. Waiting for processes to exit. Jun 21 04:41:50.492083 systemd-logind[1566]: Removed session 15. 
Jun 21 04:41:51.639860 containerd[1593]: time="2025-06-21T04:41:51.639822487Z" level=info msg="TaskExit event in podsandbox handler container_id:\"63d880203e0ec15d8f94bf30b42d59a6f30d545ca3ef57b97eda6dfc727cf2f2\" id:\"643489ec5eb1f32a039f677d21a3825d1ce862acce7791de968504bb8f5b7dbf\" pid:5633 exited_at:{seconds:1750480911 nanos:639502360}" Jun 21 04:41:51.658876 kubelet[2743]: I0621 04:41:51.658745 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-ldc5m" podStartSLOduration=33.145666633 podStartE2EDuration="54.658488133s" podCreationTimestamp="2025-06-21 04:40:57 +0000 UTC" firstStartedPulling="2025-06-21 04:41:23.056873375 +0000 UTC m=+42.832257386" lastFinishedPulling="2025-06-21 04:41:44.569694875 +0000 UTC m=+64.345078886" observedRunningTime="2025-06-21 04:41:45.666589032 +0000 UTC m=+65.441973063" watchObservedRunningTime="2025-06-21 04:41:51.658488133 +0000 UTC m=+71.433872144" Jun 21 04:41:53.505482 containerd[1593]: time="2025-06-21T04:41:53.505440959Z" level=info msg="TaskExit event in podsandbox handler container_id:\"32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641\" id:\"93b747ae9afd3f0389e7345fe33909dd4c1c085d142d057287828bc3cc27762b\" pid:5659 exited_at:{seconds:1750480913 nanos:505105292}" Jun 21 04:41:55.496449 systemd[1]: Started sshd@15-10.0.0.63:22-10.0.0.1:41122.service - OpenSSH per-connection server daemon (10.0.0.1:41122). Jun 21 04:41:55.558653 sshd[5672]: Accepted publickey for core from 10.0.0.1 port 41122 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:55.560389 sshd-session[5672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:55.564766 systemd-logind[1566]: New session 16 of user core. Jun 21 04:41:55.570834 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jun 21 04:41:55.805046 sshd[5674]: Connection closed by 10.0.0.1 port 41122 Jun 21 04:41:55.805291 sshd-session[5672]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:55.820160 systemd[1]: sshd@15-10.0.0.63:22-10.0.0.1:41122.service: Deactivated successfully. Jun 21 04:41:55.822338 systemd[1]: session-16.scope: Deactivated successfully. Jun 21 04:41:55.823238 systemd-logind[1566]: Session 16 logged out. Waiting for processes to exit. Jun 21 04:41:55.826214 systemd[1]: Started sshd@16-10.0.0.63:22-10.0.0.1:54562.service - OpenSSH per-connection server daemon (10.0.0.1:54562). Jun 21 04:41:55.826840 systemd-logind[1566]: Removed session 16. Jun 21 04:41:55.875154 sshd[5688]: Accepted publickey for core from 10.0.0.1 port 54562 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:55.876605 sshd-session[5688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:55.881311 systemd-logind[1566]: New session 17 of user core. Jun 21 04:41:55.894845 systemd[1]: Started session-17.scope - Session 17 of User core. Jun 21 04:41:56.192303 sshd[5690]: Connection closed by 10.0.0.1 port 54562 Jun 21 04:41:56.192881 sshd-session[5688]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:56.206132 systemd[1]: sshd@16-10.0.0.63:22-10.0.0.1:54562.service: Deactivated successfully. Jun 21 04:41:56.208420 systemd[1]: session-17.scope: Deactivated successfully. Jun 21 04:41:56.209222 systemd-logind[1566]: Session 17 logged out. Waiting for processes to exit. Jun 21 04:41:56.213579 systemd[1]: Started sshd@17-10.0.0.63:22-10.0.0.1:54570.service - OpenSSH per-connection server daemon (10.0.0.1:54570). Jun 21 04:41:56.214610 systemd-logind[1566]: Removed session 17. 
Jun 21 04:41:56.269197 sshd[5702]: Accepted publickey for core from 10.0.0.1 port 54570 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:56.270816 sshd-session[5702]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:56.275654 systemd-logind[1566]: New session 18 of user core. Jun 21 04:41:56.291885 systemd[1]: Started session-18.scope - Session 18 of User core. Jun 21 04:41:57.056204 sshd[5704]: Connection closed by 10.0.0.1 port 54570 Jun 21 04:41:57.056663 sshd-session[5702]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:57.067238 systemd[1]: sshd@17-10.0.0.63:22-10.0.0.1:54570.service: Deactivated successfully. Jun 21 04:41:57.071921 systemd[1]: session-18.scope: Deactivated successfully. Jun 21 04:41:57.073696 systemd-logind[1566]: Session 18 logged out. Waiting for processes to exit. Jun 21 04:41:57.078273 systemd[1]: Started sshd@18-10.0.0.63:22-10.0.0.1:54584.service - OpenSSH per-connection server daemon (10.0.0.1:54584). Jun 21 04:41:57.081201 systemd-logind[1566]: Removed session 18. Jun 21 04:41:57.122682 sshd[5724]: Accepted publickey for core from 10.0.0.1 port 54584 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:57.124171 sshd-session[5724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:57.128912 systemd-logind[1566]: New session 19 of user core. Jun 21 04:41:57.136865 systemd[1]: Started session-19.scope - Session 19 of User core. Jun 21 04:41:57.499216 sshd[5727]: Connection closed by 10.0.0.1 port 54584 Jun 21 04:41:57.501007 sshd-session[5724]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:57.510836 systemd[1]: sshd@18-10.0.0.63:22-10.0.0.1:54584.service: Deactivated successfully. Jun 21 04:41:57.514021 systemd[1]: session-19.scope: Deactivated successfully. Jun 21 04:41:57.515269 systemd-logind[1566]: Session 19 logged out. Waiting for processes to exit. 
Jun 21 04:41:57.520604 systemd[1]: Started sshd@19-10.0.0.63:22-10.0.0.1:54596.service - OpenSSH per-connection server daemon (10.0.0.1:54596). Jun 21 04:41:57.522002 systemd-logind[1566]: Removed session 19. Jun 21 04:41:57.566946 sshd[5739]: Accepted publickey for core from 10.0.0.1 port 54596 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:41:57.568846 sshd-session[5739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:41:57.574309 systemd-logind[1566]: New session 20 of user core. Jun 21 04:41:57.583854 systemd[1]: Started session-20.scope - Session 20 of User core. Jun 21 04:41:57.693278 sshd[5741]: Connection closed by 10.0.0.1 port 54596 Jun 21 04:41:57.693621 sshd-session[5739]: pam_unix(sshd:session): session closed for user core Jun 21 04:41:57.698133 systemd[1]: sshd@19-10.0.0.63:22-10.0.0.1:54596.service: Deactivated successfully. Jun 21 04:41:57.700327 systemd[1]: session-20.scope: Deactivated successfully. Jun 21 04:41:57.701161 systemd-logind[1566]: Session 20 logged out. Waiting for processes to exit. Jun 21 04:41:57.702336 systemd-logind[1566]: Removed session 20. Jun 21 04:41:59.441207 kubelet[2743]: E0621 04:41:59.441153 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:42:01.702792 containerd[1593]: time="2025-06-21T04:42:01.702746699Z" level=info msg="TaskExit event in podsandbox handler container_id:\"32f739dfb62dc19088e1ce4190b4a36736d7d5b0051319868a92477462eb4641\" id:\"207c72693f63dce77fe105d7b755c2194b2879e28f59bde710aabf586cc51379\" pid:5767 exited_at:{seconds:1750480921 nanos:702467675}" Jun 21 04:42:02.707689 systemd[1]: Started sshd@20-10.0.0.63:22-10.0.0.1:54606.service - OpenSSH per-connection server daemon (10.0.0.1:54606). 
Jun 21 04:42:02.772446 sshd[5788]: Accepted publickey for core from 10.0.0.1 port 54606 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c Jun 21 04:42:02.774117 sshd-session[5788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jun 21 04:42:02.778874 systemd-logind[1566]: New session 21 of user core. Jun 21 04:42:02.790846 systemd[1]: Started session-21.scope - Session 21 of User core. Jun 21 04:42:02.824975 kubelet[2743]: I0621 04:42:02.824934 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 21 04:42:02.964418 sshd[5790]: Connection closed by 10.0.0.1 port 54606 Jun 21 04:42:02.964657 sshd-session[5788]: pam_unix(sshd:session): session closed for user core Jun 21 04:42:02.969576 systemd[1]: sshd@20-10.0.0.63:22-10.0.0.1:54606.service: Deactivated successfully. Jun 21 04:42:02.972054 systemd[1]: session-21.scope: Deactivated successfully. Jun 21 04:42:02.972803 systemd-logind[1566]: Session 21 logged out. Waiting for processes to exit. Jun 21 04:42:02.974097 systemd-logind[1566]: Removed session 21. 
Jun 21 04:42:03.441140 kubelet[2743]: E0621 04:42:03.441102 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Jun 21 04:42:03.489410 kubelet[2743]: I0621 04:42:03.489375 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jun 21 04:42:04.777415 containerd[1593]: time="2025-06-21T04:42:04.777369388Z" level=info msg="StopContainer for \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" with timeout 30 (s)" Jun 21 04:42:04.793981 containerd[1593]: time="2025-06-21T04:42:04.793956409Z" level=info msg="Stop container \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" with signal terminated" Jun 21 04:42:04.805915 systemd[1]: cri-containerd-28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe.scope: Deactivated successfully. Jun 21 04:42:04.806424 systemd[1]: cri-containerd-28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe.scope: Consumed 1.260s CPU time, 52.1M memory peak, 1M read from disk. 
Jun 21 04:42:04.807032 containerd[1593]: time="2025-06-21T04:42:04.806997898Z" level=info msg="received exit event container_id:\"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" id:\"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" pid:5106 exit_status:1 exited_at:{seconds:1750480924 nanos:806519203}" Jun 21 04:42:04.807265 containerd[1593]: time="2025-06-21T04:42:04.807187762Z" level=info msg="TaskExit event in podsandbox handler container_id:\"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" id:\"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" pid:5106 exit_status:1 exited_at:{seconds:1750480924 nanos:806519203}" Jun 21 04:42:04.856597 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe-rootfs.mount: Deactivated successfully. Jun 21 04:42:04.923209 containerd[1593]: time="2025-06-21T04:42:04.923160412Z" level=info msg="StopContainer for \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" returns successfully" Jun 21 04:42:04.928067 containerd[1593]: time="2025-06-21T04:42:04.928035696Z" level=info msg="StopPodSandbox for \"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\"" Jun 21 04:42:04.938362 containerd[1593]: time="2025-06-21T04:42:04.938326325Z" level=info msg="Container to stop \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jun 21 04:42:04.949440 systemd[1]: cri-containerd-1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018.scope: Deactivated successfully. 
Jun 21 04:42:04.951767 containerd[1593]: time="2025-06-21T04:42:04.951708326Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\" id:\"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\" pid:4535 exit_status:137 exited_at:{seconds:1750480924 nanos:950978740}" Jun 21 04:42:04.980363 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018-rootfs.mount: Deactivated successfully. Jun 21 04:42:04.991098 containerd[1593]: time="2025-06-21T04:42:04.991049550Z" level=info msg="shim disconnected" id=1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018 namespace=k8s.io Jun 21 04:42:04.991098 containerd[1593]: time="2025-06-21T04:42:04.991094235Z" level=warning msg="cleaning up after shim disconnected" id=1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018 namespace=k8s.io Jun 21 04:42:05.006012 containerd[1593]: time="2025-06-21T04:42:04.991103433Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jun 21 04:42:05.090937 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018-shm.mount: Deactivated successfully. 
Jun 21 04:42:05.102487 containerd[1593]: time="2025-06-21T04:42:05.102387804Z" level=info msg="received exit event sandbox_id:\"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\" exit_status:137 exited_at:{seconds:1750480924 nanos:950978740}" Jun 21 04:42:05.175490 systemd-networkd[1495]: cali7894cbf81f1: Link DOWN Jun 21 04:42:05.175511 systemd-networkd[1495]: cali7894cbf81f1: Lost carrier Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.172 [INFO][5879] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.174 [INFO][5879] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" iface="eth0" netns="/var/run/netns/cni-013b991d-8678-aef9-03c9-1e6608be8f00" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.174 [INFO][5879] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" iface="eth0" netns="/var/run/netns/cni-013b991d-8678-aef9-03c9-1e6608be8f00" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.184 [INFO][5879] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" after=9.863975ms iface="eth0" netns="/var/run/netns/cni-013b991d-8678-aef9-03c9-1e6608be8f00" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.184 [INFO][5879] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.184 [INFO][5879] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.210 [INFO][5892] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" HandleID="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.211 [INFO][5892] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.211 [INFO][5892] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.240 [INFO][5892] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" HandleID="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.240 [INFO][5892] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" HandleID="k8s-pod-network.1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Workload="localhost-k8s-calico--apiserver--6ddd6f5999--lwdv6-eth0" Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.241 [INFO][5892] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jun 21 04:42:05.248110 containerd[1593]: 2025-06-21 04:42:05.245 [INFO][5879] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018" Jun 21 04:42:05.251332 systemd[1]: run-netns-cni\x2d013b991d\x2d8678\x2daef9\x2d03c9\x2d1e6608be8f00.mount: Deactivated successfully. 
Jun 21 04:42:05.261644 containerd[1593]: time="2025-06-21T04:42:05.261603593Z" level=info msg="TearDown network for sandbox \"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\" successfully"
Jun 21 04:42:05.261644 containerd[1593]: time="2025-06-21T04:42:05.261637457Z" level=info msg="StopPodSandbox for \"1a11a1469040273b65af84beb6c9ef9d3f70e62f123c58641fec444a5db8d018\" returns successfully"
Jun 21 04:42:05.381923 kubelet[2743]: I0621 04:42:05.381809 2743 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57bl4\" (UniqueName: \"kubernetes.io/projected/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-kube-api-access-57bl4\") pod \"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50\" (UID: \"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50\") "
Jun 21 04:42:05.381923 kubelet[2743]: I0621 04:42:05.381848 2743 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-calico-apiserver-certs\") pod \"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50\" (UID: \"ba7afd7e-002b-4c26-9fd9-a17eb66b7f50\") "
Jun 21 04:42:05.386186 kubelet[2743]: I0621 04:42:05.386128 2743 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-kube-api-access-57bl4" (OuterVolumeSpecName: "kube-api-access-57bl4") pod "ba7afd7e-002b-4c26-9fd9-a17eb66b7f50" (UID: "ba7afd7e-002b-4c26-9fd9-a17eb66b7f50"). InnerVolumeSpecName "kube-api-access-57bl4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Jun 21 04:42:05.386186 kubelet[2743]: I0621 04:42:05.386175 2743 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "ba7afd7e-002b-4c26-9fd9-a17eb66b7f50" (UID: "ba7afd7e-002b-4c26-9fd9-a17eb66b7f50"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Jun 21 04:42:05.388804 systemd[1]: var-lib-kubelet-pods-ba7afd7e\x2d002b\x2d4c26\x2d9fd9\x2da17eb66b7f50-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d57bl4.mount: Deactivated successfully.
Jun 21 04:42:05.388947 systemd[1]: var-lib-kubelet-pods-ba7afd7e\x2d002b\x2d4c26\x2d9fd9\x2da17eb66b7f50-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully.
Jun 21 04:42:05.482231 kubelet[2743]: I0621 04:42:05.482195 2743 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-57bl4\" (UniqueName: \"kubernetes.io/projected/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-kube-api-access-57bl4\") on node \"localhost\" DevicePath \"\""
Jun 21 04:42:05.482231 kubelet[2743]: I0621 04:42:05.482221 2743 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50-calico-apiserver-certs\") on node \"localhost\" DevicePath \"\""
Jun 21 04:42:05.706512 kubelet[2743]: I0621 04:42:05.706389 2743 scope.go:117] "RemoveContainer" containerID="28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe"
Jun 21 04:42:05.708121 containerd[1593]: time="2025-06-21T04:42:05.708093094Z" level=info msg="RemoveContainer for \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\""
Jun 21 04:42:05.713267 systemd[1]: Removed slice kubepods-besteffort-podba7afd7e_002b_4c26_9fd9_a17eb66b7f50.slice - libcontainer container kubepods-besteffort-podba7afd7e_002b_4c26_9fd9_a17eb66b7f50.slice.
Jun 21 04:42:05.713390 systemd[1]: kubepods-besteffort-podba7afd7e_002b_4c26_9fd9_a17eb66b7f50.slice: Consumed 1.287s CPU time, 52.4M memory peak, 1M read from disk.
Jun 21 04:42:05.789573 containerd[1593]: time="2025-06-21T04:42:05.789531698Z" level=info msg="RemoveContainer for \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" returns successfully"
Jun 21 04:42:05.814079 kubelet[2743]: I0621 04:42:05.814031 2743 scope.go:117] "RemoveContainer" containerID="28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe"
Jun 21 04:42:05.816962 containerd[1593]: time="2025-06-21T04:42:05.814347462Z" level=error msg="ContainerStatus for \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\": not found"
Jun 21 04:42:05.820068 kubelet[2743]: E0621 04:42:05.820042 2743 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\": not found" containerID="28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe"
Jun 21 04:42:05.820120 kubelet[2743]: I0621 04:42:05.820071 2743 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe"} err="failed to get container status \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\": rpc error: code = NotFound desc = an error occurred when try to find container \"28d17d87c39425ade07af9ac9cbcfc433b45e7dbe25812553f801864d08fcbbe\": not found"
Jun 21 04:42:06.443601 kubelet[2743]: I0621 04:42:06.443551 2743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7afd7e-002b-4c26-9fd9-a17eb66b7f50" path="/var/lib/kubelet/pods/ba7afd7e-002b-4c26-9fd9-a17eb66b7f50/volumes"
Jun 21 04:42:07.979748 systemd[1]: Started sshd@21-10.0.0.63:22-10.0.0.1:39716.service - OpenSSH per-connection server daemon (10.0.0.1:39716).
Jun 21 04:42:08.031079 sshd[5908]: Accepted publickey for core from 10.0.0.1 port 39716 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c
Jun 21 04:42:08.032825 sshd-session[5908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jun 21 04:42:08.037204 systemd-logind[1566]: New session 22 of user core.
Jun 21 04:42:08.044835 systemd[1]: Started session-22.scope - Session 22 of User core.
Jun 21 04:42:08.162678 sshd[5910]: Connection closed by 10.0.0.1 port 39716
Jun 21 04:42:08.163092 sshd-session[5908]: pam_unix(sshd:session): session closed for user core
Jun 21 04:42:08.168290 systemd[1]: sshd@21-10.0.0.63:22-10.0.0.1:39716.service: Deactivated successfully.
Jun 21 04:42:08.170481 systemd[1]: session-22.scope: Deactivated successfully.
Jun 21 04:42:08.171561 systemd-logind[1566]: Session 22 logged out. Waiting for processes to exit.
Jun 21 04:42:08.173793 systemd-logind[1566]: Removed session 22.
Jun 21 04:42:08.674085 containerd[1593]: time="2025-06-21T04:42:08.674034378Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0e738ae249ced5a21790ad4bc36c6ee23bc4f688be03954044be79c7a86f2d63\" id:\"27dfd49acab9ec95acc3a97f1c2d585f688b922728339c29fad5a29a6e81a49b\" pid:5934 exited_at:{seconds:1750480928 nanos:673676324}"
Jun 21 04:42:13.184029 systemd[1]: Started sshd@22-10.0.0.63:22-10.0.0.1:39730.service - OpenSSH per-connection server daemon (10.0.0.1:39730).
Jun 21 04:42:13.230206 sshd[5947]: Accepted publickey for core from 10.0.0.1 port 39730 ssh2: RSA SHA256:015yC5fRvb07MyWOgrdDHnl6DLRQb6q1XcuQXpFRy7c
Jun 21 04:42:13.231602 sshd-session[5947]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jun 21 04:42:13.236123 systemd-logind[1566]: New session 23 of user core.
Jun 21 04:42:13.245874 systemd[1]: Started session-23.scope - Session 23 of User core.
Jun 21 04:42:13.366350 sshd[5949]: Connection closed by 10.0.0.1 port 39730
Jun 21 04:42:13.366663 sshd-session[5947]: pam_unix(sshd:session): session closed for user core
Jun 21 04:42:13.370554 systemd[1]: sshd@22-10.0.0.63:22-10.0.0.1:39730.service: Deactivated successfully.
Jun 21 04:42:13.372390 systemd[1]: session-23.scope: Deactivated successfully.
Jun 21 04:42:13.373121 systemd-logind[1566]: Session 23 logged out. Waiting for processes to exit.
Jun 21 04:42:13.374593 systemd-logind[1566]: Removed session 23.
Jun 21 04:42:13.441004 kubelet[2743]: E0621 04:42:13.440871 2743 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"