Jul 6 23:21:02.848300 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jul 6 23:21:02.848322 kernel: Linux version 6.12.35-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Sun Jul 6 21:52:18 -00 2025 Jul 6 23:21:02.848332 kernel: KASLR enabled Jul 6 23:21:02.848338 kernel: efi: EFI v2.7 by EDK II Jul 6 23:21:02.848344 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdb228018 ACPI 2.0=0xdb9b8018 RNG=0xdb9b8a18 MEMRESERVE=0xdb21fd18 Jul 6 23:21:02.848349 kernel: random: crng init done Jul 6 23:21:02.848356 kernel: Kernel is locked down from EFI Secure Boot; see man kernel_lockdown.7 Jul 6 23:21:02.848362 kernel: secureboot: Secure boot enabled Jul 6 23:21:02.848368 kernel: ACPI: Early table checksum verification disabled Jul 6 23:21:02.848376 kernel: ACPI: RSDP 0x00000000DB9B8018 000024 (v02 BOCHS ) Jul 6 23:21:02.848382 kernel: ACPI: XSDT 0x00000000DB9B8F18 000064 (v01 BOCHS BXPC 00000001 01000013) Jul 6 23:21:02.848388 kernel: ACPI: FACP 0x00000000DB9B8B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848403 kernel: ACPI: DSDT 0x00000000DB904018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848409 kernel: ACPI: APIC 0x00000000DB9B8C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848417 kernel: ACPI: PPTT 0x00000000DB9B8098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848425 kernel: ACPI: GTDT 0x00000000DB9B8818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848431 kernel: ACPI: MCFG 0x00000000DB9B8A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848438 kernel: ACPI: SPCR 0x00000000DB9B8918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848444 kernel: ACPI: DBG2 0x00000000DB9B8998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848450 kernel: ACPI: IORT 0x00000000DB9B8198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:21:02.848456 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Jul 6 23:21:02.848462 kernel: ACPI: Use ACPI SPCR as default console: Yes Jul 6 23:21:02.848469 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Jul 6 23:21:02.848475 kernel: NODE_DATA(0) allocated [mem 0xdc737dc0-0xdc73efff] Jul 6 23:21:02.848481 kernel: Zone ranges: Jul 6 23:21:02.848489 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Jul 6 23:21:02.848495 kernel: DMA32 empty Jul 6 23:21:02.848501 kernel: Normal empty Jul 6 23:21:02.848507 kernel: Device empty Jul 6 23:21:02.848513 kernel: Movable zone start for each node Jul 6 23:21:02.848519 kernel: Early memory node ranges Jul 6 23:21:02.848525 kernel: node 0: [mem 0x0000000040000000-0x00000000dbb4ffff] Jul 6 23:21:02.848531 kernel: node 0: [mem 0x00000000dbb50000-0x00000000dbe7ffff] Jul 6 23:21:02.848538 kernel: node 0: [mem 0x00000000dbe80000-0x00000000dbe9ffff] Jul 6 23:21:02.848544 kernel: node 0: [mem 0x00000000dbea0000-0x00000000dbedffff] Jul 6 23:21:02.848550 kernel: node 0: [mem 0x00000000dbee0000-0x00000000dbf1ffff] Jul 6 23:21:02.848556 kernel: node 0: [mem 0x00000000dbf20000-0x00000000dbf6ffff] Jul 6 23:21:02.848564 kernel: node 0: [mem 0x00000000dbf70000-0x00000000dcbfffff] Jul 6 23:21:02.848570 kernel: node 0: [mem 0x00000000dcc00000-0x00000000dcfdffff] Jul 6 23:21:02.848577 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Jul 6 23:21:02.848586 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] Jul 6 
23:21:02.848593 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Jul 6 23:21:02.848599 kernel: psci: probing for conduit method from ACPI. Jul 6 23:21:02.848606 kernel: psci: PSCIv1.1 detected in firmware. Jul 6 23:21:02.848614 kernel: psci: Using standard PSCI v0.2 function IDs Jul 6 23:21:02.848621 kernel: psci: Trusted OS migration not required Jul 6 23:21:02.848627 kernel: psci: SMC Calling Convention v1.1 Jul 6 23:21:02.848634 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jul 6 23:21:02.848641 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Jul 6 23:21:02.848647 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Jul 6 23:21:02.848654 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Jul 6 23:21:02.848661 kernel: Detected PIPT I-cache on CPU0 Jul 6 23:21:02.848668 kernel: CPU features: detected: GIC system register CPU interface Jul 6 23:21:02.848675 kernel: CPU features: detected: Spectre-v4 Jul 6 23:21:02.848682 kernel: CPU features: detected: Spectre-BHB Jul 6 23:21:02.848689 kernel: CPU features: kernel page table isolation forced ON by KASLR Jul 6 23:21:02.848695 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jul 6 23:21:02.848702 kernel: CPU features: detected: ARM erratum 1418040 Jul 6 23:21:02.848708 kernel: CPU features: detected: SSBS not fully self-synchronizing Jul 6 23:21:02.848715 kernel: alternatives: applying boot alternatives Jul 6 23:21:02.848723 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=dd2d39de40482a23e9bb75390ff5ca85cd9bd34d902b8049121a8373f8cb2ef2 Jul 6 23:21:02.848730 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jul 6 23:21:02.848737 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jul 6 23:21:02.848744 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jul 6 23:21:02.848752 kernel: Fallback order for Node 0: 0 Jul 6 23:21:02.848759 kernel: Built 1 zonelists, mobility grouping on. Total pages: 643072 Jul 6 23:21:02.848766 kernel: Policy zone: DMA Jul 6 23:21:02.848773 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jul 6 23:21:02.848779 kernel: software IO TLB: SWIOTLB bounce buffer size adjusted to 2MB Jul 6 23:21:02.848786 kernel: software IO TLB: area num 4. Jul 6 23:21:02.848792 kernel: software IO TLB: SWIOTLB bounce buffer size roundup to 4MB Jul 6 23:21:02.848799 kernel: software IO TLB: mapped [mem 0x00000000db504000-0x00000000db904000] (4MB) Jul 6 23:21:02.848806 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Jul 6 23:21:02.848816 kernel: rcu: Preemptible hierarchical RCU implementation. Jul 6 23:21:02.848824 kernel: rcu: RCU event tracing is enabled. Jul 6 23:21:02.848830 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Jul 6 23:21:02.848840 kernel: Trampoline variant of Tasks RCU enabled. Jul 6 23:21:02.848849 kernel: Tracing variant of Tasks RCU enabled. Jul 6 23:21:02.848862 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jul 6 23:21:02.848870 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Jul 6 23:21:02.848877 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Jul 6 23:21:02.848884 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Jul 6 23:21:02.848891 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jul 6 23:21:02.848898 kernel: GICv3: 256 SPIs implemented Jul 6 23:21:02.848905 kernel: GICv3: 0 Extended SPIs implemented Jul 6 23:21:02.848912 kernel: Root IRQ handler: gic_handle_irq Jul 6 23:21:02.848918 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jul 6 23:21:02.848926 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Jul 6 23:21:02.848933 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jul 6 23:21:02.848940 kernel: ITS [mem 0x08080000-0x0809ffff] Jul 6 23:21:02.848947 kernel: ITS@0x0000000008080000: allocated 8192 Devices @40110000 (indirect, esz 8, psz 64K, shr 1) Jul 6 23:21:02.848953 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @40120000 (flat, esz 8, psz 64K, shr 1) Jul 6 23:21:02.848960 kernel: GICv3: using LPI property table @0x0000000040130000 Jul 6 23:21:02.848967 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040140000 Jul 6 23:21:02.848974 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jul 6 23:21:02.848980 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jul 6 23:21:02.848987 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jul 6 23:21:02.848994 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jul 6 23:21:02.849001 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jul 6 23:21:02.849010 kernel: arm-pv: using stolen time PV Jul 6 23:21:02.849027 kernel: Console: colour dummy device 80x25 Jul 6 23:21:02.849035 kernel: ACPI: Core revision 20240827 Jul 6 23:21:02.849042 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jul 6 23:21:02.849049 kernel: pid_max: default: 32768 minimum: 301 Jul 6 23:21:02.849055 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jul 6 23:21:02.849062 kernel: landlock: Up and running. Jul 6 23:21:02.849069 kernel: SELinux: Initializing. Jul 6 23:21:02.849076 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jul 6 23:21:02.849088 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jul 6 23:21:02.849095 kernel: rcu: Hierarchical SRCU implementation. Jul 6 23:21:02.849102 kernel: rcu: Max phase no-delay instances is 400. Jul 6 23:21:02.849109 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jul 6 23:21:02.849116 kernel: Remapping and enabling EFI services. Jul 6 23:21:02.849123 kernel: smp: Bringing up secondary CPUs ... 
Jul 6 23:21:02.849130 kernel: Detected PIPT I-cache on CPU1 Jul 6 23:21:02.849137 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jul 6 23:21:02.849144 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040150000 Jul 6 23:21:02.849153 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jul 6 23:21:02.849166 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jul 6 23:21:02.849173 kernel: Detected PIPT I-cache on CPU2 Jul 6 23:21:02.849183 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Jul 6 23:21:02.849190 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040160000 Jul 6 23:21:02.849198 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jul 6 23:21:02.849205 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Jul 6 23:21:02.849212 kernel: Detected PIPT I-cache on CPU3 Jul 6 23:21:02.849220 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Jul 6 23:21:02.849230 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040170000 Jul 6 23:21:02.849237 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jul 6 23:21:02.849244 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Jul 6 23:21:02.849251 kernel: smp: Brought up 1 node, 4 CPUs Jul 6 23:21:02.849258 kernel: SMP: Total of 4 processors activated. Jul 6 23:21:02.849265 kernel: CPU: All CPU(s) started at EL1 Jul 6 23:21:02.849272 kernel: CPU features: detected: 32-bit EL0 Support Jul 6 23:21:02.849279 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jul 6 23:21:02.849287 kernel: CPU features: detected: Common not Private translations Jul 6 23:21:02.849299 kernel: CPU features: detected: CRC32 instructions Jul 6 23:21:02.849307 kernel: CPU features: detected: Enhanced Virtualization Traps Jul 6 23:21:02.849314 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jul 6 23:21:02.849322 kernel: CPU features: detected: LSE atomic instructions Jul 6 23:21:02.849331 kernel: CPU features: detected: Privileged Access Never Jul 6 23:21:02.849338 kernel: CPU features: detected: RAS Extension Support Jul 6 23:21:02.849346 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jul 6 23:21:02.849353 kernel: alternatives: applying system-wide alternatives Jul 6 23:21:02.849361 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3 Jul 6 23:21:02.849371 kernel: Memory: 2438448K/2572288K available (11072K kernel code, 2428K rwdata, 9032K rodata, 39424K init, 1035K bss, 127892K reserved, 0K cma-reserved) Jul 6 23:21:02.849378 kernel: devtmpfs: initialized Jul 6 23:21:02.849385 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jul 6 23:21:02.849397 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Jul 6 23:21:02.849404 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jul 6 23:21:02.849412 kernel: 0 pages in range for non-PLT usage Jul 6 23:21:02.849419 kernel: 508480 pages in range for PLT usage Jul 6 23:21:02.849426 kernel: pinctrl core: initialized pinctrl subsystem Jul 6 23:21:02.849433 kernel: SMBIOS 3.0.0 present. 
Jul 6 23:21:02.849443 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 Jul 6 23:21:02.849450 kernel: DMI: Memory slots populated: 1/1 Jul 6 23:21:02.849457 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jul 6 23:21:02.849464 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Jul 6 23:21:02.849471 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jul 6 23:21:02.849478 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jul 6 23:21:02.849485 kernel: audit: initializing netlink subsys (disabled) Jul 6 23:21:02.849493 kernel: audit: type=2000 audit(0.050:1): state=initialized audit_enabled=0 res=1 Jul 6 23:21:02.849501 kernel: thermal_sys: Registered thermal governor 'step_wise' Jul 6 23:21:02.849509 kernel: cpuidle: using governor menu Jul 6 23:21:02.849516 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Jul 6 23:21:02.849524 kernel: ASID allocator initialised with 32768 entries Jul 6 23:21:02.849531 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jul 6 23:21:02.849538 kernel: Serial: AMBA PL011 UART driver Jul 6 23:21:02.849545 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jul 6 23:21:02.849552 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jul 6 23:21:02.849559 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jul 6 23:21:02.849568 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jul 6 23:21:02.849575 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jul 6 23:21:02.849582 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jul 6 23:21:02.849589 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jul 6 23:21:02.849596 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jul 6 23:21:02.849603 kernel: ACPI: Added _OSI(Module Device) Jul 6 23:21:02.849610 kernel: ACPI: Added _OSI(Processor Device) Jul 6 23:21:02.849617 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jul 6 23:21:02.849624 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jul 6 23:21:02.849633 kernel: ACPI: Interpreter enabled Jul 6 23:21:02.849640 kernel: ACPI: Using GIC for interrupt routing Jul 6 23:21:02.849647 kernel: ACPI: MCFG table detected, 1 entries Jul 6 23:21:02.849654 kernel: ACPI: CPU0 has been hot-added Jul 6 23:21:02.849662 kernel: ACPI: CPU1 has been hot-added Jul 6 23:21:02.849669 kernel: ACPI: CPU2 has been hot-added Jul 6 23:21:02.849676 kernel: ACPI: CPU3 has been hot-added Jul 6 23:21:02.849683 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jul 6 23:21:02.849690 kernel: printk: legacy console [ttyAMA0] enabled Jul 6 23:21:02.849697 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jul 6 23:21:02.849863 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jul 6 23:21:02.849930 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jul 6 23:21:02.849990 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jul 6 23:21:02.850087 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jul 6 23:21:02.850149 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jul 6 23:21:02.850158 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jul 6 23:21:02.850166 kernel: PCI host bridge to bus 0000:00 Jul 6 
23:21:02.850258 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jul 6 23:21:02.850326 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jul 6 23:21:02.850384 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jul 6 23:21:02.850452 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jul 6 23:21:02.850537 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jul 6 23:21:02.850609 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint Jul 6 23:21:02.850676 kernel: pci 0000:00:01.0: BAR 0 [io 0x0000-0x001f] Jul 6 23:21:02.850749 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff] Jul 6 23:21:02.850811 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jul 6 23:21:02.850871 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jul 6 23:21:02.850933 kernel: pci 0000:00:01.0: BAR 1 [mem 0x10000000-0x10000fff]: assigned Jul 6 23:21:02.850994 kernel: pci 0000:00:01.0: BAR 0 [io 0x1000-0x101f]: assigned Jul 6 23:21:02.851169 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jul 6 23:21:02.851234 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jul 6 23:21:02.851287 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jul 6 23:21:02.851297 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jul 6 23:21:02.851304 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jul 6 23:21:02.851312 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jul 6 23:21:02.851320 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jul 6 23:21:02.851327 kernel: iommu: Default domain type: Translated Jul 6 23:21:02.851334 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jul 6 23:21:02.851346 kernel: efivars: Registered efivars operations Jul 6 23:21:02.851355 kernel: vgaarb: loaded Jul 6 23:21:02.851364 kernel: clocksource: Switched to clocksource arch_sys_counter Jul 6 23:21:02.851372 kernel: VFS: Disk quotas dquot_6.6.0 Jul 6 23:21:02.851379 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 6 23:21:02.851386 kernel: pnp: PnP ACPI init Jul 6 23:21:02.851469 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jul 6 23:21:02.851480 kernel: pnp: PnP ACPI: found 1 devices Jul 6 23:21:02.851488 kernel: NET: Registered PF_INET protocol family Jul 6 23:21:02.851498 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jul 6 23:21:02.851505 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jul 6 23:21:02.851513 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 6 23:21:02.851520 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 6 23:21:02.851527 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jul 6 23:21:02.851535 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jul 6 23:21:02.851543 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 6 23:21:02.851550 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 6 23:21:02.851559 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 6 23:21:02.851566 kernel: PCI: CLS 0 bytes, default 64 Jul 6 23:21:02.851573 kernel: kvm [1]: HYP mode not available Jul 6 23:21:02.851581 kernel: Initialise system 
trusted keyrings Jul 6 23:21:02.851588 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jul 6 23:21:02.851595 kernel: Key type asymmetric registered Jul 6 23:21:02.851602 kernel: Asymmetric key parser 'x509' registered Jul 6 23:21:02.851609 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jul 6 23:21:02.851616 kernel: io scheduler mq-deadline registered Jul 6 23:21:02.851624 kernel: io scheduler kyber registered Jul 6 23:21:02.851633 kernel: io scheduler bfq registered Jul 6 23:21:02.851640 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jul 6 23:21:02.851647 kernel: ACPI: button: Power Button [PWRB] Jul 6 23:21:02.851655 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jul 6 23:21:02.851721 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Jul 6 23:21:02.851731 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 6 23:21:02.851738 kernel: thunder_xcv, ver 1.0 Jul 6 23:21:02.851745 kernel: thunder_bgx, ver 1.0 Jul 6 23:21:02.851752 kernel: nicpf, ver 1.0 Jul 6 23:21:02.851761 kernel: nicvf, ver 1.0 Jul 6 23:21:02.851846 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jul 6 23:21:02.851906 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-07-06T23:21:02 UTC (1751844062) Jul 6 23:21:02.851916 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 6 23:21:02.851923 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jul 6 23:21:02.851931 kernel: NET: Registered PF_INET6 protocol family Jul 6 23:21:02.851938 kernel: watchdog: NMI not fully supported Jul 6 23:21:02.851945 kernel: watchdog: Hard watchdog permanently disabled Jul 6 23:21:02.851955 kernel: Segment Routing with IPv6 Jul 6 23:21:02.851962 kernel: In-situ OAM (IOAM) with IPv6 Jul 6 23:21:02.851969 kernel: NET: Registered PF_PACKET protocol family Jul 6 23:21:02.851976 kernel: Key type dns_resolver registered Jul 6 23:21:02.851983 kernel: registered taskstats version 1 Jul 6 23:21:02.851991 kernel: Loading compiled-in X.509 certificates Jul 6 23:21:02.851998 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.35-flatcar: 90fb300ebe1fa0773739bb35dad461c5679d8dfb' Jul 6 23:21:02.852005 kernel: Demotion targets for Node 0: null Jul 6 23:21:02.852022 kernel: Key type .fscrypt registered Jul 6 23:21:02.852033 kernel: Key type fscrypt-provisioning registered Jul 6 23:21:02.852040 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 6 23:21:02.852047 kernel: ima: Allocated hash algorithm: sha1 Jul 6 23:21:02.852055 kernel: ima: No architecture policies found Jul 6 23:21:02.852062 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jul 6 23:21:02.852069 kernel: clk: Disabling unused clocks Jul 6 23:21:02.852076 kernel: PM: genpd: Disabling unused power domains Jul 6 23:21:02.852083 kernel: Warning: unable to open an initial console. Jul 6 23:21:02.852093 kernel: Freeing unused kernel memory: 39424K Jul 6 23:21:02.852100 kernel: Run /init as init process Jul 6 23:21:02.852108 kernel: with arguments: Jul 6 23:21:02.852115 kernel: /init Jul 6 23:21:02.852122 kernel: with environment: Jul 6 23:21:02.852130 kernel: HOME=/ Jul 6 23:21:02.852139 kernel: TERM=linux Jul 6 23:21:02.852148 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 6 23:21:02.852156 systemd[1]: Successfully made /usr/ read-only. 
Jul 6 23:21:02.852168 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 6 23:21:02.852176 systemd[1]: Detected virtualization kvm. Jul 6 23:21:02.852184 systemd[1]: Detected architecture arm64. Jul 6 23:21:02.852191 systemd[1]: Running in initrd. Jul 6 23:21:02.852199 systemd[1]: No hostname configured, using default hostname. Jul 6 23:21:02.852207 systemd[1]: Hostname set to <localhost>. Jul 6 23:21:02.852215 systemd[1]: Initializing machine ID from VM UUID. Jul 6 23:21:02.852222 systemd[1]: Queued start job for default target initrd.target. Jul 6 23:21:02.852231 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 6 23:21:02.852239 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 6 23:21:02.852247 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 6 23:21:02.852255 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 6 23:21:02.852263 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 6 23:21:02.852272 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 6 23:21:02.852282 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 6 23:21:02.852290 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 6 23:21:02.852298 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 6 23:21:02.852306 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 6 23:21:02.852314 systemd[1]: Reached target paths.target - Path Units. Jul 6 23:21:02.852321 systemd[1]: Reached target slices.target - Slice Units. Jul 6 23:21:02.852329 systemd[1]: Reached target swap.target - Swaps. Jul 6 23:21:02.852337 systemd[1]: Reached target timers.target - Timer Units. Jul 6 23:21:02.852345 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 6 23:21:02.852355 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 6 23:21:02.852362 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 6 23:21:02.852371 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 6 23:21:02.852379 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 6 23:21:02.852386 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 6 23:21:02.852400 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 6 23:21:02.852408 systemd[1]: Reached target sockets.target - Socket Units. Jul 6 23:21:02.852415 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 6 23:21:02.852426 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 6 23:21:02.852434 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jul 6 23:21:02.852442 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 6 23:21:02.852450 systemd[1]: Starting systemd-fsck-usr.service... Jul 6 23:21:02.852458 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 6 23:21:02.852465 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 6 23:21:02.852473 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 6 23:21:02.852481 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 6 23:21:02.852491 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 6 23:21:02.852499 systemd[1]: Finished systemd-fsck-usr.service. Jul 6 23:21:02.852507 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 6 23:21:02.852544 systemd-journald[243]: Collecting audit messages is disabled. Jul 6 23:21:02.852568 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:21:02.852578 systemd-journald[243]: Journal started Jul 6 23:21:02.852598 systemd-journald[243]: Runtime Journal (/run/log/journal/37c53b345e2242cb822cb9cf563df1d3) is 6M, max 48.5M, 42.4M free. Jul 6 23:21:02.845277 systemd-modules-load[245]: Inserted module 'overlay' Jul 6 23:21:02.856696 systemd[1]: Started systemd-journald.service - Journal Service. Jul 6 23:21:02.857175 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 6 23:21:02.861581 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 6 23:21:02.863356 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 6 23:21:02.866737 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 6 23:21:02.871481 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 6 23:21:02.871925 systemd-modules-load[245]: Inserted module 'br_netfilter' Jul 6 23:21:02.872636 kernel: Bridge firewalling registered Jul 6 23:21:02.873253 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 6 23:21:02.875166 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 6 23:21:02.880133 systemd-tmpfiles[265]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 6 23:21:02.881598 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 6 23:21:02.883334 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 6 23:21:02.888761 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 6 23:21:02.891368 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 6 23:21:02.893090 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 6 23:21:02.905854 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Jul 6 23:21:02.922541 dracut-cmdline[294]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=dd2d39de40482a23e9bb75390ff5ca85cd9bd34d902b8049121a8373f8cb2ef2 Jul 6 23:21:02.933431 systemd-resolved[291]: Positive Trust Anchors: Jul 6 23:21:02.933452 systemd-resolved[291]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 6 23:21:02.933483 systemd-resolved[291]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 6 23:21:02.940309 systemd-resolved[291]: Defaulting to hostname 'linux'. Jul 6 23:21:02.941363 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 6 23:21:02.942222 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 6 23:21:03.010046 kernel: SCSI subsystem initialized Jul 6 23:21:03.015029 kernel: Loading iSCSI transport class v2.0-870. Jul 6 23:21:03.023046 kernel: iscsi: registered transport (tcp) Jul 6 23:21:03.038042 kernel: iscsi: registered transport (qla4xxx) Jul 6 23:21:03.038107 kernel: QLogic iSCSI HBA Driver Jul 6 23:21:03.057472 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 6 23:21:03.080649 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 6 23:21:03.082184 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 6 23:21:03.133737 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 6 23:21:03.136001 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jul 6 23:21:03.203049 kernel: raid6: neonx8 gen() 15741 MB/s Jul 6 23:21:03.220031 kernel: raid6: neonx4 gen() 15729 MB/s Jul 6 23:21:03.237029 kernel: raid6: neonx2 gen() 13166 MB/s Jul 6 23:21:03.254026 kernel: raid6: neonx1 gen() 10437 MB/s Jul 6 23:21:03.271032 kernel: raid6: int64x8 gen() 6842 MB/s Jul 6 23:21:03.288033 kernel: raid6: int64x4 gen() 7328 MB/s Jul 6 23:21:03.305032 kernel: raid6: int64x2 gen() 6079 MB/s Jul 6 23:21:03.322027 kernel: raid6: int64x1 gen() 4917 MB/s Jul 6 23:21:03.322048 kernel: raid6: using algorithm neonx8 gen() 15741 MB/s Jul 6 23:21:03.339038 kernel: raid6: .... xor() 11890 MB/s, rmw enabled Jul 6 23:21:03.339065 kernel: raid6: using neon recovery algorithm Jul 6 23:21:03.345036 kernel: xor: measuring software checksum speed Jul 6 23:21:03.345068 kernel: 8regs : 20870 MB/sec Jul 6 23:21:03.345080 kernel: 32regs : 19585 MB/sec Jul 6 23:21:03.346415 kernel: arm64_neon : 28128 MB/sec Jul 6 23:21:03.346432 kernel: xor: using function: arm64_neon (28128 MB/sec) Jul 6 23:21:03.413050 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 6 23:21:03.421086 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Jul 6 23:21:03.423753 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 6 23:21:03.451607 systemd-udevd[502]: Using default interface naming scheme 'v255'. Jul 6 23:21:03.456149 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 6 23:21:03.458102 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 6 23:21:03.499572 dracut-pre-trigger[509]: rd.md=0: removing MD RAID activation Jul 6 23:21:03.526203 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 6 23:21:03.528475 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 6 23:21:03.585054 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 6 23:21:03.587881 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 6 23:21:03.637157 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Jul 6 23:21:03.638063 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Jul 6 23:21:03.646427 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 6 23:21:03.646496 kernel: GPT:9289727 != 19775487 Jul 6 23:21:03.646507 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 6 23:21:03.646516 kernel: GPT:9289727 != 19775487 Jul 6 23:21:03.646314 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 6 23:21:03.650145 kernel: GPT: Use GNU Parted to correct GPT errors. Jul 6 23:21:03.650169 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jul 6 23:21:03.646473 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:21:03.649630 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 6 23:21:03.651724 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 6 23:21:03.678960 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:21:03.699967 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Jul 6 23:21:03.705942 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 6 23:21:03.713595 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Jul 6 23:21:03.722487 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jul 6 23:21:03.728895 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Jul 6 23:21:03.729924 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Jul 6 23:21:03.732150 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 6 23:21:03.734383 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 6 23:21:03.735951 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 6 23:21:03.738444 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 6 23:21:03.740329 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 6 23:21:03.757891 disk-uuid[595]: Primary Header is updated. Jul 6 23:21:03.757891 disk-uuid[595]: Secondary Entries is updated. Jul 6 23:21:03.757891 disk-uuid[595]: Secondary Header is updated. 
Jul 6 23:21:03.764053 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jul 6 23:21:03.769116 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jul 6 23:21:04.782040 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Jul 6 23:21:04.782431 disk-uuid[598]: The operation has completed successfully. Jul 6 23:21:04.816870 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 6 23:21:04.817028 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 6 23:21:04.838666 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 6 23:21:04.864683 sh[615]: Success Jul 6 23:21:04.878286 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 6 23:21:04.878331 kernel: device-mapper: uevent: version 1.0.3 Jul 6 23:21:04.879147 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 6 23:21:04.891059 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jul 6 23:21:04.921615 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jul 6 23:21:04.924689 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 6 23:21:04.942962 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jul 6 23:21:04.948544 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 6 23:21:04.948580 kernel: BTRFS: device fsid aa7ffdf7-f152-4ceb-bd0e-b3b3f8f8b296 devid 1 transid 38 /dev/mapper/usr (253:0) scanned by mount (627) Jul 6 23:21:04.949647 kernel: BTRFS info (device dm-0): first mount of filesystem aa7ffdf7-f152-4ceb-bd0e-b3b3f8f8b296 Jul 6 23:21:04.949697 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:21:04.950323 kernel: BTRFS info (device dm-0): using free-space-tree Jul 6 23:21:04.955225 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 6 23:21:04.956448 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 6 23:21:04.957464 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 6 23:21:04.958343 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 6 23:21:04.960953 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 6 23:21:04.985046 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 (254:6) scanned by mount (657) Jul 6 23:21:04.987286 kernel: BTRFS info (device vda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:21:04.987348 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:21:04.987367 kernel: BTRFS info (device vda6): using free-space-tree Jul 6 23:21:04.994045 kernel: BTRFS info (device vda6): last unmount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:21:04.995792 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 6 23:21:04.997903 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jul 6 23:21:05.081812 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 6 23:21:05.086666 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Jul 6 23:21:05.130626 systemd-networkd[804]: lo: Link UP Jul 6 23:21:05.130639 systemd-networkd[804]: lo: Gained carrier Jul 6 23:21:05.131402 systemd-networkd[804]: Enumeration completed Jul 6 23:21:05.131922 systemd-networkd[804]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:21:05.131926 systemd-networkd[804]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 6 23:21:05.133247 systemd-networkd[804]: eth0: Link UP Jul 6 23:21:05.133251 systemd-networkd[804]: eth0: Gained carrier Jul 6 23:21:05.133262 systemd-networkd[804]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:21:05.133268 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 6 23:21:05.134478 systemd[1]: Reached target network.target - Network. Jul 6 23:21:05.152096 systemd-networkd[804]: eth0: DHCPv4 address 10.0.0.40/16, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 6 23:21:05.160911 ignition[701]: Ignition 2.21.0 Jul 6 23:21:05.160924 ignition[701]: Stage: fetch-offline Jul 6 23:21:05.160964 ignition[701]: no configs at "/usr/lib/ignition/base.d" Jul 6 23:21:05.160972 ignition[701]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jul 6 23:21:05.161180 ignition[701]: parsed url from cmdline: "" Jul 6 23:21:05.161184 ignition[701]: no config URL provided Jul 6 23:21:05.161189 ignition[701]: reading system config file "/usr/lib/ignition/user.ign" Jul 6 23:21:05.161196 ignition[701]: no config at "/usr/lib/ignition/user.ign" Jul 6 23:21:05.161218 ignition[701]: op(1): [started] loading QEMU firmware config module Jul 6 23:21:05.161223 ignition[701]: op(1): executing: "modprobe" "qemu_fw_cfg" Jul 6 23:21:05.170627 ignition[701]: op(1): [finished] loading QEMU firmware config module Jul 6 23:21:05.211399 ignition[701]: parsing config with SHA512: b79e8fc52bddd4af2677a1cdfc1aa93d29a8929bc9b8a0da90262a81cadbf1a66359fe8119ad0b0d538ed8ed80cdfc8d6ebba63d4594b03690277f1b43eb8231 Jul 6 23:21:05.217289 unknown[701]: fetched base config from "system" Jul 6 23:21:05.217302 unknown[701]: fetched user config from "qemu" Jul 6 23:21:05.217691 ignition[701]: fetch-offline: fetch-offline passed Jul 6 23:21:05.217756 ignition[701]: Ignition finished successfully Jul 6 23:21:05.219949 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jul 6 23:21:05.221247 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Jul 6 23:21:05.222121 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jul 6 23:21:05.254808 ignition[817]: Ignition 2.21.0 Jul 6 23:21:05.254828 ignition[817]: Stage: kargs Jul 6 23:21:05.254980 ignition[817]: no configs at "/usr/lib/ignition/base.d" Jul 6 23:21:05.254989 ignition[817]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jul 6 23:21:05.259973 ignition[817]: kargs: kargs passed Jul 6 23:21:05.260069 ignition[817]: Ignition finished successfully Jul 6 23:21:05.263178 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jul 6 23:21:05.265052 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
Jul 6 23:21:05.297496 ignition[826]: Ignition 2.21.0 Jul 6 23:21:05.297513 ignition[826]: Stage: disks Jul 6 23:21:05.297660 ignition[826]: no configs at "/usr/lib/ignition/base.d" Jul 6 23:21:05.297669 ignition[826]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jul 6 23:21:05.299441 ignition[826]: disks: disks passed Jul 6 23:21:05.301190 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jul 6 23:21:05.299518 ignition[826]: Ignition finished successfully Jul 6 23:21:05.302366 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jul 6 23:21:05.303478 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jul 6 23:21:05.304809 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 6 23:21:05.306152 systemd[1]: Reached target sysinit.target - System Initialization. Jul 6 23:21:05.307593 systemd[1]: Reached target basic.target - Basic System. Jul 6 23:21:05.309823 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jul 6 23:21:05.346932 systemd-fsck[835]: ROOT: clean, 15/553520 files, 52789/553472 blocks Jul 6 23:21:05.391312 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jul 6 23:21:05.394827 systemd[1]: Mounting sysroot.mount - /sysroot... Jul 6 23:21:05.463052 kernel: EXT4-fs (vda9): mounted filesystem a6b10247-fbe6-4a25-95d9-ddd4b58604ec r/w with ordered data mode. Quota mode: none. Jul 6 23:21:05.464128 systemd[1]: Mounted sysroot.mount - /sysroot. Jul 6 23:21:05.465406 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jul 6 23:21:05.468333 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 6 23:21:05.470774 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jul 6 23:21:05.471620 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jul 6 23:21:05.471725 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jul 6 23:21:05.471753 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jul 6 23:21:05.484328 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jul 6 23:21:05.487598 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jul 6 23:21:05.491990 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 (254:6) scanned by mount (843) Jul 6 23:21:05.492048 kernel: BTRFS info (device vda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:21:05.492059 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:21:05.492112 kernel: BTRFS info (device vda6): using free-space-tree Jul 6 23:21:05.495860 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 6 23:21:05.543795 initrd-setup-root[867]: cut: /sysroot/etc/passwd: No such file or directory Jul 6 23:21:05.548669 initrd-setup-root[874]: cut: /sysroot/etc/group: No such file or directory Jul 6 23:21:05.552499 initrd-setup-root[881]: cut: /sysroot/etc/shadow: No such file or directory Jul 6 23:21:05.556540 initrd-setup-root[888]: cut: /sysroot/etc/gshadow: No such file or directory Jul 6 23:21:05.634715 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jul 6 23:21:05.636607 systemd[1]: Starting ignition-mount.service - Ignition (mount)... 
Jul 6 23:21:05.638281 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jul 6 23:21:05.658039 kernel: BTRFS info (device vda6): last unmount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:21:05.678271 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jul 6 23:21:05.692184 ignition[956]: INFO : Ignition 2.21.0 Jul 6 23:21:05.692184 ignition[956]: INFO : Stage: mount Jul 6 23:21:05.693528 ignition[956]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 6 23:21:05.693528 ignition[956]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jul 6 23:21:05.693528 ignition[956]: INFO : mount: mount passed Jul 6 23:21:05.693528 ignition[956]: INFO : Ignition finished successfully Jul 6 23:21:05.695150 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jul 6 23:21:05.697490 systemd[1]: Starting ignition-files.service - Ignition (files)... Jul 6 23:21:05.948025 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jul 6 23:21:05.949639 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 6 23:21:05.986049 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 (254:6) scanned by mount (969) Jul 6 23:21:05.986117 kernel: BTRFS info (device vda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:21:05.987661 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:21:05.988195 kernel: BTRFS info (device vda6): using free-space-tree Jul 6 23:21:05.995618 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 6 23:21:06.027932 ignition[986]: INFO : Ignition 2.21.0 Jul 6 23:21:06.027932 ignition[986]: INFO : Stage: files Jul 6 23:21:06.030142 ignition[986]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 6 23:21:06.030142 ignition[986]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jul 6 23:21:06.030142 ignition[986]: DEBUG : files: compiled without relabeling support, skipping Jul 6 23:21:06.035068 ignition[986]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jul 6 23:21:06.035068 ignition[986]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jul 6 23:21:06.039327 ignition[986]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jul 6 23:21:06.040578 ignition[986]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jul 6 23:21:06.041899 unknown[986]: wrote ssh authorized keys file for user: core Jul 6 23:21:06.042968 ignition[986]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jul 6 23:21:06.045560 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Jul 6 23:21:06.045560 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Jul 6 23:21:06.129786 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jul 6 23:21:06.305830 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Jul 6 23:21:06.305830 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jul 6 23:21:06.308820 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jul 6 23:21:06.308820 ignition[986]: 
INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jul 6 23:21:06.308820 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jul 6 23:21:06.308820 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 6 23:21:06.308820 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 6 23:21:06.308820 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 6 23:21:06.308820 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 6 23:21:06.318867 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jul 6 23:21:06.318867 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jul 6 23:21:06.318867 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 6 23:21:06.318867 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 6 23:21:06.318867 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 6 23:21:06.326577 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1 Jul 6 23:21:06.909028 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jul 6 23:21:07.097297 systemd-networkd[804]: eth0: Gained IPv6LL Jul 6 23:21:07.449091 ignition[986]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw" Jul 6 23:21:07.449091 ignition[986]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jul 6 23:21:07.452981 ignition[986]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 6 23:21:07.493956 ignition[986]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 6 23:21:07.493956 ignition[986]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jul 6 23:21:07.493956 ignition[986]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jul 6 23:21:07.493956 ignition[986]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jul 6 23:21:07.501792 ignition[986]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jul 6 23:21:07.501792 ignition[986]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jul 6 23:21:07.501792 ignition[986]: INFO : files: op(f): [started] setting preset to disabled 
for "coreos-metadata.service" Jul 6 23:21:07.529716 ignition[986]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Jul 6 23:21:07.534089 ignition[986]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Jul 6 23:21:07.536827 ignition[986]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Jul 6 23:21:07.536827 ignition[986]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Jul 6 23:21:07.536827 ignition[986]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Jul 6 23:21:07.536827 ignition[986]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Jul 6 23:21:07.536827 ignition[986]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Jul 6 23:21:07.536827 ignition[986]: INFO : files: files passed Jul 6 23:21:07.536827 ignition[986]: INFO : Ignition finished successfully Jul 6 23:21:07.537443 systemd[1]: Finished ignition-files.service - Ignition (files). Jul 6 23:21:07.540352 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jul 6 23:21:07.542725 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jul 6 23:21:07.567923 systemd[1]: ignition-quench.service: Deactivated successfully. Jul 6 23:21:07.568070 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jul 6 23:21:07.571923 initrd-setup-root-after-ignition[1015]: grep: /sysroot/oem/oem-release: No such file or directory Jul 6 23:21:07.574053 initrd-setup-root-after-ignition[1017]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 6 23:21:07.574053 initrd-setup-root-after-ignition[1017]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jul 6 23:21:07.577094 initrd-setup-root-after-ignition[1021]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 6 23:21:07.578319 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 6 23:21:07.579776 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jul 6 23:21:07.582410 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jul 6 23:21:07.630498 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jul 6 23:21:07.630633 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jul 6 23:21:07.632507 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jul 6 23:21:07.633725 systemd[1]: Reached target initrd.target - Initrd Default Target. Jul 6 23:21:07.635144 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jul 6 23:21:07.636145 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jul 6 23:21:07.654154 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 6 23:21:07.656876 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jul 6 23:21:07.686472 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jul 6 23:21:07.687587 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
Jul 6 23:21:07.688565 systemd[1]: Stopped target timers.target - Timer Units. Jul 6 23:21:07.689999 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jul 6 23:21:07.690144 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 6 23:21:07.692215 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jul 6 23:21:07.693612 systemd[1]: Stopped target basic.target - Basic System. Jul 6 23:21:07.694910 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jul 6 23:21:07.696230 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jul 6 23:21:07.697517 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jul 6 23:21:07.698938 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jul 6 23:21:07.700601 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jul 6 23:21:07.704137 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jul 6 23:21:07.706038 systemd[1]: Stopped target sysinit.target - System Initialization. Jul 6 23:21:07.708737 systemd[1]: Stopped target local-fs.target - Local File Systems. Jul 6 23:21:07.710172 systemd[1]: Stopped target swap.target - Swaps. Jul 6 23:21:07.712682 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jul 6 23:21:07.712829 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jul 6 23:21:07.714921 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jul 6 23:21:07.717472 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 6 23:21:07.719084 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jul 6 23:21:07.720152 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 6 23:21:07.722365 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jul 6 23:21:07.722506 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jul 6 23:21:07.725463 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jul 6 23:21:07.725735 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jul 6 23:21:07.727271 systemd[1]: Stopped target paths.target - Path Units. Jul 6 23:21:07.728674 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jul 6 23:21:07.729965 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 6 23:21:07.730967 systemd[1]: Stopped target slices.target - Slice Units. Jul 6 23:21:07.733043 systemd[1]: Stopped target sockets.target - Socket Units. Jul 6 23:21:07.735306 systemd[1]: iscsid.socket: Deactivated successfully. Jul 6 23:21:07.735408 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jul 6 23:21:07.737098 systemd[1]: iscsiuio.socket: Deactivated successfully. Jul 6 23:21:07.737179 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 6 23:21:07.738336 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jul 6 23:21:07.738475 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 6 23:21:07.739860 systemd[1]: ignition-files.service: Deactivated successfully. Jul 6 23:21:07.739963 systemd[1]: Stopped ignition-files.service - Ignition (files). Jul 6 23:21:07.742934 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... 
Jul 6 23:21:07.743810 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jul 6 23:21:07.744041 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jul 6 23:21:07.760639 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jul 6 23:21:07.761348 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jul 6 23:21:07.761499 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jul 6 23:21:07.763099 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jul 6 23:21:07.763195 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jul 6 23:21:07.772306 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jul 6 23:21:07.772425 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jul 6 23:21:07.779085 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jul 6 23:21:07.786990 ignition[1041]: INFO : Ignition 2.21.0 Jul 6 23:21:07.786990 ignition[1041]: INFO : Stage: umount Jul 6 23:21:07.788550 ignition[1041]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 6 23:21:07.788550 ignition[1041]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Jul 6 23:21:07.791214 ignition[1041]: INFO : umount: umount passed Jul 6 23:21:07.791214 ignition[1041]: INFO : Ignition finished successfully Jul 6 23:21:07.791084 systemd[1]: ignition-mount.service: Deactivated successfully. Jul 6 23:21:07.792102 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jul 6 23:21:07.796384 systemd[1]: Stopped target network.target - Network. Jul 6 23:21:07.800114 systemd[1]: ignition-disks.service: Deactivated successfully. Jul 6 23:21:07.800207 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jul 6 23:21:07.801066 systemd[1]: ignition-kargs.service: Deactivated successfully. Jul 6 23:21:07.801113 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jul 6 23:21:07.803158 systemd[1]: ignition-setup.service: Deactivated successfully. Jul 6 23:21:07.803218 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jul 6 23:21:07.804470 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jul 6 23:21:07.804512 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jul 6 23:21:07.806037 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jul 6 23:21:07.807337 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jul 6 23:21:07.817548 systemd[1]: systemd-resolved.service: Deactivated successfully. Jul 6 23:21:07.817655 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jul 6 23:21:07.821672 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jul 6 23:21:07.825069 systemd[1]: systemd-networkd.service: Deactivated successfully. Jul 6 23:21:07.825991 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jul 6 23:21:07.828847 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jul 6 23:21:07.833796 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jul 6 23:21:07.834857 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jul 6 23:21:07.834929 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jul 6 23:21:07.840642 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jul 6 23:21:07.841779 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. 
Jul 6 23:21:07.841861 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 6 23:21:07.843574 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jul 6 23:21:07.843618 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jul 6 23:21:07.847643 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jul 6 23:21:07.847694 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jul 6 23:21:07.850501 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jul 6 23:21:07.850559 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 6 23:21:07.852727 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 6 23:21:07.855253 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jul 6 23:21:07.855719 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jul 6 23:21:07.856041 systemd[1]: sysroot-boot.service: Deactivated successfully. Jul 6 23:21:07.857170 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jul 6 23:21:07.860545 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jul 6 23:21:07.860710 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jul 6 23:21:07.880943 systemd[1]: systemd-udevd.service: Deactivated successfully. Jul 6 23:21:07.884301 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 6 23:21:07.885653 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jul 6 23:21:07.885692 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jul 6 23:21:07.898437 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jul 6 23:21:07.898494 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jul 6 23:21:07.899929 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jul 6 23:21:07.899982 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jul 6 23:21:07.906637 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jul 6 23:21:07.906715 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jul 6 23:21:07.908615 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jul 6 23:21:07.908668 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 6 23:21:07.911712 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jul 6 23:21:07.912545 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jul 6 23:21:07.912605 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jul 6 23:21:07.916466 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jul 6 23:21:07.916517 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 6 23:21:07.919362 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 6 23:21:07.919425 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:21:07.926385 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Jul 6 23:21:07.926459 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. 
Jul 6 23:21:07.926494 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 6 23:21:07.926759 systemd[1]: network-cleanup.service: Deactivated successfully. Jul 6 23:21:07.932170 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jul 6 23:21:07.938145 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jul 6 23:21:07.939105 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jul 6 23:21:07.942403 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jul 6 23:21:07.944484 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jul 6 23:21:07.976088 systemd[1]: Switching root. Jul 6 23:21:08.016341 systemd-journald[243]: Journal stopped Jul 6 23:21:08.914528 systemd-journald[243]: Received SIGTERM from PID 1 (systemd). Jul 6 23:21:08.914588 kernel: SELinux: policy capability network_peer_controls=1 Jul 6 23:21:08.914600 kernel: SELinux: policy capability open_perms=1 Jul 6 23:21:08.914610 kernel: SELinux: policy capability extended_socket_class=1 Jul 6 23:21:08.914619 kernel: SELinux: policy capability always_check_network=0 Jul 6 23:21:08.914633 kernel: SELinux: policy capability cgroup_seclabel=1 Jul 6 23:21:08.914643 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jul 6 23:21:08.914652 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jul 6 23:21:08.914664 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jul 6 23:21:08.914675 kernel: SELinux: policy capability userspace_initial_context=0 Jul 6 23:21:08.914686 kernel: audit: type=1403 audit(1751844068.197:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jul 6 23:21:08.914701 systemd[1]: Successfully loaded SELinux policy in 45.722ms. Jul 6 23:21:08.914721 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.659ms. Jul 6 23:21:08.914733 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 6 23:21:08.914743 systemd[1]: Detected virtualization kvm. Jul 6 23:21:08.914754 systemd[1]: Detected architecture arm64. Jul 6 23:21:08.914765 systemd[1]: Detected first boot. Jul 6 23:21:08.914775 systemd[1]: Initializing machine ID from VM UUID. Jul 6 23:21:08.914785 kernel: NET: Registered PF_VSOCK protocol family Jul 6 23:21:08.914795 zram_generator::config[1086]: No configuration found. Jul 6 23:21:08.914806 systemd[1]: Populated /etc with preset unit settings. Jul 6 23:21:08.914817 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jul 6 23:21:08.914827 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jul 6 23:21:08.914839 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jul 6 23:21:08.914850 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jul 6 23:21:08.914861 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jul 6 23:21:08.914872 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jul 6 23:21:08.914883 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jul 6 23:21:08.914893 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. 
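The "SELinux: policy capability ..." lines list the optional policy features enabled by the policy that was just loaded. A small sketch, assuming selinuxfs is mounted at /sys/fs/selinux, that reads the same capability flags back at runtime:

# Sketch: read SELinux policy capabilities (assumes selinuxfs mounted at /sys/fs/selinux).
# Each file under policy_capabilities/ holds "0" or "1", matching the kernel lines above.
from pathlib import Path

cap_dir = Path("/sys/fs/selinux/policy_capabilities")
if cap_dir.is_dir():
    for cap in sorted(cap_dir.iterdir()):
        print(f"{cap.name}={cap.read_text().strip()}")
else:
    print("selinuxfs not mounted or SELinux disabled")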
Jul 6 23:21:08.914904 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jul 6 23:21:08.914914 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jul 6 23:21:08.914924 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jul 6 23:21:08.914936 systemd[1]: Created slice user.slice - User and Session Slice. Jul 6 23:21:08.914946 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 6 23:21:08.914956 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 6 23:21:08.914967 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jul 6 23:21:08.914977 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jul 6 23:21:08.914987 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jul 6 23:21:08.914998 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 6 23:21:08.915008 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jul 6 23:21:08.915030 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 6 23:21:08.915045 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 6 23:21:08.915055 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jul 6 23:21:08.915065 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jul 6 23:21:08.915075 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jul 6 23:21:08.915085 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jul 6 23:21:08.915095 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 6 23:21:08.915107 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 6 23:21:08.915118 systemd[1]: Reached target slices.target - Slice Units. Jul 6 23:21:08.915130 systemd[1]: Reached target swap.target - Swaps. Jul 6 23:21:08.915140 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jul 6 23:21:08.915150 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jul 6 23:21:08.915160 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jul 6 23:21:08.915171 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 6 23:21:08.915181 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 6 23:21:08.915192 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 6 23:21:08.915202 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jul 6 23:21:08.915212 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jul 6 23:21:08.915223 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jul 6 23:21:08.915234 systemd[1]: Mounting media.mount - External Media Directory... Jul 6 23:21:08.915244 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jul 6 23:21:08.915254 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jul 6 23:21:08.915264 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
Jul 6 23:21:08.915274 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jul 6 23:21:08.915285 systemd[1]: Reached target machines.target - Containers. Jul 6 23:21:08.915299 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jul 6 23:21:08.915310 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:21:08.915321 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 6 23:21:08.915333 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jul 6 23:21:08.915343 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 6 23:21:08.915354 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 6 23:21:08.915365 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 6 23:21:08.915381 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jul 6 23:21:08.915393 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 6 23:21:08.915404 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jul 6 23:21:08.915416 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jul 6 23:21:08.915431 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jul 6 23:21:08.915441 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jul 6 23:21:08.915451 systemd[1]: Stopped systemd-fsck-usr.service. Jul 6 23:21:08.915462 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:21:08.915472 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 6 23:21:08.915482 kernel: loop: module loaded Jul 6 23:21:08.915492 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 6 23:21:08.915501 kernel: fuse: init (API version 7.41) Jul 6 23:21:08.915512 kernel: ACPI: bus type drm_connector registered Jul 6 23:21:08.915522 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 6 23:21:08.915533 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jul 6 23:21:08.915544 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jul 6 23:21:08.915554 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 6 23:21:08.915567 systemd[1]: verity-setup.service: Deactivated successfully. Jul 6 23:21:08.915577 systemd[1]: Stopped verity-setup.service. Jul 6 23:21:08.915587 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jul 6 23:21:08.915597 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jul 6 23:21:08.915633 systemd-journald[1147]: Collecting audit messages is disabled. Jul 6 23:21:08.915655 systemd[1]: Mounted media.mount - External Media Directory. Jul 6 23:21:08.915666 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Jul 6 23:21:08.915679 systemd-journald[1147]: Journal started Jul 6 23:21:08.915700 systemd-journald[1147]: Runtime Journal (/run/log/journal/37c53b345e2242cb822cb9cf563df1d3) is 6M, max 48.5M, 42.4M free. Jul 6 23:21:08.921088 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jul 6 23:21:08.921127 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jul 6 23:21:08.921147 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jul 6 23:21:08.668104 systemd[1]: Queued start job for default target multi-user.target. Jul 6 23:21:08.692151 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Jul 6 23:21:08.692615 systemd[1]: systemd-journald.service: Deactivated successfully. Jul 6 23:21:08.923801 systemd[1]: Started systemd-journald.service - Journal Service. Jul 6 23:21:08.924839 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 6 23:21:08.929339 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jul 6 23:21:08.929552 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jul 6 23:21:08.930832 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 6 23:21:08.931003 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 6 23:21:08.932226 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 6 23:21:08.932364 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 6 23:21:08.933637 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 6 23:21:08.933790 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 6 23:21:08.935086 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jul 6 23:21:08.935248 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jul 6 23:21:08.936585 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 6 23:21:08.936734 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 6 23:21:08.938153 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 6 23:21:08.939464 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 6 23:21:08.940885 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jul 6 23:21:08.942307 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jul 6 23:21:08.955728 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 6 23:21:08.958397 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jul 6 23:21:08.960450 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jul 6 23:21:08.961480 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jul 6 23:21:08.961522 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 6 23:21:08.963517 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jul 6 23:21:08.970746 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jul 6 23:21:08.971812 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:21:08.973811 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... 
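Once systemd-journald is running, the entries shown throughout this log can be queried back out of the journal. A sketch using journalctl via subprocess; that Ignition tags its messages with the syslog identifier "ignition" is an assumption here.

# Sketch: pull the current boot's Ignition messages from the journal.
# Assumes journalctl is on PATH and Ignition logs under the syslog identifier "ignition".
import subprocess

result = subprocess.run(
    ["journalctl", "-b", "-t", "ignition", "--no-pager"],
    capture_output=True, text=True, check=False,
)
print(result.stdout or result.stderr)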
Jul 6 23:21:08.979197 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jul 6 23:21:08.980157 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 6 23:21:08.982560 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jul 6 23:21:08.983666 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 6 23:21:08.986712 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 6 23:21:08.990469 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jul 6 23:21:09.003209 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jul 6 23:21:09.005946 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 6 23:21:09.007587 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jul 6 23:21:09.009288 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jul 6 23:21:09.012440 kernel: loop0: detected capacity change from 0 to 138376 Jul 6 23:21:09.012911 systemd-journald[1147]: Time spent on flushing to /var/log/journal/37c53b345e2242cb822cb9cf563df1d3 is 12.768ms for 885 entries. Jul 6 23:21:09.012911 systemd-journald[1147]: System Journal (/var/log/journal/37c53b345e2242cb822cb9cf563df1d3) is 8M, max 195.6M, 187.6M free. Jul 6 23:21:09.040312 systemd-journald[1147]: Received client request to flush runtime journal. Jul 6 23:21:09.040399 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jul 6 23:21:09.011791 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jul 6 23:21:09.015457 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jul 6 23:21:09.019150 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jul 6 23:21:09.042970 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jul 6 23:21:09.044964 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 6 23:21:09.053964 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jul 6 23:21:09.066393 kernel: loop1: detected capacity change from 0 to 107312 Jul 6 23:21:09.075130 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jul 6 23:21:09.080212 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 6 23:21:09.096044 kernel: loop2: detected capacity change from 0 to 203944 Jul 6 23:21:09.114548 systemd-tmpfiles[1221]: ACLs are not supported, ignoring. Jul 6 23:21:09.114565 systemd-tmpfiles[1221]: ACLs are not supported, ignoring. Jul 6 23:21:09.119781 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 6 23:21:09.133076 kernel: loop3: detected capacity change from 0 to 138376 Jul 6 23:21:09.150062 kernel: loop4: detected capacity change from 0 to 107312 Jul 6 23:21:09.157038 kernel: loop5: detected capacity change from 0 to 203944 Jul 6 23:21:09.164273 (sd-merge)[1225]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Jul 6 23:21:09.164711 (sd-merge)[1225]: Merged extensions into '/usr'. Jul 6 23:21:09.173294 systemd[1]: Reload requested from client PID 1202 ('systemd-sysext') (unit systemd-sysext.service)... 
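The (sd-merge) lines show systemd-sysext overlaying the containerd-flatcar, docker-flatcar and kubernetes extension images onto /usr. An image is merged only if the extension-release metadata it ships matches the host. A rough, simplified sketch of that matching rule; the sample ID/SYSEXT_LEVEL/VERSION_ID values are illustrative.

# Rough sketch of the sysext compatibility check: an extension image ships
# /usr/lib/extension-release.d/extension-release.<name>, and it is merged only if its
# ID matches the host (or is "_any") and its SYSEXT_LEVEL or VERSION_ID matches.
# Simplified; the real logic also covers architecture and other fields.

def parse_release(text: str) -> dict:
    fields = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, value = line.split("=", 1)
            fields[key] = value.strip('"')
    return fields

def extension_compatible(host: dict, ext: dict) -> bool:
    if ext.get("ID") not in (host.get("ID"), "_any"):
        return False
    if "SYSEXT_LEVEL" in ext:
        return ext["SYSEXT_LEVEL"] == host.get("SYSEXT_LEVEL")
    return ext.get("VERSION_ID") == host.get("VERSION_ID")

host = parse_release('ID=flatcar\nVERSION_ID=4230.0.0\nSYSEXT_LEVEL=1.0\n')  # illustrative values
ext = parse_release('ID=flatcar\nSYSEXT_LEVEL=1.0\n')                        # what an image might ship
print(extension_compatible(host, ext))  # True under these assumed values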
Jul 6 23:21:09.173315 systemd[1]: Reloading... Jul 6 23:21:09.230319 zram_generator::config[1248]: No configuration found. Jul 6 23:21:09.331590 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:21:09.365920 ldconfig[1197]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jul 6 23:21:09.402452 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jul 6 23:21:09.402733 systemd[1]: Reloading finished in 228 ms. Jul 6 23:21:09.433540 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jul 6 23:21:09.435280 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jul 6 23:21:09.449495 systemd[1]: Starting ensure-sysext.service... Jul 6 23:21:09.451398 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 6 23:21:09.465337 systemd[1]: Reload requested from client PID 1285 ('systemctl') (unit ensure-sysext.service)... Jul 6 23:21:09.465358 systemd[1]: Reloading... Jul 6 23:21:09.476192 systemd-tmpfiles[1286]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jul 6 23:21:09.476219 systemd-tmpfiles[1286]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jul 6 23:21:09.477404 systemd-tmpfiles[1286]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jul 6 23:21:09.477684 systemd-tmpfiles[1286]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jul 6 23:21:09.478481 systemd-tmpfiles[1286]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jul 6 23:21:09.478802 systemd-tmpfiles[1286]: ACLs are not supported, ignoring. Jul 6 23:21:09.478911 systemd-tmpfiles[1286]: ACLs are not supported, ignoring. Jul 6 23:21:09.482563 systemd-tmpfiles[1286]: Detected autofs mount point /boot during canonicalization of boot. Jul 6 23:21:09.482701 systemd-tmpfiles[1286]: Skipping /boot Jul 6 23:21:09.492765 systemd-tmpfiles[1286]: Detected autofs mount point /boot during canonicalization of boot. Jul 6 23:21:09.494117 systemd-tmpfiles[1286]: Skipping /boot Jul 6 23:21:09.517049 zram_generator::config[1313]: No configuration found. Jul 6 23:21:09.593767 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:21:09.661116 systemd[1]: Reloading finished in 195 ms. Jul 6 23:21:09.685870 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jul 6 23:21:09.691722 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 6 23:21:09.700332 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 6 23:21:09.702752 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jul 6 23:21:09.715081 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jul 6 23:21:09.722200 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 6 23:21:09.726098 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
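The systemd-tmpfiles "Duplicate line for path ..., ignoring" warnings mean the same path is declared by more than one tmpfiles.d fragment, and only the first declaration is kept. A simplified sketch that scans the usual fragment directories for such duplicates (field parsing is reduced to whitespace splitting, and /etc vs /usr shadowing is not modelled):

# Sketch: report paths declared more than once across tmpfiles.d fragments,
# mirroring the "Duplicate line for path ..., ignoring" warnings above.
from pathlib import Path

seen = {}
for directory in ("/etc/tmpfiles.d", "/run/tmpfiles.d", "/usr/lib/tmpfiles.d"):
    d = Path(directory)
    if not d.is_dir():
        continue
    for conf in sorted(d.glob("*.conf")):
        for line in conf.read_text().splitlines():
            parts = line.split()
            if len(parts) < 2 or parts[0].startswith("#"):
                continue
            path = parts[1]
            if path in seen and seen[path] != conf:
                print(f"{conf.name}: duplicate line for path {path} (first declared in {seen[path].name})")
            else:
                seen.setdefault(path, conf)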
Jul 6 23:21:09.729317 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jul 6 23:21:09.734577 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:21:09.735916 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 6 23:21:09.751279 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 6 23:21:09.753424 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 6 23:21:09.754438 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:21:09.754562 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:21:09.757054 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jul 6 23:21:09.758623 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 6 23:21:09.758779 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 6 23:21:09.760228 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 6 23:21:09.760388 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 6 23:21:09.763328 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 6 23:21:09.763511 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 6 23:21:09.770660 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:21:09.772245 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 6 23:21:09.777299 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 6 23:21:09.779554 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 6 23:21:09.780482 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:21:09.780652 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:21:09.787610 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jul 6 23:21:09.788234 systemd-udevd[1354]: Using default interface naming scheme 'v255'. Jul 6 23:21:09.791231 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jul 6 23:21:09.794992 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jul 6 23:21:09.799655 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 6 23:21:09.799874 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 6 23:21:09.801569 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 6 23:21:09.801790 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 6 23:21:09.802092 augenrules[1386]: No rules Jul 6 23:21:09.803447 systemd[1]: audit-rules.service: Deactivated successfully. Jul 6 23:21:09.803640 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 6 23:21:09.805156 systemd[1]: modprobe@loop.service: Deactivated successfully. 
Jul 6 23:21:09.805302 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 6 23:21:09.806979 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jul 6 23:21:09.810644 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jul 6 23:21:09.820769 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 6 23:21:09.821882 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:21:09.824246 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 6 23:21:09.826190 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 6 23:21:09.862976 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 6 23:21:09.869357 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 6 23:21:09.870655 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:21:09.870808 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:21:09.870966 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jul 6 23:21:09.873499 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 6 23:21:09.876089 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jul 6 23:21:09.879432 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 6 23:21:09.879609 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 6 23:21:09.880988 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 6 23:21:09.881150 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 6 23:21:09.889073 systemd[1]: Finished ensure-sysext.service. Jul 6 23:21:09.890361 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 6 23:21:09.890550 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 6 23:21:09.900833 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 6 23:21:09.901555 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 6 23:21:09.911708 augenrules[1398]: /sbin/augenrules: No change Jul 6 23:21:09.911770 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 6 23:21:09.913857 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 6 23:21:09.913936 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 6 23:21:09.917271 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jul 6 23:21:09.929109 augenrules[1464]: No rules Jul 6 23:21:09.932056 systemd[1]: audit-rules.service: Deactivated successfully. Jul 6 23:21:09.934095 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 6 23:21:09.943180 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. 
Jul 6 23:21:09.965321 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Jul 6 23:21:09.969982 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 6 23:21:09.999524 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 6 23:21:10.043034 systemd-resolved[1353]: Positive Trust Anchors: Jul 6 23:21:10.043055 systemd-resolved[1353]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 6 23:21:10.043088 systemd-resolved[1353]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 6 23:21:10.052710 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jul 6 23:21:10.054042 systemd[1]: Reached target time-set.target - System Time Set. Jul 6 23:21:10.055839 systemd-resolved[1353]: Defaulting to hostname 'linux'. Jul 6 23:21:10.058992 systemd-networkd[1453]: lo: Link UP Jul 6 23:21:10.059001 systemd-networkd[1453]: lo: Gained carrier Jul 6 23:21:10.059324 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 6 23:21:10.060254 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 6 23:21:10.061143 systemd[1]: Reached target sysinit.target - System Initialization. Jul 6 23:21:10.061687 systemd-networkd[1453]: Enumeration completed Jul 6 23:21:10.062163 systemd-networkd[1453]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:21:10.062172 systemd-networkd[1453]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 6 23:21:10.062612 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jul 6 23:21:10.063897 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jul 6 23:21:10.064540 systemd-networkd[1453]: eth0: Link UP Jul 6 23:21:10.065322 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jul 6 23:21:10.066325 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jul 6 23:21:10.067261 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jul 6 23:21:10.068189 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jul 6 23:21:10.068228 systemd[1]: Reached target paths.target - Path Units. Jul 6 23:21:10.068890 systemd[1]: Reached target timers.target - Timer Units. Jul 6 23:21:10.069503 systemd-networkd[1453]: eth0: Gained carrier Jul 6 23:21:10.069525 systemd-networkd[1453]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:21:10.070698 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. 
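The "Positive Trust Anchors" entry is systemd-resolved's built-in DNSSEC trust anchor for the root zone, written as a DS record. A tiny sketch splitting that record into its fields; the algorithm and digest-type names follow the standard IANA assignments.

# Sketch: break the root-zone DS trust anchor from the log into its named fields.
record = ". IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d"
owner, klass, rtype, key_tag, algorithm, digest_type, digest = record.split()
print(f"owner={owner} key_tag={key_tag}")
print(f"algorithm={algorithm}  (8 = RSA/SHA-256 per the IANA registry)")
print(f"digest_type={digest_type}  (2 = SHA-256), digest={digest}")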
Jul 6 23:21:10.073062 systemd[1]: Starting docker.socket - Docker Socket for the API... Jul 6 23:21:10.076375 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jul 6 23:21:10.077560 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jul 6 23:21:10.078586 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jul 6 23:21:10.083216 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jul 6 23:21:10.084858 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jul 6 23:21:10.086109 systemd-networkd[1453]: eth0: DHCPv4 address 10.0.0.40/16, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 6 23:21:10.086726 systemd-timesyncd[1458]: Network configuration changed, trying to establish connection. Jul 6 23:21:10.086803 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 6 23:21:10.088045 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jul 6 23:21:10.088083 systemd-timesyncd[1458]: Contacted time server 10.0.0.1:123 (10.0.0.1). Jul 6 23:21:10.088134 systemd-timesyncd[1458]: Initial clock synchronization to Sun 2025-07-06 23:21:10.378135 UTC. Jul 6 23:21:10.089125 systemd[1]: Reached target network.target - Network. Jul 6 23:21:10.090220 systemd[1]: Reached target sockets.target - Socket Units. Jul 6 23:21:10.090968 systemd[1]: Reached target basic.target - Basic System. Jul 6 23:21:10.091736 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jul 6 23:21:10.091766 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jul 6 23:21:10.093095 systemd[1]: Starting containerd.service - containerd container runtime... Jul 6 23:21:10.095361 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jul 6 23:21:10.107189 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 6 23:21:10.110251 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 6 23:21:10.112181 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 6 23:21:10.114154 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 6 23:21:10.128740 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 6 23:21:10.132343 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jul 6 23:21:10.135143 jq[1490]: false Jul 6 23:21:10.136257 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 6 23:21:10.142353 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 6 23:21:10.147095 extend-filesystems[1492]: Found /dev/vda6 Jul 6 23:21:10.148513 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 6 23:21:10.151773 extend-filesystems[1492]: Found /dev/vda9 Jul 6 23:21:10.153203 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jul 6 23:21:10.154453 extend-filesystems[1492]: Checking size of /dev/vda9 Jul 6 23:21:10.156400 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... 
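eth0 obtains 10.0.0.40/16 from the DHCP server at 10.0.0.1, which also answers NTP on port 123. A short sketch with the standard ipaddress module spelling out what that /16 lease covers:

# Sketch: expand the DHCPv4 lease 10.0.0.40/16 from the log with the ipaddress module.
import ipaddress

iface = ipaddress.ip_interface("10.0.0.40/16")
print(iface.network)                    # 10.0.0.0/16
print(iface.network.netmask)            # 255.255.0.0
print(iface.network.num_addresses)      # 65536 addresses in the /16
print(ipaddress.ip_address("10.0.0.1") in iface.network)  # True: gateway/NTP server is on-link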
Jul 6 23:21:10.160178 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jul 6 23:21:10.160726 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 6 23:21:10.161490 systemd[1]: Starting update-engine.service - Update Engine... Jul 6 23:21:10.163874 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jul 6 23:21:10.172594 extend-filesystems[1492]: Resized partition /dev/vda9 Jul 6 23:21:10.177465 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 6 23:21:10.179647 extend-filesystems[1517]: resize2fs 1.47.2 (1-Jan-2025) Jul 6 23:21:10.180561 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jul 6 23:21:10.180770 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 6 23:21:10.188567 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Jul 6 23:21:10.189161 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jul 6 23:21:10.189402 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 6 23:21:10.191640 jq[1513]: true Jul 6 23:21:10.212861 jq[1525]: true Jul 6 23:21:10.218203 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 6 23:21:10.234609 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 6 23:21:10.241600 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Jul 6 23:21:10.242795 (ntainerd)[1523]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 6 23:21:10.254403 systemd[1]: motdgen.service: Deactivated successfully. Jul 6 23:21:10.254614 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 6 23:21:10.255715 extend-filesystems[1517]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Jul 6 23:21:10.255715 extend-filesystems[1517]: old_desc_blocks = 1, new_desc_blocks = 1 Jul 6 23:21:10.255715 extend-filesystems[1517]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Jul 6 23:21:10.257131 dbus-daemon[1487]: [system] SELinux support is enabled Jul 6 23:21:10.257472 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 6 23:21:10.262297 extend-filesystems[1492]: Resized filesystem in /dev/vda9 Jul 6 23:21:10.265686 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 6 23:21:10.265950 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 6 23:21:10.274489 update_engine[1512]: I20250706 23:21:10.274214 1512 main.cc:92] Flatcar Update Engine starting Jul 6 23:21:10.302714 systemd-logind[1505]: Watching system buttons on /dev/input/event0 (Power Button) Jul 6 23:21:10.303466 systemd-logind[1505]: New seat seat0. Jul 6 23:21:10.303953 update_engine[1512]: I20250706 23:21:10.303886 1512 update_check_scheduler.cc:74] Next update check in 3m29s Jul 6 23:21:10.341606 bash[1560]: Updated "/home/core/.ssh/authorized_keys" Jul 6 23:21:10.343269 systemd[1]: Started systemd-logind.service - User Login Management. Jul 6 23:21:10.345863 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
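The EXT4 messages record an online resize of /dev/vda9 from 553472 to 1864699 blocks of 4 KiB. A quick sketch of the size arithmetic:

# Sketch: translate the EXT4 online-resize block counts from the log into GiB.
BLOCK = 4096  # 4 KiB blocks, per the "(4k) blocks" note in the log
old_blocks, new_blocks = 553472, 1864699

def gib(blocks: int) -> float:
    return blocks * BLOCK / 2**30

print(f"before: {gib(old_blocks):.2f} GiB")
print(f"after:  {gib(new_blocks):.2f} GiB")
print(f"grown:  {gib(new_blocks - old_blocks):.2f} GiB")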
Jul 6 23:21:10.348537 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 6 23:21:10.356000 dbus-daemon[1487]: [system] Successfully activated service 'org.freedesktop.systemd1' Jul 6 23:21:10.359274 tar[1526]: linux-arm64/helm Jul 6 23:21:10.368867 systemd[1]: Started update-engine.service - Update Engine. Jul 6 23:21:10.371247 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Jul 6 23:21:10.371467 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 6 23:21:10.371600 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jul 6 23:21:10.374260 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 6 23:21:10.374397 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 6 23:21:10.377291 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 6 23:21:10.461821 locksmithd[1566]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 6 23:21:10.565334 containerd[1523]: time="2025-07-06T23:21:10Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 6 23:21:10.566081 containerd[1523]: time="2025-07-06T23:21:10.566050040Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Jul 6 23:21:10.580146 containerd[1523]: time="2025-07-06T23:21:10.579950440Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.72µs" Jul 6 23:21:10.580146 containerd[1523]: time="2025-07-06T23:21:10.579996640Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 6 23:21:10.580146 containerd[1523]: time="2025-07-06T23:21:10.580040880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 6 23:21:10.580849 containerd[1523]: time="2025-07-06T23:21:10.580637480Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 6 23:21:10.580849 containerd[1523]: time="2025-07-06T23:21:10.580670280Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 6 23:21:10.580849 containerd[1523]: time="2025-07-06T23:21:10.580702440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 6 23:21:10.581065 containerd[1523]: time="2025-07-06T23:21:10.581041920Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 6 23:21:10.581119 containerd[1523]: time="2025-07-06T23:21:10.581105920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 6 23:21:10.581635 containerd[1523]: time="2025-07-06T23:21:10.581607640Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be 
used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 6 23:21:10.582218 containerd[1523]: time="2025-07-06T23:21:10.581699000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 6 23:21:10.582218 containerd[1523]: time="2025-07-06T23:21:10.581775360Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 6 23:21:10.582218 containerd[1523]: time="2025-07-06T23:21:10.581787200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 6 23:21:10.582218 containerd[1523]: time="2025-07-06T23:21:10.581904680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 6 23:21:10.582218 containerd[1523]: time="2025-07-06T23:21:10.582141120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 6 23:21:10.582218 containerd[1523]: time="2025-07-06T23:21:10.582171600Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 6 23:21:10.582218 containerd[1523]: time="2025-07-06T23:21:10.582184040Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 6 23:21:10.582608 containerd[1523]: time="2025-07-06T23:21:10.582573440Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 6 23:21:10.583192 containerd[1523]: time="2025-07-06T23:21:10.583169000Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 6 23:21:10.583457 containerd[1523]: time="2025-07-06T23:21:10.583437400Z" level=info msg="metadata content store policy set" policy=shared Jul 6 23:21:10.590530 containerd[1523]: time="2025-07-06T23:21:10.590354080Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 6 23:21:10.590530 containerd[1523]: time="2025-07-06T23:21:10.590439840Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 6 23:21:10.590530 containerd[1523]: time="2025-07-06T23:21:10.590457800Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 6 23:21:10.590530 containerd[1523]: time="2025-07-06T23:21:10.590480400Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 6 23:21:10.590530 containerd[1523]: time="2025-07-06T23:21:10.590494280Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 6 23:21:10.590816 containerd[1523]: time="2025-07-06T23:21:10.590509960Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 6 23:21:10.591037 containerd[1523]: time="2025-07-06T23:21:10.590884920Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 6 23:21:10.591037 containerd[1523]: time="2025-07-06T23:21:10.590909120Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 6 23:21:10.591037 containerd[1523]: 
time="2025-07-06T23:21:10.590926000Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 6 23:21:10.591037 containerd[1523]: time="2025-07-06T23:21:10.590937920Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 6 23:21:10.591037 containerd[1523]: time="2025-07-06T23:21:10.590948840Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 6 23:21:10.591037 containerd[1523]: time="2025-07-06T23:21:10.590963160Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591480400Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591513280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591532520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591544360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591563520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591623280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591636080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591647640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591659280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591670520Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 6 23:21:10.593195 containerd[1523]: time="2025-07-06T23:21:10.591681400Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 6 23:21:10.593644 containerd[1523]: time="2025-07-06T23:21:10.593619640Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 6 23:21:10.593717 containerd[1523]: time="2025-07-06T23:21:10.593703800Z" level=info msg="Start snapshots syncer" Jul 6 23:21:10.594805 containerd[1523]: time="2025-07-06T23:21:10.594780200Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 6 23:21:10.595401 containerd[1523]: time="2025-07-06T23:21:10.595351520Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 6 23:21:10.595647 containerd[1523]: time="2025-07-06T23:21:10.595623880Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 6 23:21:10.595881 containerd[1523]: time="2025-07-06T23:21:10.595851000Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 6 23:21:10.596230 containerd[1523]: time="2025-07-06T23:21:10.596207680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 6 23:21:10.596313 containerd[1523]: time="2025-07-06T23:21:10.596299280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 6 23:21:10.596490 containerd[1523]: time="2025-07-06T23:21:10.596471800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 6 23:21:10.596663 containerd[1523]: time="2025-07-06T23:21:10.596647320Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 6 23:21:10.596732 containerd[1523]: time="2025-07-06T23:21:10.596718600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 6 23:21:10.596834 containerd[1523]: time="2025-07-06T23:21:10.596818040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 6 23:21:10.596972 containerd[1523]: time="2025-07-06T23:21:10.596955080Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 6 23:21:10.597086 containerd[1523]: time="2025-07-06T23:21:10.597071000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 6 23:21:10.597196 containerd[1523]: 
time="2025-07-06T23:21:10.597180040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 6 23:21:10.597256 containerd[1523]: time="2025-07-06T23:21:10.597242920Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 6 23:21:10.597434 containerd[1523]: time="2025-07-06T23:21:10.597402400Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 6 23:21:10.597506 containerd[1523]: time="2025-07-06T23:21:10.597490280Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 6 23:21:10.597620 containerd[1523]: time="2025-07-06T23:21:10.597603440Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 6 23:21:10.597732 containerd[1523]: time="2025-07-06T23:21:10.597716120Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 6 23:21:10.597789 containerd[1523]: time="2025-07-06T23:21:10.597775960Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 6 23:21:10.597838 containerd[1523]: time="2025-07-06T23:21:10.597826240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 6 23:21:10.597941 containerd[1523]: time="2025-07-06T23:21:10.597926520Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 6 23:21:10.598862 containerd[1523]: time="2025-07-06T23:21:10.598840760Z" level=info msg="runtime interface created" Jul 6 23:21:10.599064 containerd[1523]: time="2025-07-06T23:21:10.598909760Z" level=info msg="created NRI interface" Jul 6 23:21:10.599064 containerd[1523]: time="2025-07-06T23:21:10.598929400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 6 23:21:10.599064 containerd[1523]: time="2025-07-06T23:21:10.598948360Z" level=info msg="Connect containerd service" Jul 6 23:21:10.599064 containerd[1523]: time="2025-07-06T23:21:10.598988200Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 6 23:21:10.600345 containerd[1523]: time="2025-07-06T23:21:10.600265880Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 6 23:21:10.718634 tar[1526]: linux-arm64/LICENSE Jul 6 23:21:10.718634 tar[1526]: linux-arm64/README.md Jul 6 23:21:10.739254 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847389560Z" level=info msg="Start subscribing containerd event" Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847466800Z" level=info msg="Start recovering state" Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847559160Z" level=info msg="Start event monitor" Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847575680Z" level=info msg="Start cni network conf syncer for default" Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847584440Z" level=info msg="Start streaming server" Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847593760Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847601080Z" level=info msg="runtime interface starting up..." Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847608040Z" level=info msg="starting plugins..." Jul 6 23:21:10.848136 containerd[1523]: time="2025-07-06T23:21:10.847622760Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 6 23:21:10.848531 containerd[1523]: time="2025-07-06T23:21:10.848507280Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 6 23:21:10.848712 containerd[1523]: time="2025-07-06T23:21:10.848638000Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 6 23:21:10.849932 containerd[1523]: time="2025-07-06T23:21:10.849896440Z" level=info msg="containerd successfully booted in 0.285053s" Jul 6 23:21:10.851170 systemd[1]: Started containerd.service - containerd container runtime. Jul 6 23:21:11.159040 sshd_keygen[1532]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 6 23:21:11.182265 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 6 23:21:11.185731 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 6 23:21:11.207328 systemd[1]: issuegen.service: Deactivated successfully. Jul 6 23:21:11.207623 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 6 23:21:11.210992 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 6 23:21:11.232568 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 6 23:21:11.235542 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 6 23:21:11.237886 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jul 6 23:21:11.239295 systemd[1]: Reached target getty.target - Login Prompts. Jul 6 23:21:11.962088 systemd-networkd[1453]: eth0: Gained IPv6LL Jul 6 23:21:11.964618 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 6 23:21:11.966344 systemd[1]: Reached target network-online.target - Network is Online. Jul 6 23:21:11.969923 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Jul 6 23:21:11.972645 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:21:11.982591 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 6 23:21:12.007746 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 6 23:21:12.010996 systemd[1]: coreos-metadata.service: Deactivated successfully. Jul 6 23:21:12.011389 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Jul 6 23:21:12.013679 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Jul 6 23:21:12.629314 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:12.630965 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 6 23:21:12.633010 systemd[1]: Startup finished in 2.172s (kernel) + 5.565s (initrd) + 4.485s (userspace) = 12.223s. Jul 6 23:21:12.635583 (kubelet)[1636]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 6 23:21:13.193306 kubelet[1636]: E0706 23:21:13.193237 1636 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 6 23:21:13.196099 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 6 23:21:13.196263 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 6 23:21:13.196870 systemd[1]: kubelet.service: Consumed 902ms CPU time, 256.6M memory peak. Jul 6 23:21:15.520297 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 6 23:21:15.521959 systemd[1]: Started sshd@0-10.0.0.40:22-10.0.0.1:43328.service - OpenSSH per-connection server daemon (10.0.0.1:43328). Jul 6 23:21:15.596712 sshd[1649]: Accepted publickey for core from 10.0.0.1 port 43328 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:21:15.599007 sshd-session[1649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:21:15.606319 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 6 23:21:15.607395 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 6 23:21:15.614825 systemd-logind[1505]: New session 1 of user core. Jul 6 23:21:15.631106 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 6 23:21:15.634720 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 6 23:21:15.651348 (systemd)[1653]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 6 23:21:15.653716 systemd-logind[1505]: New session c1 of user core. Jul 6 23:21:15.776168 systemd[1653]: Queued start job for default target default.target. Jul 6 23:21:15.787109 systemd[1653]: Created slice app.slice - User Application Slice. Jul 6 23:21:15.787142 systemd[1653]: Reached target paths.target - Paths. Jul 6 23:21:15.787184 systemd[1653]: Reached target timers.target - Timers. Jul 6 23:21:15.788629 systemd[1653]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 6 23:21:15.799416 systemd[1653]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 6 23:21:15.799540 systemd[1653]: Reached target sockets.target - Sockets. Jul 6 23:21:15.799605 systemd[1653]: Reached target basic.target - Basic System. Jul 6 23:21:15.799646 systemd[1653]: Reached target default.target - Main User Target. Jul 6 23:21:15.799677 systemd[1653]: Startup finished in 139ms. Jul 6 23:21:15.799763 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 6 23:21:15.801727 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 6 23:21:15.867442 systemd[1]: Started sshd@1-10.0.0.40:22-10.0.0.1:43342.service - OpenSSH per-connection server daemon (10.0.0.1:43342). 
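The kubelet exit above ("open /var/lib/kubelet/config.yaml: no such file or directory") is the normal pre-bootstrap state: the unit is pointed at a config file that does not exist until the node is initialized. On a kubeadm-style node that file is generated by kubeadm init / kubeadm join rather than written by hand; the sketch below only illustrates its shape, with values consistent with later lines in this log (cgroupDriver=systemd, static pods under /etc/kubernetes/manifests) and everything else assumed:

  # Hypothetical sketch of the KubeletConfiguration the failing unit expects;
  # normally produced during cluster bootstrap, not created manually.
  cat <<'EOF' > /var/lib/kubelet/config.yaml
  apiVersion: kubelet.config.k8s.io/v1beta1
  kind: KubeletConfiguration
  cgroupDriver: systemd
  containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
  staticPodPath: /etc/kubernetes/manifests
  authentication:
    anonymous:
      enabled: false
  EOF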
Jul 6 23:21:15.925318 sshd[1664]: Accepted publickey for core from 10.0.0.1 port 43342 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:21:15.926838 sshd-session[1664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:21:15.932109 systemd-logind[1505]: New session 2 of user core. Jul 6 23:21:15.950302 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 6 23:21:16.005750 sshd[1666]: Connection closed by 10.0.0.1 port 43342 Jul 6 23:21:16.006197 sshd-session[1664]: pam_unix(sshd:session): session closed for user core Jul 6 23:21:16.027965 systemd[1]: sshd@1-10.0.0.40:22-10.0.0.1:43342.service: Deactivated successfully. Jul 6 23:21:16.030100 systemd[1]: session-2.scope: Deactivated successfully. Jul 6 23:21:16.031737 systemd-logind[1505]: Session 2 logged out. Waiting for processes to exit. Jul 6 23:21:16.033640 systemd[1]: Started sshd@2-10.0.0.40:22-10.0.0.1:43354.service - OpenSSH per-connection server daemon (10.0.0.1:43354). Jul 6 23:21:16.034746 systemd-logind[1505]: Removed session 2. Jul 6 23:21:16.093532 sshd[1672]: Accepted publickey for core from 10.0.0.1 port 43354 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:21:16.094979 sshd-session[1672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:21:16.099650 systemd-logind[1505]: New session 3 of user core. Jul 6 23:21:16.107247 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 6 23:21:16.159151 sshd[1674]: Connection closed by 10.0.0.1 port 43354 Jul 6 23:21:16.159533 sshd-session[1672]: pam_unix(sshd:session): session closed for user core Jul 6 23:21:16.171425 systemd[1]: sshd@2-10.0.0.40:22-10.0.0.1:43354.service: Deactivated successfully. Jul 6 23:21:16.174793 systemd[1]: session-3.scope: Deactivated successfully. Jul 6 23:21:16.175544 systemd-logind[1505]: Session 3 logged out. Waiting for processes to exit. Jul 6 23:21:16.178694 systemd[1]: Started sshd@3-10.0.0.40:22-10.0.0.1:43366.service - OpenSSH per-connection server daemon (10.0.0.1:43366). Jul 6 23:21:16.179932 systemd-logind[1505]: Removed session 3. Jul 6 23:21:16.244530 sshd[1680]: Accepted publickey for core from 10.0.0.1 port 43366 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:21:16.246328 sshd-session[1680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:21:16.252740 systemd-logind[1505]: New session 4 of user core. Jul 6 23:21:16.268287 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 6 23:21:16.324248 sshd[1682]: Connection closed by 10.0.0.1 port 43366 Jul 6 23:21:16.325049 sshd-session[1680]: pam_unix(sshd:session): session closed for user core Jul 6 23:21:16.335561 systemd[1]: sshd@3-10.0.0.40:22-10.0.0.1:43366.service: Deactivated successfully. Jul 6 23:21:16.338732 systemd[1]: session-4.scope: Deactivated successfully. Jul 6 23:21:16.340133 systemd-logind[1505]: Session 4 logged out. Waiting for processes to exit. Jul 6 23:21:16.343335 systemd[1]: Started sshd@4-10.0.0.40:22-10.0.0.1:43382.service - OpenSSH per-connection server daemon (10.0.0.1:43382). Jul 6 23:21:16.344201 systemd-logind[1505]: Removed session 4. 
Jul 6 23:21:16.411476 sshd[1688]: Accepted publickey for core from 10.0.0.1 port 43382 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:21:16.415380 sshd-session[1688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:21:16.420878 systemd-logind[1505]: New session 5 of user core. Jul 6 23:21:16.429293 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 6 23:21:16.519930 sudo[1691]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 6 23:21:16.520286 sudo[1691]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:21:16.535321 sudo[1691]: pam_unix(sudo:session): session closed for user root Jul 6 23:21:16.538968 sshd[1690]: Connection closed by 10.0.0.1 port 43382 Jul 6 23:21:16.537798 sshd-session[1688]: pam_unix(sshd:session): session closed for user core Jul 6 23:21:16.552627 systemd[1]: sshd@4-10.0.0.40:22-10.0.0.1:43382.service: Deactivated successfully. Jul 6 23:21:16.557409 systemd[1]: session-5.scope: Deactivated successfully. Jul 6 23:21:16.558946 systemd-logind[1505]: Session 5 logged out. Waiting for processes to exit. Jul 6 23:21:16.562685 systemd[1]: Started sshd@5-10.0.0.40:22-10.0.0.1:43384.service - OpenSSH per-connection server daemon (10.0.0.1:43384). Jul 6 23:21:16.563326 systemd-logind[1505]: Removed session 5. Jul 6 23:21:16.632184 sshd[1697]: Accepted publickey for core from 10.0.0.1 port 43384 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:21:16.633147 sshd-session[1697]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:21:16.637604 systemd-logind[1505]: New session 6 of user core. Jul 6 23:21:16.647245 systemd[1]: Started session-6.scope - Session 6 of User core. Jul 6 23:21:16.705415 sudo[1701]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 6 23:21:16.705721 sudo[1701]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:21:16.745833 sudo[1701]: pam_unix(sudo:session): session closed for user root Jul 6 23:21:16.751494 sudo[1700]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 6 23:21:16.752194 sudo[1700]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:21:16.763717 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 6 23:21:16.818307 augenrules[1723]: No rules Jul 6 23:21:16.820017 systemd[1]: audit-rules.service: Deactivated successfully. Jul 6 23:21:16.820315 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 6 23:21:16.822280 sudo[1700]: pam_unix(sudo:session): session closed for user root Jul 6 23:21:16.824100 sshd[1699]: Connection closed by 10.0.0.1 port 43384 Jul 6 23:21:16.824336 sshd-session[1697]: pam_unix(sshd:session): session closed for user core Jul 6 23:21:16.837332 systemd[1]: sshd@5-10.0.0.40:22-10.0.0.1:43384.service: Deactivated successfully. Jul 6 23:21:16.840743 systemd[1]: session-6.scope: Deactivated successfully. Jul 6 23:21:16.841576 systemd-logind[1505]: Session 6 logged out. Waiting for processes to exit. Jul 6 23:21:16.844853 systemd[1]: Started sshd@6-10.0.0.40:22-10.0.0.1:43392.service - OpenSSH per-connection server daemon (10.0.0.1:43392). Jul 6 23:21:16.845609 systemd-logind[1505]: Removed session 6. 
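The session above removes the shipped audit rule files and restarts audit-rules, which is why augenrules reports "No rules" before the service completes successfully. If a rule were wanted back later, one hedged way to add it would be the following (the file name, key and watched directory are illustrative, not taken from this system):

  # Hypothetical example: restore a single watch rule and reload it.
  cat <<'EOF' > /etc/audit/rules.d/50-kubernetes.rules
  # Watch Kubernetes configuration for writes and attribute changes
  -w /etc/kubernetes/ -p wa -k kube-config
  EOF
  augenrules --load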
Jul 6 23:21:16.909600 sshd[1732]: Accepted publickey for core from 10.0.0.1 port 43392 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:21:16.910893 sshd-session[1732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:21:16.915167 systemd-logind[1505]: New session 7 of user core. Jul 6 23:21:16.924233 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 6 23:21:16.976715 sudo[1735]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 6 23:21:16.977009 sudo[1735]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:21:17.499490 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 6 23:21:17.513440 (dockerd)[1755]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 6 23:21:17.891354 dockerd[1755]: time="2025-07-06T23:21:17.891271443Z" level=info msg="Starting up" Jul 6 23:21:17.894057 dockerd[1755]: time="2025-07-06T23:21:17.893994458Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 6 23:21:18.041084 dockerd[1755]: time="2025-07-06T23:21:18.040922134Z" level=info msg="Loading containers: start." Jul 6 23:21:18.050394 kernel: Initializing XFRM netlink socket Jul 6 23:21:18.297017 systemd-networkd[1453]: docker0: Link UP Jul 6 23:21:18.302452 dockerd[1755]: time="2025-07-06T23:21:18.302305907Z" level=info msg="Loading containers: done." Jul 6 23:21:18.321349 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1749138740-merged.mount: Deactivated successfully. Jul 6 23:21:18.324881 dockerd[1755]: time="2025-07-06T23:21:18.324826443Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 6 23:21:18.325010 dockerd[1755]: time="2025-07-06T23:21:18.324944015Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Jul 6 23:21:18.325105 dockerd[1755]: time="2025-07-06T23:21:18.325077367Z" level=info msg="Initializing buildkit" Jul 6 23:21:18.358149 dockerd[1755]: time="2025-07-06T23:21:18.358093614Z" level=info msg="Completed buildkit initialization" Jul 6 23:21:18.369437 dockerd[1755]: time="2025-07-06T23:21:18.369382099Z" level=info msg="Daemon has completed initialization" Jul 6 23:21:18.369517 dockerd[1755]: time="2025-07-06T23:21:18.369479425Z" level=info msg="API listen on /run/docker.sock" Jul 6 23:21:18.371506 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 6 23:21:19.116670 containerd[1523]: time="2025-07-06T23:21:19.116623381Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\"" Jul 6 23:21:19.703426 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3666081682.mount: Deactivated successfully. 
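The PullImage request above and the ones that follow go through containerd's CRI plugin, not through the Docker daemon that just started. A hedged manual equivalent of the same pull, using containerd's own client against the socket and the k8s.io namespace seen earlier in this log (assumes ctr is on the PATH):

  # Hypothetical manual equivalent of the logged PullImage for the apiserver image.
  ctr --address /run/containerd/containerd.sock -n k8s.io \
    images pull registry.k8s.io/kube-apiserver:v1.31.10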
Jul 6 23:21:20.720450 containerd[1523]: time="2025-07-06T23:21:20.720394521Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:20.721925 containerd[1523]: time="2025-07-06T23:21:20.721890094Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.10: active requests=0, bytes read=25651795" Jul 6 23:21:20.722693 containerd[1523]: time="2025-07-06T23:21:20.722648071Z" level=info msg="ImageCreate event name:\"sha256:8907c2d36348551c1038e24ef688f6830681069380376707e55518007a20a86c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:20.727049 containerd[1523]: time="2025-07-06T23:21:20.726224612Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:20.727167 containerd[1523]: time="2025-07-06T23:21:20.727144131Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.10\" with image id \"sha256:8907c2d36348551c1038e24ef688f6830681069380376707e55518007a20a86c\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:083d7d64af31cd090f870eb49fb815e6bb42c175fc602ee9dae2f28f082bd4dc\", size \"25648593\" in 1.610475001s" Jul 6 23:21:20.727257 containerd[1523]: time="2025-07-06T23:21:20.727242997Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.10\" returns image reference \"sha256:8907c2d36348551c1038e24ef688f6830681069380376707e55518007a20a86c\"" Jul 6 23:21:20.730470 containerd[1523]: time="2025-07-06T23:21:20.730447929Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\"" Jul 6 23:21:21.731737 containerd[1523]: time="2025-07-06T23:21:21.731674994Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:21.732134 containerd[1523]: time="2025-07-06T23:21:21.732099203Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.10: active requests=0, bytes read=22459679" Jul 6 23:21:21.733294 containerd[1523]: time="2025-07-06T23:21:21.733238370Z" level=info msg="ImageCreate event name:\"sha256:0f640d6889416d515a0ac4de1c26f4d80134c47641ff464abc831560a951175f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:21.735787 containerd[1523]: time="2025-07-06T23:21:21.735743568Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:21.737329 containerd[1523]: time="2025-07-06T23:21:21.737287117Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.10\" with image id \"sha256:0f640d6889416d515a0ac4de1c26f4d80134c47641ff464abc831560a951175f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3c67387d023c6114879f1e817669fd641797d30f117230682faf3930ecaaf0fe\", size \"23995467\" in 1.006732003s" Jul 6 23:21:21.737378 containerd[1523]: time="2025-07-06T23:21:21.737332143Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.10\" returns image reference \"sha256:0f640d6889416d515a0ac4de1c26f4d80134c47641ff464abc831560a951175f\"" Jul 6 23:21:21.737811 
containerd[1523]: time="2025-07-06T23:21:21.737786719Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\"" Jul 6 23:21:22.801217 containerd[1523]: time="2025-07-06T23:21:22.801158949Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:22.802892 containerd[1523]: time="2025-07-06T23:21:22.802859429Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.10: active requests=0, bytes read=17125068" Jul 6 23:21:22.803647 containerd[1523]: time="2025-07-06T23:21:22.803577777Z" level=info msg="ImageCreate event name:\"sha256:23d79b83d912e2633bcb4f9f7b8b46024893e11d492a4249d8f1f8c9a26b7b2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:22.809103 containerd[1523]: time="2025-07-06T23:21:22.809068923Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:22.813134 containerd[1523]: time="2025-07-06T23:21:22.812903352Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.10\" with image id \"sha256:23d79b83d912e2633bcb4f9f7b8b46024893e11d492a4249d8f1f8c9a26b7b2c\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:284dc2a5cf6afc9b76e39ad4b79c680c23d289488517643b28784a06d0141272\", size \"18660874\" in 1.074998975s" Jul 6 23:21:22.813134 containerd[1523]: time="2025-07-06T23:21:22.812944250Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.10\" returns image reference \"sha256:23d79b83d912e2633bcb4f9f7b8b46024893e11d492a4249d8f1f8c9a26b7b2c\"" Jul 6 23:21:22.813518 containerd[1523]: time="2025-07-06T23:21:22.813382882Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\"" Jul 6 23:21:23.374013 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 6 23:21:23.377144 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:21:23.588460 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:23.592276 (kubelet)[2040]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 6 23:21:23.628778 kubelet[2040]: E0706 23:21:23.628609 2040 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 6 23:21:23.632890 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 6 23:21:23.633065 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 6 23:21:23.633343 systemd[1]: kubelet.service: Consumed 141ms CPU time, 108M memory peak. Jul 6 23:21:23.828739 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1725086337.mount: Deactivated successfully. 
Jul 6 23:21:24.193296 containerd[1523]: time="2025-07-06T23:21:24.193246149Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:24.194849 containerd[1523]: time="2025-07-06T23:21:24.194747485Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.10: active requests=0, bytes read=26915959" Jul 6 23:21:24.196328 containerd[1523]: time="2025-07-06T23:21:24.195728073Z" level=info msg="ImageCreate event name:\"sha256:dde5ff0da443b455e81aefc7bf6a216fdd659d1cbe13b8e8ac8129c3ecd27f89\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:24.197646 containerd[1523]: time="2025-07-06T23:21:24.197618923Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:24.198115 containerd[1523]: time="2025-07-06T23:21:24.198084198Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.10\" with image id \"sha256:dde5ff0da443b455e81aefc7bf6a216fdd659d1cbe13b8e8ac8129c3ecd27f89\", repo tag \"registry.k8s.io/kube-proxy:v1.31.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:bcbb293812bdf587b28ea98369a8c347ca84884160046296761acdf12b27029d\", size \"26914976\" in 1.384671165s" Jul 6 23:21:24.198156 containerd[1523]: time="2025-07-06T23:21:24.198116040Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.10\" returns image reference \"sha256:dde5ff0da443b455e81aefc7bf6a216fdd659d1cbe13b8e8ac8129c3ecd27f89\"" Jul 6 23:21:24.198583 containerd[1523]: time="2025-07-06T23:21:24.198534619Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 6 23:21:24.687248 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4181840705.mount: Deactivated successfully. 
Jul 6 23:21:25.310453 containerd[1523]: time="2025-07-06T23:21:25.310398233Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:25.311664 containerd[1523]: time="2025-07-06T23:21:25.311632296Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624" Jul 6 23:21:25.313044 containerd[1523]: time="2025-07-06T23:21:25.312632902Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:25.315903 containerd[1523]: time="2025-07-06T23:21:25.315831533Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:25.316840 containerd[1523]: time="2025-07-06T23:21:25.316801327Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.118235275s" Jul 6 23:21:25.316840 containerd[1523]: time="2025-07-06T23:21:25.316836442Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jul 6 23:21:25.317581 containerd[1523]: time="2025-07-06T23:21:25.317323672Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 6 23:21:25.743602 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount895746752.mount: Deactivated successfully. 
Jul 6 23:21:25.749053 containerd[1523]: time="2025-07-06T23:21:25.748993126Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 6 23:21:25.750755 containerd[1523]: time="2025-07-06T23:21:25.750698089Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Jul 6 23:21:25.751898 containerd[1523]: time="2025-07-06T23:21:25.751862043Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 6 23:21:25.753637 containerd[1523]: time="2025-07-06T23:21:25.753598180Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 6 23:21:25.754261 containerd[1523]: time="2025-07-06T23:21:25.754220079Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 436.868411ms" Jul 6 23:21:25.754261 containerd[1523]: time="2025-07-06T23:21:25.754252097Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jul 6 23:21:25.754784 containerd[1523]: time="2025-07-06T23:21:25.754753526Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Jul 6 23:21:26.256627 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1944176412.mount: Deactivated successfully. 
Jul 6 23:21:27.706262 containerd[1523]: time="2025-07-06T23:21:27.706211270Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:27.707066 containerd[1523]: time="2025-07-06T23:21:27.707011562Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406467" Jul 6 23:21:27.708495 containerd[1523]: time="2025-07-06T23:21:27.708462670Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:27.711625 containerd[1523]: time="2025-07-06T23:21:27.711571445Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:27.712178 containerd[1523]: time="2025-07-06T23:21:27.712155616Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 1.957370244s" Jul 6 23:21:27.712231 containerd[1523]: time="2025-07-06T23:21:27.712183174Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Jul 6 23:21:32.749690 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:32.749860 systemd[1]: kubelet.service: Consumed 141ms CPU time, 108M memory peak. Jul 6 23:21:32.752051 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:21:32.777778 systemd[1]: Reload requested from client PID 2193 ('systemctl') (unit session-7.scope)... Jul 6 23:21:32.777796 systemd[1]: Reloading... Jul 6 23:21:32.868050 zram_generator::config[2236]: No configuration found. Jul 6 23:21:32.992622 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:21:33.082120 systemd[1]: Reloading finished in 303 ms. Jul 6 23:21:33.131788 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:33.135375 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:21:33.136079 systemd[1]: kubelet.service: Deactivated successfully. Jul 6 23:21:33.136379 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:33.136453 systemd[1]: kubelet.service: Consumed 99ms CPU time, 95.1M memory peak. Jul 6 23:21:33.139219 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:21:33.311118 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:33.323435 (kubelet)[2284]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 6 23:21:33.361639 kubelet[2284]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
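The docker.socket message during the reload is systemd flagging a ListenStream= path under the legacy /var/run directory and rewriting it to /run/docker.sock on the fly. A drop-in could make that permanent; the following is only a sketch, and the drop-in file name is made up:

  # Hypothetical drop-in: clear ListenStream= and set the non-legacy path.
  mkdir -p /etc/systemd/system/docker.socket.d
  cat <<'EOF' > /etc/systemd/system/docker.socket.d/10-run-path.conf
  [Socket]
  ListenStream=
  ListenStream=/run/docker.sock
  EOF
  systemctl daemon-reload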
Jul 6 23:21:33.361639 kubelet[2284]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jul 6 23:21:33.361639 kubelet[2284]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 6 23:21:33.361639 kubelet[2284]: I0706 23:21:33.361553 2284 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 6 23:21:34.753147 kubelet[2284]: I0706 23:21:34.753095 2284 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 6 23:21:34.753147 kubelet[2284]: I0706 23:21:34.753132 2284 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 6 23:21:34.753506 kubelet[2284]: I0706 23:21:34.753372 2284 server.go:934] "Client rotation is on, will bootstrap in background" Jul 6 23:21:34.835022 kubelet[2284]: E0706 23:21:34.834981 2284 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.40:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:34.836636 kubelet[2284]: I0706 23:21:34.836607 2284 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 6 23:21:34.850619 kubelet[2284]: I0706 23:21:34.850576 2284 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 6 23:21:34.859586 kubelet[2284]: I0706 23:21:34.859547 2284 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 6 23:21:34.860844 kubelet[2284]: I0706 23:21:34.860808 2284 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jul 6 23:21:34.861024 kubelet[2284]: I0706 23:21:34.860980 2284 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 6 23:21:34.861283 kubelet[2284]: I0706 23:21:34.861032 2284 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 6 23:21:34.861283 kubelet[2284]: I0706 23:21:34.861283 2284 topology_manager.go:138] "Creating topology manager with none policy" Jul 6 23:21:34.861409 kubelet[2284]: I0706 23:21:34.861294 2284 container_manager_linux.go:300] "Creating device plugin manager" Jul 6 23:21:34.861710 kubelet[2284]: I0706 23:21:34.861685 2284 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:21:34.865668 kubelet[2284]: I0706 23:21:34.865209 2284 kubelet.go:408] "Attempting to sync node with API server" Jul 6 23:21:34.865668 kubelet[2284]: I0706 23:21:34.865246 2284 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 6 23:21:34.865668 kubelet[2284]: I0706 23:21:34.865269 2284 kubelet.go:314] "Adding apiserver pod source" Jul 6 23:21:34.865668 kubelet[2284]: I0706 23:21:34.865496 2284 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 6 23:21:34.868086 kubelet[2284]: W0706 23:21:34.868033 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused Jul 6 23:21:34.868173 kubelet[2284]: E0706 23:21:34.868093 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 
10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:34.868279 kubelet[2284]: W0706 23:21:34.868045 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused Jul 6 23:21:34.868279 kubelet[2284]: E0706 23:21:34.868256 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.40:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:34.871863 kubelet[2284]: I0706 23:21:34.871827 2284 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 6 23:21:34.872943 kubelet[2284]: I0706 23:21:34.872898 2284 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 6 23:21:34.873232 kubelet[2284]: W0706 23:21:34.873212 2284 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 6 23:21:34.874734 kubelet[2284]: I0706 23:21:34.874701 2284 server.go:1274] "Started kubelet" Jul 6 23:21:34.875234 kubelet[2284]: I0706 23:21:34.875191 2284 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jul 6 23:21:34.875461 kubelet[2284]: I0706 23:21:34.875412 2284 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 6 23:21:34.875788 kubelet[2284]: I0706 23:21:34.875766 2284 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 6 23:21:34.882178 kubelet[2284]: E0706 23:21:34.880849 2284 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 6 23:21:34.883946 kubelet[2284]: I0706 23:21:34.883917 2284 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 6 23:21:34.884123 kubelet[2284]: I0706 23:21:34.884100 2284 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 6 23:21:34.885648 kubelet[2284]: I0706 23:21:34.885613 2284 server.go:449] "Adding debug handlers to kubelet server" Jul 6 23:21:34.885767 kubelet[2284]: E0706 23:21:34.885737 2284 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 6 23:21:34.885796 kubelet[2284]: I0706 23:21:34.885783 2284 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 6 23:21:34.886041 kubelet[2284]: I0706 23:21:34.886000 2284 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 6 23:21:34.886082 kubelet[2284]: I0706 23:21:34.886075 2284 reconciler.go:26] "Reconciler: start to sync state" Jul 6 23:21:34.886706 kubelet[2284]: W0706 23:21:34.886638 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused Jul 6 23:21:34.886706 kubelet[2284]: E0706 23:21:34.886694 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:34.886925 kubelet[2284]: E0706 23:21:34.886789 2284 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.40:6443: connect: connection refused" interval="200ms" Jul 6 23:21:34.888508 kubelet[2284]: I0706 23:21:34.888480 2284 factory.go:221] Registration of the containerd container factory successfully Jul 6 23:21:34.888508 kubelet[2284]: I0706 23:21:34.888496 2284 factory.go:221] Registration of the systemd container factory successfully Jul 6 23:21:34.888603 kubelet[2284]: I0706 23:21:34.888589 2284 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 6 23:21:34.890793 kubelet[2284]: E0706 23:21:34.888873 2284 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.40:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.40:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.184fccec30dee997 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-07-06 23:21:34.874675607 +0000 UTC m=+1.546729456,LastTimestamp:2025-07-06 23:21:34.874675607 +0000 UTC m=+1.546729456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jul 6 23:21:34.902906 
kubelet[2284]: I0706 23:21:34.902878 2284 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 6 23:21:34.903069 kubelet[2284]: I0706 23:21:34.903057 2284 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 6 23:21:34.903162 kubelet[2284]: I0706 23:21:34.903067 2284 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 6 23:21:34.903248 kubelet[2284]: I0706 23:21:34.903228 2284 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:21:34.904406 kubelet[2284]: I0706 23:21:34.904367 2284 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 6 23:21:34.904406 kubelet[2284]: I0706 23:21:34.904391 2284 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 6 23:21:34.904406 kubelet[2284]: I0706 23:21:34.904409 2284 kubelet.go:2321] "Starting kubelet main sync loop" Jul 6 23:21:34.904541 kubelet[2284]: E0706 23:21:34.904445 2284 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 6 23:21:34.905395 kubelet[2284]: W0706 23:21:34.905287 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused Jul 6 23:21:34.905395 kubelet[2284]: E0706 23:21:34.905361 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:34.975276 kubelet[2284]: I0706 23:21:34.975243 2284 policy_none.go:49] "None policy: Start" Jul 6 23:21:34.976616 kubelet[2284]: I0706 23:21:34.976256 2284 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 6 23:21:34.976616 kubelet[2284]: I0706 23:21:34.976290 2284 state_mem.go:35] "Initializing new in-memory state store" Jul 6 23:21:34.984849 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 6 23:21:34.986080 kubelet[2284]: E0706 23:21:34.986006 2284 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Jul 6 23:21:34.998245 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jul 6 23:21:35.004584 kubelet[2284]: E0706 23:21:35.004487 2284 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jul 6 23:21:35.014829 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Jul 6 23:21:35.016225 kubelet[2284]: I0706 23:21:35.016194 2284 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 6 23:21:35.016414 kubelet[2284]: I0706 23:21:35.016397 2284 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 6 23:21:35.016451 kubelet[2284]: I0706 23:21:35.016416 2284 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 6 23:21:35.017048 kubelet[2284]: I0706 23:21:35.017007 2284 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 6 23:21:35.018939 kubelet[2284]: E0706 23:21:35.018912 2284 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jul 6 23:21:35.087271 kubelet[2284]: E0706 23:21:35.087221 2284 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.40:6443: connect: connection refused" interval="400ms" Jul 6 23:21:35.118749 kubelet[2284]: I0706 23:21:35.118701 2284 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jul 6 23:21:35.119364 kubelet[2284]: E0706 23:21:35.119323 2284 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.40:6443/api/v1/nodes\": dial tcp 10.0.0.40:6443: connect: connection refused" node="localhost" Jul 6 23:21:35.213074 systemd[1]: Created slice kubepods-burstable-pod1277ac01fa464956a74e90ee1635f629.slice - libcontainer container kubepods-burstable-pod1277ac01fa464956a74e90ee1635f629.slice. Jul 6 23:21:35.237905 systemd[1]: Created slice kubepods-burstable-pod3f04709fe51ae4ab5abd58e8da771b74.slice - libcontainer container kubepods-burstable-pod3f04709fe51ae4ab5abd58e8da771b74.slice. Jul 6 23:21:35.256391 systemd[1]: Created slice kubepods-burstable-podb35b56493416c25588cb530e37ffc065.slice - libcontainer container kubepods-burstable-podb35b56493416c25588cb530e37ffc065.slice. 
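The repeated "dial tcp 10.0.0.40:6443: connect: connection refused" errors are expected at this point: 10.0.0.40 appears to be this node's own address, and the API server the kubelet is trying to register with is itself one of the static pods whose sandboxes are only created further down. Once those containers are running, a quick hedged check from the node could be:

  # Hypothetical probe of the API server endpoint seen in the log;
  # -k skips TLS verification, -s silences progress output.
  curl -sk https://10.0.0.40:6443/healthz ; echo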
Jul 6 23:21:35.264898 kubelet[2284]: E0706 23:21:35.264770 2284 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.40:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.40:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.184fccec30dee997 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-07-06 23:21:34.874675607 +0000 UTC m=+1.546729456,LastTimestamp:2025-07-06 23:21:34.874675607 +0000 UTC m=+1.546729456,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jul 6 23:21:35.321483 kubelet[2284]: I0706 23:21:35.321433 2284 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jul 6 23:21:35.321886 kubelet[2284]: E0706 23:21:35.321845 2284 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.40:6443/api/v1/nodes\": dial tcp 10.0.0.40:6443: connect: connection refused" node="localhost" Jul 6 23:21:35.389341 kubelet[2284]: I0706 23:21:35.389067 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:35.389341 kubelet[2284]: I0706 23:21:35.389113 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:35.389341 kubelet[2284]: I0706 23:21:35.389133 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:35.389341 kubelet[2284]: I0706 23:21:35.389156 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b35b56493416c25588cb530e37ffc065-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"b35b56493416c25588cb530e37ffc065\") " pod="kube-system/kube-scheduler-localhost" Jul 6 23:21:35.389341 kubelet[2284]: I0706 23:21:35.389171 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:35.389559 kubelet[2284]: I0706 23:21:35.389187 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1277ac01fa464956a74e90ee1635f629-ca-certs\") pod 
\"kube-apiserver-localhost\" (UID: \"1277ac01fa464956a74e90ee1635f629\") " pod="kube-system/kube-apiserver-localhost" Jul 6 23:21:35.389559 kubelet[2284]: I0706 23:21:35.389203 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1277ac01fa464956a74e90ee1635f629-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"1277ac01fa464956a74e90ee1635f629\") " pod="kube-system/kube-apiserver-localhost" Jul 6 23:21:35.389559 kubelet[2284]: I0706 23:21:35.389219 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1277ac01fa464956a74e90ee1635f629-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"1277ac01fa464956a74e90ee1635f629\") " pod="kube-system/kube-apiserver-localhost" Jul 6 23:21:35.389559 kubelet[2284]: I0706 23:21:35.389235 2284 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:35.488239 kubelet[2284]: E0706 23:21:35.488183 2284 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.40:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.40:6443: connect: connection refused" interval="800ms" Jul 6 23:21:35.536370 containerd[1523]: time="2025-07-06T23:21:35.535769285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:1277ac01fa464956a74e90ee1635f629,Namespace:kube-system,Attempt:0,}" Jul 6 23:21:35.555152 containerd[1523]: time="2025-07-06T23:21:35.555081179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:3f04709fe51ae4ab5abd58e8da771b74,Namespace:kube-system,Attempt:0,}" Jul 6 23:21:35.557570 containerd[1523]: time="2025-07-06T23:21:35.557522406Z" level=info msg="connecting to shim 17d7d5bc2636e111dca9227080c354d47fad46850c30e5b92fec8b91d61f87da" address="unix:///run/containerd/s/79206a5ca50ac106222393da01848046bcee4d0112f4a5790cecb8a9cfacf0e2" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:21:35.560121 containerd[1523]: time="2025-07-06T23:21:35.559365914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:b35b56493416c25588cb530e37ffc065,Namespace:kube-system,Attempt:0,}" Jul 6 23:21:35.588332 systemd[1]: Started cri-containerd-17d7d5bc2636e111dca9227080c354d47fad46850c30e5b92fec8b91d61f87da.scope - libcontainer container 17d7d5bc2636e111dca9227080c354d47fad46850c30e5b92fec8b91d61f87da. 
Jul 6 23:21:35.598862 containerd[1523]: time="2025-07-06T23:21:35.598660368Z" level=info msg="connecting to shim 098289b1ebcddf95a73ef6039b0887dd1cf67ad41dde829b5cd897626d22265b" address="unix:///run/containerd/s/0ffa99e052d1326586a23d6b69f37b9bea5c6e85451452d7c7e567b9997e0816" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:21:35.614409 containerd[1523]: time="2025-07-06T23:21:35.614361397Z" level=info msg="connecting to shim 688b0f9c176b8033d63b843ad7012e85af662c434033a9376bfc3ba0f4e53608" address="unix:///run/containerd/s/7187ba03e46cbd892e452f445417b468cbd6ff457d4c5a2ccde236b767fa3f69" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:21:35.625206 systemd[1]: Started cri-containerd-098289b1ebcddf95a73ef6039b0887dd1cf67ad41dde829b5cd897626d22265b.scope - libcontainer container 098289b1ebcddf95a73ef6039b0887dd1cf67ad41dde829b5cd897626d22265b. Jul 6 23:21:35.648235 systemd[1]: Started cri-containerd-688b0f9c176b8033d63b843ad7012e85af662c434033a9376bfc3ba0f4e53608.scope - libcontainer container 688b0f9c176b8033d63b843ad7012e85af662c434033a9376bfc3ba0f4e53608. Jul 6 23:21:35.659460 containerd[1523]: time="2025-07-06T23:21:35.659408463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:1277ac01fa464956a74e90ee1635f629,Namespace:kube-system,Attempt:0,} returns sandbox id \"17d7d5bc2636e111dca9227080c354d47fad46850c30e5b92fec8b91d61f87da\"" Jul 6 23:21:35.668419 containerd[1523]: time="2025-07-06T23:21:35.668374677Z" level=info msg="CreateContainer within sandbox \"17d7d5bc2636e111dca9227080c354d47fad46850c30e5b92fec8b91d61f87da\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 6 23:21:35.672055 containerd[1523]: time="2025-07-06T23:21:35.671994155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:3f04709fe51ae4ab5abd58e8da771b74,Namespace:kube-system,Attempt:0,} returns sandbox id \"098289b1ebcddf95a73ef6039b0887dd1cf67ad41dde829b5cd897626d22265b\"" Jul 6 23:21:35.676144 containerd[1523]: time="2025-07-06T23:21:35.675272452Z" level=info msg="CreateContainer within sandbox \"098289b1ebcddf95a73ef6039b0887dd1cf67ad41dde829b5cd897626d22265b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 6 23:21:35.676814 containerd[1523]: time="2025-07-06T23:21:35.676785074Z" level=info msg="Container 43911af019ca9a6139858423884ded3fe4ddeb29c0d7b76ee0d705927465f8ae: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:21:35.683538 containerd[1523]: time="2025-07-06T23:21:35.683498859Z" level=info msg="Container 192a9b6d37c338b11b1637806892adabef0a24e1804ca133d7a0dab89b92fc29: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:21:35.688950 containerd[1523]: time="2025-07-06T23:21:35.688869349Z" level=info msg="CreateContainer within sandbox \"17d7d5bc2636e111dca9227080c354d47fad46850c30e5b92fec8b91d61f87da\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"43911af019ca9a6139858423884ded3fe4ddeb29c0d7b76ee0d705927465f8ae\"" Jul 6 23:21:35.689904 containerd[1523]: time="2025-07-06T23:21:35.689876269Z" level=info msg="StartContainer for \"43911af019ca9a6139858423884ded3fe4ddeb29c0d7b76ee0d705927465f8ae\"" Jul 6 23:21:35.692289 containerd[1523]: time="2025-07-06T23:21:35.692250077Z" level=info msg="CreateContainer within sandbox \"098289b1ebcddf95a73ef6039b0887dd1cf67ad41dde829b5cd897626d22265b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id 
\"192a9b6d37c338b11b1637806892adabef0a24e1804ca133d7a0dab89b92fc29\"" Jul 6 23:21:35.692830 containerd[1523]: time="2025-07-06T23:21:35.692803450Z" level=info msg="connecting to shim 43911af019ca9a6139858423884ded3fe4ddeb29c0d7b76ee0d705927465f8ae" address="unix:///run/containerd/s/79206a5ca50ac106222393da01848046bcee4d0112f4a5790cecb8a9cfacf0e2" protocol=ttrpc version=3 Jul 6 23:21:35.693221 containerd[1523]: time="2025-07-06T23:21:35.693192381Z" level=info msg="StartContainer for \"192a9b6d37c338b11b1637806892adabef0a24e1804ca133d7a0dab89b92fc29\"" Jul 6 23:21:35.694431 containerd[1523]: time="2025-07-06T23:21:35.694267360Z" level=info msg="connecting to shim 192a9b6d37c338b11b1637806892adabef0a24e1804ca133d7a0dab89b92fc29" address="unix:///run/containerd/s/0ffa99e052d1326586a23d6b69f37b9bea5c6e85451452d7c7e567b9997e0816" protocol=ttrpc version=3 Jul 6 23:21:35.697780 containerd[1523]: time="2025-07-06T23:21:35.697743508Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:b35b56493416c25588cb530e37ffc065,Namespace:kube-system,Attempt:0,} returns sandbox id \"688b0f9c176b8033d63b843ad7012e85af662c434033a9376bfc3ba0f4e53608\"" Jul 6 23:21:35.701326 containerd[1523]: time="2025-07-06T23:21:35.701277580Z" level=info msg="CreateContainer within sandbox \"688b0f9c176b8033d63b843ad7012e85af662c434033a9376bfc3ba0f4e53608\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 6 23:21:35.712707 containerd[1523]: time="2025-07-06T23:21:35.712650570Z" level=info msg="Container b7268b849def62aab5a1efe063fc24a2dc4caa37dbe8b0886d7af9f409e035bd: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:21:35.718719 containerd[1523]: time="2025-07-06T23:21:35.718657917Z" level=info msg="CreateContainer within sandbox \"688b0f9c176b8033d63b843ad7012e85af662c434033a9376bfc3ba0f4e53608\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"b7268b849def62aab5a1efe063fc24a2dc4caa37dbe8b0886d7af9f409e035bd\"" Jul 6 23:21:35.719234 containerd[1523]: time="2025-07-06T23:21:35.719203358Z" level=info msg="StartContainer for \"b7268b849def62aab5a1efe063fc24a2dc4caa37dbe8b0886d7af9f409e035bd\"" Jul 6 23:21:35.719239 systemd[1]: Started cri-containerd-192a9b6d37c338b11b1637806892adabef0a24e1804ca133d7a0dab89b92fc29.scope - libcontainer container 192a9b6d37c338b11b1637806892adabef0a24e1804ca133d7a0dab89b92fc29. Jul 6 23:21:35.720929 containerd[1523]: time="2025-07-06T23:21:35.720855025Z" level=info msg="connecting to shim b7268b849def62aab5a1efe063fc24a2dc4caa37dbe8b0886d7af9f409e035bd" address="unix:///run/containerd/s/7187ba03e46cbd892e452f445417b468cbd6ff457d4c5a2ccde236b767fa3f69" protocol=ttrpc version=3 Jul 6 23:21:35.723515 systemd[1]: Started cri-containerd-43911af019ca9a6139858423884ded3fe4ddeb29c0d7b76ee0d705927465f8ae.scope - libcontainer container 43911af019ca9a6139858423884ded3fe4ddeb29c0d7b76ee0d705927465f8ae. Jul 6 23:21:35.725030 kubelet[2284]: I0706 23:21:35.724958 2284 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jul 6 23:21:35.725446 kubelet[2284]: E0706 23:21:35.725387 2284 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.40:6443/api/v1/nodes\": dial tcp 10.0.0.40:6443: connect: connection refused" node="localhost" Jul 6 23:21:35.746355 systemd[1]: Started cri-containerd-b7268b849def62aab5a1efe063fc24a2dc4caa37dbe8b0886d7af9f409e035bd.scope - libcontainer container b7268b849def62aab5a1efe063fc24a2dc4caa37dbe8b0886d7af9f409e035bd. 
Jul 6 23:21:35.775461 kubelet[2284]: W0706 23:21:35.774558 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused Jul 6 23:21:35.775461 kubelet[2284]: E0706 23:21:35.774709 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.40:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:35.777778 containerd[1523]: time="2025-07-06T23:21:35.776644474Z" level=info msg="StartContainer for \"192a9b6d37c338b11b1637806892adabef0a24e1804ca133d7a0dab89b92fc29\" returns successfully" Jul 6 23:21:35.815474 containerd[1523]: time="2025-07-06T23:21:35.811515950Z" level=info msg="StartContainer for \"43911af019ca9a6139858423884ded3fe4ddeb29c0d7b76ee0d705927465f8ae\" returns successfully" Jul 6 23:21:35.824763 containerd[1523]: time="2025-07-06T23:21:35.823598863Z" level=info msg="StartContainer for \"b7268b849def62aab5a1efe063fc24a2dc4caa37dbe8b0886d7af9f409e035bd\" returns successfully" Jul 6 23:21:35.835628 kubelet[2284]: W0706 23:21:35.835531 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused Jul 6 23:21:35.835628 kubelet[2284]: E0706 23:21:35.835626 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.40:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:35.929798 kubelet[2284]: W0706 23:21:35.929456 2284 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.40:6443: connect: connection refused Jul 6 23:21:35.929798 kubelet[2284]: E0706 23:21:35.929528 2284 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.40:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.40:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:21:36.528431 kubelet[2284]: I0706 23:21:36.528389 2284 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jul 6 23:21:38.080089 kubelet[2284]: E0706 23:21:38.080038 2284 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Jul 6 23:21:38.152789 kubelet[2284]: I0706 23:21:38.152737 2284 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Jul 6 23:21:38.869804 kubelet[2284]: I0706 23:21:38.869728 2284 apiserver.go:52] "Watching apiserver" Jul 6 23:21:38.887365 kubelet[2284]: I0706 23:21:38.887305 2284 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 6 23:21:40.294286 systemd[1]: Reload requested from client PID 2553 ('systemctl') (unit session-7.scope)... 
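[editor's note] The entries above walk through the CRI sequence the kubelet drives for each static pod: RunPodSandbox, then CreateContainer inside the returned sandbox, then StartContainer. The sketch below replays the same three calls by hand through crictl from Python; it is purely illustrative, and the two JSON config file names are hypothetical placeholders, not files present on this host.

    import subprocess

    # Rough sketch of the RunPodSandbox -> CreateContainer -> StartContainer sequence
    # logged above, driven manually via crictl instead of the kubelet.
    def run(*args: str) -> str:
        return subprocess.run(args, check=True, capture_output=True, text=True).stdout.strip()

    pod_id = run("crictl", "runp", "pod-sandbox.json")                               # RunPodSandbox
    ctr_id = run("crictl", "create", pod_id, "container.json", "pod-sandbox.json")   # CreateContainer
    run("crictl", "start", ctr_id)                                                   # StartContainer
    print(f"sandbox={pod_id} container={ctr_id}")
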
Jul 6 23:21:40.294306 systemd[1]: Reloading... Jul 6 23:21:40.378047 zram_generator::config[2599]: No configuration found. Jul 6 23:21:40.523281 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:21:40.639895 systemd[1]: Reloading finished in 345 ms. Jul 6 23:21:40.668365 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:21:40.680213 systemd[1]: kubelet.service: Deactivated successfully. Jul 6 23:21:40.680510 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:40.680593 systemd[1]: kubelet.service: Consumed 1.991s CPU time, 129.3M memory peak. Jul 6 23:21:40.683481 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:21:40.854980 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:21:40.859313 (kubelet)[2638]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 6 23:21:40.900675 kubelet[2638]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 6 23:21:40.900675 kubelet[2638]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jul 6 23:21:40.900675 kubelet[2638]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 6 23:21:40.900675 kubelet[2638]: I0706 23:21:40.900642 2638 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 6 23:21:40.912694 kubelet[2638]: I0706 23:21:40.912532 2638 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 6 23:21:40.913165 kubelet[2638]: I0706 23:21:40.913074 2638 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 6 23:21:40.913540 kubelet[2638]: I0706 23:21:40.913414 2638 server.go:934] "Client rotation is on, will bootstrap in background" Jul 6 23:21:40.915061 kubelet[2638]: I0706 23:21:40.914979 2638 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jul 6 23:21:40.918276 kubelet[2638]: I0706 23:21:40.918247 2638 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 6 23:21:40.924693 kubelet[2638]: I0706 23:21:40.924660 2638 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 6 23:21:40.928594 kubelet[2638]: I0706 23:21:40.928534 2638 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 6 23:21:40.928777 kubelet[2638]: I0706 23:21:40.928743 2638 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jul 6 23:21:40.928934 kubelet[2638]: I0706 23:21:40.928897 2638 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 6 23:21:40.931392 kubelet[2638]: I0706 23:21:40.928926 2638 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 6 23:21:40.931491 kubelet[2638]: I0706 23:21:40.931398 2638 topology_manager.go:138] "Creating topology manager with none policy" Jul 6 23:21:40.931491 kubelet[2638]: I0706 23:21:40.931412 2638 container_manager_linux.go:300] "Creating device plugin manager" Jul 6 23:21:40.931491 kubelet[2638]: I0706 23:21:40.931463 2638 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:21:40.931623 kubelet[2638]: I0706 23:21:40.931607 2638 kubelet.go:408] "Attempting to sync node with API server" Jul 6 23:21:40.931623 kubelet[2638]: I0706 23:21:40.931631 2638 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 6 23:21:40.932487 kubelet[2638]: I0706 23:21:40.932465 2638 kubelet.go:314] "Adding apiserver pod source" Jul 6 23:21:40.934072 kubelet[2638]: I0706 23:21:40.934034 2638 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 6 23:21:40.937172 kubelet[2638]: I0706 23:21:40.937042 2638 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 6 23:21:40.942143 kubelet[2638]: I0706 23:21:40.941771 2638 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 6 23:21:40.944671 kubelet[2638]: I0706 23:21:40.944641 2638 server.go:1274] "Started kubelet" Jul 6 23:21:40.945565 kubelet[2638]: I0706 23:21:40.945534 2638 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jul 6 23:21:40.947674 kubelet[2638]: I0706 23:21:40.945216 
2638 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 6 23:21:40.950042 kubelet[2638]: I0706 23:21:40.948670 2638 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 6 23:21:40.950042 kubelet[2638]: I0706 23:21:40.949778 2638 server.go:449] "Adding debug handlers to kubelet server" Jul 6 23:21:40.950042 kubelet[2638]: I0706 23:21:40.949879 2638 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 6 23:21:40.951737 kubelet[2638]: I0706 23:21:40.951384 2638 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 6 23:21:40.953914 kubelet[2638]: I0706 23:21:40.953886 2638 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 6 23:21:40.954010 kubelet[2638]: I0706 23:21:40.953994 2638 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 6 23:21:40.954154 kubelet[2638]: I0706 23:21:40.954131 2638 reconciler.go:26] "Reconciler: start to sync state" Jul 6 23:21:40.954737 kubelet[2638]: I0706 23:21:40.954711 2638 factory.go:221] Registration of the systemd container factory successfully Jul 6 23:21:40.954882 kubelet[2638]: I0706 23:21:40.954847 2638 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 6 23:21:40.956305 kubelet[2638]: E0706 23:21:40.956272 2638 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 6 23:21:40.959334 kubelet[2638]: I0706 23:21:40.959271 2638 factory.go:221] Registration of the containerd container factory successfully Jul 6 23:21:40.966456 kubelet[2638]: I0706 23:21:40.966403 2638 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 6 23:21:40.967719 kubelet[2638]: I0706 23:21:40.967644 2638 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jul 6 23:21:40.967719 kubelet[2638]: I0706 23:21:40.967675 2638 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 6 23:21:40.967719 kubelet[2638]: I0706 23:21:40.967695 2638 kubelet.go:2321] "Starting kubelet main sync loop" Jul 6 23:21:40.967870 kubelet[2638]: E0706 23:21:40.967741 2638 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 6 23:21:41.000841 kubelet[2638]: I0706 23:21:41.000799 2638 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 6 23:21:41.001454 kubelet[2638]: I0706 23:21:41.000998 2638 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 6 23:21:41.001454 kubelet[2638]: I0706 23:21:41.001070 2638 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:21:41.001454 kubelet[2638]: I0706 23:21:41.001288 2638 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 6 23:21:41.001454 kubelet[2638]: I0706 23:21:41.001301 2638 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 6 23:21:41.001454 kubelet[2638]: I0706 23:21:41.001321 2638 policy_none.go:49] "None policy: Start" Jul 6 23:21:41.002208 kubelet[2638]: I0706 23:21:41.002187 2638 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 6 23:21:41.002268 kubelet[2638]: I0706 23:21:41.002219 2638 state_mem.go:35] "Initializing new in-memory state store" Jul 6 23:21:41.002410 kubelet[2638]: I0706 23:21:41.002394 2638 state_mem.go:75] "Updated machine memory state" Jul 6 23:21:41.010191 kubelet[2638]: I0706 23:21:41.010158 2638 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 6 23:21:41.010386 kubelet[2638]: I0706 23:21:41.010361 2638 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 6 23:21:41.010446 kubelet[2638]: I0706 23:21:41.010381 2638 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 6 23:21:41.010660 kubelet[2638]: I0706 23:21:41.010611 2638 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 6 23:21:41.083050 kubelet[2638]: E0706 23:21:41.082969 2638 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Jul 6 23:21:41.115034 kubelet[2638]: I0706 23:21:41.114952 2638 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Jul 6 23:21:41.123623 kubelet[2638]: I0706 23:21:41.123585 2638 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Jul 6 23:21:41.123754 kubelet[2638]: I0706 23:21:41.123690 2638 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Jul 6 23:21:41.155981 kubelet[2638]: I0706 23:21:41.155824 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:41.155981 kubelet[2638]: I0706 23:21:41.155866 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: 
\"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:41.155981 kubelet[2638]: I0706 23:21:41.155890 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b35b56493416c25588cb530e37ffc065-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"b35b56493416c25588cb530e37ffc065\") " pod="kube-system/kube-scheduler-localhost" Jul 6 23:21:41.155981 kubelet[2638]: I0706 23:21:41.155918 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1277ac01fa464956a74e90ee1635f629-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"1277ac01fa464956a74e90ee1635f629\") " pod="kube-system/kube-apiserver-localhost" Jul 6 23:21:41.155981 kubelet[2638]: I0706 23:21:41.155940 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1277ac01fa464956a74e90ee1635f629-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"1277ac01fa464956a74e90ee1635f629\") " pod="kube-system/kube-apiserver-localhost" Jul 6 23:21:41.156252 kubelet[2638]: I0706 23:21:41.155983 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:41.156252 kubelet[2638]: I0706 23:21:41.156075 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:41.156252 kubelet[2638]: I0706 23:21:41.156100 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3f04709fe51ae4ab5abd58e8da771b74-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"3f04709fe51ae4ab5abd58e8da771b74\") " pod="kube-system/kube-controller-manager-localhost" Jul 6 23:21:41.156252 kubelet[2638]: I0706 23:21:41.156118 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1277ac01fa464956a74e90ee1635f629-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"1277ac01fa464956a74e90ee1635f629\") " pod="kube-system/kube-apiserver-localhost" Jul 6 23:21:41.935670 kubelet[2638]: I0706 23:21:41.935207 2638 apiserver.go:52] "Watching apiserver" Jul 6 23:21:41.955291 kubelet[2638]: I0706 23:21:41.955135 2638 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 6 23:21:42.003046 kubelet[2638]: E0706 23:21:42.001481 2638 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Jul 6 23:21:42.016037 kubelet[2638]: I0706 23:21:42.015778 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.015741723 
podStartE2EDuration="3.015741723s" podCreationTimestamp="2025-07-06 23:21:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:21:42.015315437 +0000 UTC m=+1.153033578" watchObservedRunningTime="2025-07-06 23:21:42.015741723 +0000 UTC m=+1.153459824" Jul 6 23:21:42.054243 kubelet[2638]: I0706 23:21:42.054122 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.053155449 podStartE2EDuration="1.053155449s" podCreationTimestamp="2025-07-06 23:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:21:42.027005748 +0000 UTC m=+1.164723849" watchObservedRunningTime="2025-07-06 23:21:42.053155449 +0000 UTC m=+1.190873590" Jul 6 23:21:42.054625 kubelet[2638]: I0706 23:21:42.054547 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.054535126 podStartE2EDuration="1.054535126s" podCreationTimestamp="2025-07-06 23:21:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:21:42.052358389 +0000 UTC m=+1.190076530" watchObservedRunningTime="2025-07-06 23:21:42.054535126 +0000 UTC m=+1.192253267" Jul 6 23:21:45.131495 kubelet[2638]: I0706 23:21:45.131464 2638 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 6 23:21:45.132417 containerd[1523]: time="2025-07-06T23:21:45.132361929Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 6 23:21:45.133571 kubelet[2638]: I0706 23:21:45.133528 2638 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 6 23:21:45.621672 systemd[1]: Created slice kubepods-besteffort-podd9972ca4_60ad_413c_a2ae_32bfb3c2e7c5.slice - libcontainer container kubepods-besteffort-podd9972ca4_60ad_413c_a2ae_32bfb3c2e7c5.slice. 
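[editor's note] The podStartSLOduration figures in the pod_startup_latency_tracker entries above are consistent with a plain subtraction of podCreationTimestamp from watchObservedRunningTime; the arithmetic below checks this for kube-scheduler-localhost (nanoseconds truncated to microseconds, which is all Python's datetime carries). Treat the interpretation of the fields as an observation from these log lines, not a statement about the tracker's internals.

    from datetime import datetime, timezone

    created  = datetime(2025, 7, 6, 23, 21, 39, 0, tzinfo=timezone.utc)       # podCreationTimestamp
    observed = datetime(2025, 7, 6, 23, 21, 42, 15741, tzinfo=timezone.utc)   # watchObservedRunningTime 23:21:42.015741723
    print((observed - created).total_seconds())  # ~3.015741, matching podStartSLOduration=3.015741723s
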
Jul 6 23:21:45.682456 kubelet[2638]: I0706 23:21:45.682408 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5-kube-proxy\") pod \"kube-proxy-9zjz8\" (UID: \"d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5\") " pod="kube-system/kube-proxy-9zjz8" Jul 6 23:21:45.682456 kubelet[2638]: I0706 23:21:45.682460 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5gfg\" (UniqueName: \"kubernetes.io/projected/d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5-kube-api-access-j5gfg\") pod \"kube-proxy-9zjz8\" (UID: \"d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5\") " pod="kube-system/kube-proxy-9zjz8" Jul 6 23:21:45.682794 kubelet[2638]: I0706 23:21:45.682498 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5-xtables-lock\") pod \"kube-proxy-9zjz8\" (UID: \"d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5\") " pod="kube-system/kube-proxy-9zjz8" Jul 6 23:21:45.682794 kubelet[2638]: I0706 23:21:45.682519 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5-lib-modules\") pod \"kube-proxy-9zjz8\" (UID: \"d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5\") " pod="kube-system/kube-proxy-9zjz8" Jul 6 23:21:45.791686 kubelet[2638]: E0706 23:21:45.791620 2638 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Jul 6 23:21:45.791686 kubelet[2638]: E0706 23:21:45.791653 2638 projected.go:194] Error preparing data for projected volume kube-api-access-j5gfg for pod kube-system/kube-proxy-9zjz8: configmap "kube-root-ca.crt" not found Jul 6 23:21:45.791992 kubelet[2638]: E0706 23:21:45.791927 2638 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5-kube-api-access-j5gfg podName:d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5 nodeName:}" failed. No retries permitted until 2025-07-06 23:21:46.29190095 +0000 UTC m=+5.429619091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-j5gfg" (UniqueName: "kubernetes.io/projected/d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5-kube-api-access-j5gfg") pod "kube-proxy-9zjz8" (UID: "d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5") : configmap "kube-root-ca.crt" not found Jul 6 23:21:46.238991 systemd[1]: Created slice kubepods-besteffort-podd0acdc2c_45c0_4d95_b2fc_4e3606250388.slice - libcontainer container kubepods-besteffort-podd0acdc2c_45c0_4d95_b2fc_4e3606250388.slice. 
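[editor's note] The MountVolume.SetUp failure above ("No retries permitted until ... durationBeforeRetry 500ms") shows the volume manager backing off before retrying the projected kube-api-access mount while the kube-root-ca.crt configmap does not exist yet. The loop below is an illustrative sketch of that retry-with-backoff pattern only; the doubling factor and the two-minute cap are assumptions, not values read from this log.

    import time

    def mount_with_backoff(mount_fn, initial=0.5, factor=2.0, cap=120.0):
        delay = initial
        while True:
            try:
                return mount_fn()
            except RuntimeError as err:  # e.g. configmap "kube-root-ca.crt" not found yet
                print(f"mount failed: {err}; no retries permitted for {delay:.1f}s")
                time.sleep(delay)
                delay = min(delay * factor, cap)
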
Jul 6 23:21:46.288924 kubelet[2638]: I0706 23:21:46.288865 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d0acdc2c-45c0-4d95-b2fc-4e3606250388-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-n4fbl\" (UID: \"d0acdc2c-45c0-4d95-b2fc-4e3606250388\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-n4fbl" Jul 6 23:21:46.288924 kubelet[2638]: I0706 23:21:46.288914 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7n5c\" (UniqueName: \"kubernetes.io/projected/d0acdc2c-45c0-4d95-b2fc-4e3606250388-kube-api-access-m7n5c\") pod \"tigera-operator-5bf8dfcb4-n4fbl\" (UID: \"d0acdc2c-45c0-4d95-b2fc-4e3606250388\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-n4fbl" Jul 6 23:21:46.534918 containerd[1523]: time="2025-07-06T23:21:46.534788609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9zjz8,Uid:d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5,Namespace:kube-system,Attempt:0,}" Jul 6 23:21:46.542788 containerd[1523]: time="2025-07-06T23:21:46.542740660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-n4fbl,Uid:d0acdc2c-45c0-4d95-b2fc-4e3606250388,Namespace:tigera-operator,Attempt:0,}" Jul 6 23:21:46.593845 containerd[1523]: time="2025-07-06T23:21:46.593803503Z" level=info msg="connecting to shim c3a3d4925bd97b7eacd81661949ccac8f716f998196bd57c4e324815d4385fa9" address="unix:///run/containerd/s/25ad7c18d18275981ea6c84835ba70a9b7f1b7c27469471b8ed636c107603a70" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:21:46.595539 containerd[1523]: time="2025-07-06T23:21:46.595503906Z" level=info msg="connecting to shim 1b61a8c40f9ad5261b2b0159c6069bdae10d6175fcfde75d29efa1a76f432037" address="unix:///run/containerd/s/d50a3215a4208eeafff31ffe2642c7bf833fd88c67c16b0a9f5c6386106b13f3" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:21:46.619239 systemd[1]: Started cri-containerd-c3a3d4925bd97b7eacd81661949ccac8f716f998196bd57c4e324815d4385fa9.scope - libcontainer container c3a3d4925bd97b7eacd81661949ccac8f716f998196bd57c4e324815d4385fa9. Jul 6 23:21:46.623640 systemd[1]: Started cri-containerd-1b61a8c40f9ad5261b2b0159c6069bdae10d6175fcfde75d29efa1a76f432037.scope - libcontainer container 1b61a8c40f9ad5261b2b0159c6069bdae10d6175fcfde75d29efa1a76f432037. 
Jul 6 23:21:46.652070 containerd[1523]: time="2025-07-06T23:21:46.651134844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9zjz8,Uid:d9972ca4-60ad-413c-a2ae-32bfb3c2e7c5,Namespace:kube-system,Attempt:0,} returns sandbox id \"c3a3d4925bd97b7eacd81661949ccac8f716f998196bd57c4e324815d4385fa9\"" Jul 6 23:21:46.656081 containerd[1523]: time="2025-07-06T23:21:46.656044140Z" level=info msg="CreateContainer within sandbox \"c3a3d4925bd97b7eacd81661949ccac8f716f998196bd57c4e324815d4385fa9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 6 23:21:46.669866 containerd[1523]: time="2025-07-06T23:21:46.669821468Z" level=info msg="Container 1befa6a1bd3aacced74cc9ba8e9fa66fb26d9df5eacc2c3f83e4e6de09048d31: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:21:46.670572 containerd[1523]: time="2025-07-06T23:21:46.670543333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-n4fbl,Uid:d0acdc2c-45c0-4d95-b2fc-4e3606250388,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"1b61a8c40f9ad5261b2b0159c6069bdae10d6175fcfde75d29efa1a76f432037\"" Jul 6 23:21:46.673325 containerd[1523]: time="2025-07-06T23:21:46.673287232Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 6 23:21:46.678585 containerd[1523]: time="2025-07-06T23:21:46.678521240Z" level=info msg="CreateContainer within sandbox \"c3a3d4925bd97b7eacd81661949ccac8f716f998196bd57c4e324815d4385fa9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1befa6a1bd3aacced74cc9ba8e9fa66fb26d9df5eacc2c3f83e4e6de09048d31\"" Jul 6 23:21:46.679374 containerd[1523]: time="2025-07-06T23:21:46.679342044Z" level=info msg="StartContainer for \"1befa6a1bd3aacced74cc9ba8e9fa66fb26d9df5eacc2c3f83e4e6de09048d31\"" Jul 6 23:21:46.681078 containerd[1523]: time="2025-07-06T23:21:46.681036483Z" level=info msg="connecting to shim 1befa6a1bd3aacced74cc9ba8e9fa66fb26d9df5eacc2c3f83e4e6de09048d31" address="unix:///run/containerd/s/25ad7c18d18275981ea6c84835ba70a9b7f1b7c27469471b8ed636c107603a70" protocol=ttrpc version=3 Jul 6 23:21:46.704258 systemd[1]: Started cri-containerd-1befa6a1bd3aacced74cc9ba8e9fa66fb26d9df5eacc2c3f83e4e6de09048d31.scope - libcontainer container 1befa6a1bd3aacced74cc9ba8e9fa66fb26d9df5eacc2c3f83e4e6de09048d31. Jul 6 23:21:46.745922 containerd[1523]: time="2025-07-06T23:21:46.745851999Z" level=info msg="StartContainer for \"1befa6a1bd3aacced74cc9ba8e9fa66fb26d9df5eacc2c3f83e4e6de09048d31\" returns successfully" Jul 6 23:21:47.026280 kubelet[2638]: I0706 23:21:47.026211 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9zjz8" podStartSLOduration=2.026194373 podStartE2EDuration="2.026194373s" podCreationTimestamp="2025-07-06 23:21:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:21:47.025928704 +0000 UTC m=+6.163646805" watchObservedRunningTime="2025-07-06 23:21:47.026194373 +0000 UTC m=+6.163912514" Jul 6 23:21:47.831759 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2130152809.mount: Deactivated successfully. 
Jul 6 23:21:48.361736 containerd[1523]: time="2025-07-06T23:21:48.361682110Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:48.362204 containerd[1523]: time="2025-07-06T23:21:48.362174690Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610" Jul 6 23:21:48.375045 containerd[1523]: time="2025-07-06T23:21:48.374942550Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:48.379519 containerd[1523]: time="2025-07-06T23:21:48.379452931Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:21:48.380554 containerd[1523]: time="2025-07-06T23:21:48.380165628Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 1.706839573s" Jul 6 23:21:48.380554 containerd[1523]: time="2025-07-06T23:21:48.380211212Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\"" Jul 6 23:21:48.382185 containerd[1523]: time="2025-07-06T23:21:48.382152357Z" level=info msg="CreateContainer within sandbox \"1b61a8c40f9ad5261b2b0159c6069bdae10d6175fcfde75d29efa1a76f432037\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 6 23:21:48.391396 containerd[1523]: time="2025-07-06T23:21:48.391358096Z" level=info msg="Container bf1c694eb821fbbac4d67906c1fb90501031fd63058c3708576aa1714ba03737: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:21:48.396438 containerd[1523]: time="2025-07-06T23:21:48.396387872Z" level=info msg="CreateContainer within sandbox \"1b61a8c40f9ad5261b2b0159c6069bdae10d6175fcfde75d29efa1a76f432037\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"bf1c694eb821fbbac4d67906c1fb90501031fd63058c3708576aa1714ba03737\"" Jul 6 23:21:48.397465 containerd[1523]: time="2025-07-06T23:21:48.397419657Z" level=info msg="StartContainer for \"bf1c694eb821fbbac4d67906c1fb90501031fd63058c3708576aa1714ba03737\"" Jul 6 23:21:48.398664 containerd[1523]: time="2025-07-06T23:21:48.398622211Z" level=info msg="connecting to shim bf1c694eb821fbbac4d67906c1fb90501031fd63058c3708576aa1714ba03737" address="unix:///run/containerd/s/d50a3215a4208eeafff31ffe2642c7bf833fd88c67c16b0a9f5c6386106b13f3" protocol=ttrpc version=3 Jul 6 23:21:48.418201 systemd[1]: Started cri-containerd-bf1c694eb821fbbac4d67906c1fb90501031fd63058c3708576aa1714ba03737.scope - libcontainer container bf1c694eb821fbbac4d67906c1fb90501031fd63058c3708576aa1714ba03737. 
Jul 6 23:21:48.448568 containerd[1523]: time="2025-07-06T23:21:48.448516352Z" level=info msg="StartContainer for \"bf1c694eb821fbbac4d67906c1fb90501031fd63058c3708576aa1714ba03737\" returns successfully" Jul 6 23:21:49.021384 kubelet[2638]: I0706 23:21:49.021327 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-n4fbl" podStartSLOduration=1.313156577 podStartE2EDuration="3.02130876s" podCreationTimestamp="2025-07-06 23:21:46 +0000 UTC" firstStartedPulling="2025-07-06 23:21:46.672716976 +0000 UTC m=+5.810435117" lastFinishedPulling="2025-07-06 23:21:48.380869159 +0000 UTC m=+7.518587300" observedRunningTime="2025-07-06 23:21:49.021271381 +0000 UTC m=+8.158989522" watchObservedRunningTime="2025-07-06 23:21:49.02130876 +0000 UTC m=+8.159026901" Jul 6 23:21:53.906231 sudo[1735]: pam_unix(sudo:session): session closed for user root Jul 6 23:21:53.914778 sshd[1734]: Connection closed by 10.0.0.1 port 43392 Jul 6 23:21:53.915490 sshd-session[1732]: pam_unix(sshd:session): session closed for user core Jul 6 23:21:53.920355 systemd[1]: sshd@6-10.0.0.40:22-10.0.0.1:43392.service: Deactivated successfully. Jul 6 23:21:53.922361 systemd[1]: session-7.scope: Deactivated successfully. Jul 6 23:21:53.924055 systemd[1]: session-7.scope: Consumed 7.268s CPU time, 231.7M memory peak. Jul 6 23:21:53.925138 systemd-logind[1505]: Session 7 logged out. Waiting for processes to exit. Jul 6 23:21:53.928124 systemd-logind[1505]: Removed session 7. Jul 6 23:21:55.411347 update_engine[1512]: I20250706 23:21:55.410038 1512 update_attempter.cc:509] Updating boot flags... Jul 6 23:21:59.085190 systemd[1]: Created slice kubepods-besteffort-pod50915574_4290_4e31_a012_92be81cdeac6.slice - libcontainer container kubepods-besteffort-pod50915574_4290_4e31_a012_92be81cdeac6.slice. Jul 6 23:21:59.177479 kubelet[2638]: I0706 23:21:59.177429 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50915574-4290-4e31-a012-92be81cdeac6-tigera-ca-bundle\") pod \"calico-typha-66654bd6bd-kxk6n\" (UID: \"50915574-4290-4e31-a012-92be81cdeac6\") " pod="calico-system/calico-typha-66654bd6bd-kxk6n" Jul 6 23:21:59.177479 kubelet[2638]: I0706 23:21:59.177480 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/50915574-4290-4e31-a012-92be81cdeac6-typha-certs\") pod \"calico-typha-66654bd6bd-kxk6n\" (UID: \"50915574-4290-4e31-a012-92be81cdeac6\") " pod="calico-system/calico-typha-66654bd6bd-kxk6n" Jul 6 23:21:59.178398 kubelet[2638]: I0706 23:21:59.177500 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzr9p\" (UniqueName: \"kubernetes.io/projected/50915574-4290-4e31-a012-92be81cdeac6-kube-api-access-wzr9p\") pod \"calico-typha-66654bd6bd-kxk6n\" (UID: \"50915574-4290-4e31-a012-92be81cdeac6\") " pod="calico-system/calico-typha-66654bd6bd-kxk6n" Jul 6 23:21:59.338044 systemd[1]: Created slice kubepods-besteffort-pod29326b10_e934_4521_b93d_655c64f76e87.slice - libcontainer container kubepods-besteffort-pod29326b10_e934_4521_b93d_655c64f76e87.slice. 
Jul 6 23:21:59.378913 kubelet[2638]: I0706 23:21:59.378660 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-flexvol-driver-host\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.378913 kubelet[2638]: I0706 23:21:59.378706 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-cni-net-dir\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.379343 kubelet[2638]: I0706 23:21:59.379272 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-lib-modules\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.379485 kubelet[2638]: I0706 23:21:59.379450 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-policysync\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.379662 kubelet[2638]: I0706 23:21:59.379609 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-cni-bin-dir\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.379829 kubelet[2638]: I0706 23:21:59.379636 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-xtables-lock\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.379829 kubelet[2638]: I0706 23:21:59.379767 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmmdb\" (UniqueName: \"kubernetes.io/projected/29326b10-e934-4521-b93d-655c64f76e87-kube-api-access-bmmdb\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.379829 kubelet[2638]: I0706 23:21:59.379791 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-var-lib-calico\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.380072 kubelet[2638]: I0706 23:21:59.380007 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-cni-log-dir\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.380240 kubelet[2638]: I0706 23:21:59.380135 2638 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29326b10-e934-4521-b93d-655c64f76e87-tigera-ca-bundle\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.380392 kubelet[2638]: I0706 23:21:59.380311 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/29326b10-e934-4521-b93d-655c64f76e87-node-certs\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.380520 kubelet[2638]: I0706 23:21:59.380476 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/29326b10-e934-4521-b93d-655c64f76e87-var-run-calico\") pod \"calico-node-9x4qn\" (UID: \"29326b10-e934-4521-b93d-655c64f76e87\") " pod="calico-system/calico-node-9x4qn" Jul 6 23:21:59.392060 containerd[1523]: time="2025-07-06T23:21:59.391972467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-66654bd6bd-kxk6n,Uid:50915574-4290-4e31-a012-92be81cdeac6,Namespace:calico-system,Attempt:0,}" Jul 6 23:21:59.430933 containerd[1523]: time="2025-07-06T23:21:59.430853698Z" level=info msg="connecting to shim df0eb0e380bac2847a84538ce07a2dc25948dc4d2053c69dc6f72132b758154c" address="unix:///run/containerd/s/cebf1ed33c001c9516131a8da91962d4524e73f4bd9f221b4fd54d01cd33b389" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:21:59.482935 kubelet[2638]: E0706 23:21:59.482904 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.483728 kubelet[2638]: W0706 23:21:59.483083 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.484036 kubelet[2638]: E0706 23:21:59.483879 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.484732 kubelet[2638]: E0706 23:21:59.484212 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.484732 kubelet[2638]: W0706 23:21:59.484658 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.484732 kubelet[2638]: E0706 23:21:59.484684 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:21:59.487028 kubelet[2638]: E0706 23:21:59.486546 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.488096 kubelet[2638]: W0706 23:21:59.487910 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.488096 kubelet[2638]: E0706 23:21:59.487945 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.488588 kubelet[2638]: E0706 23:21:59.488493 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.488835 kubelet[2638]: W0706 23:21:59.488808 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.489366 kubelet[2638]: E0706 23:21:59.489351 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.489678 kubelet[2638]: E0706 23:21:59.489651 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.489678 kubelet[2638]: W0706 23:21:59.489671 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.489746 kubelet[2638]: E0706 23:21:59.489696 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.490128 kubelet[2638]: E0706 23:21:59.489891 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.490128 kubelet[2638]: W0706 23:21:59.489904 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.490128 kubelet[2638]: E0706 23:21:59.489961 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.490128 kubelet[2638]: E0706 23:21:59.490092 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.490128 kubelet[2638]: W0706 23:21:59.490101 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.490128 kubelet[2638]: E0706 23:21:59.490127 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:21:59.491029 kubelet[2638]: E0706 23:21:59.490303 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.491029 kubelet[2638]: W0706 23:21:59.490315 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.491029 kubelet[2638]: E0706 23:21:59.490326 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.491029 kubelet[2638]: E0706 23:21:59.490533 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.491029 kubelet[2638]: W0706 23:21:59.490542 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.491029 kubelet[2638]: E0706 23:21:59.490551 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.491029 kubelet[2638]: E0706 23:21:59.490715 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.491029 kubelet[2638]: W0706 23:21:59.490723 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.491029 kubelet[2638]: E0706 23:21:59.490731 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.491029 kubelet[2638]: E0706 23:21:59.490912 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.491236 kubelet[2638]: W0706 23:21:59.490921 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.491236 kubelet[2638]: E0706 23:21:59.490930 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.491207 systemd[1]: Started cri-containerd-df0eb0e380bac2847a84538ce07a2dc25948dc4d2053c69dc6f72132b758154c.scope - libcontainer container df0eb0e380bac2847a84538ce07a2dc25948dc4d2053c69dc6f72132b758154c. 
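[editor's note] The repeated driver-call failures above (and continuing below) come from the kubelet probing the Calico FlexVolume driver at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds before the binary has been installed: the "init" call produces no output, so the JSON unmarshal fails. The sketch below shows the kind of response a FlexVolume driver's "init" command is expected to print, per the documented FlexVolume protocol; the exact capability set is an assumption.

    import json
    import sys

    # Minimal sketch of a FlexVolume driver entry point handling only "init".
    def main() -> int:
        cmd = sys.argv[1] if len(sys.argv) > 1 else ""
        if cmd == "init":
            print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
            return 0
        print(json.dumps({"status": "Not supported"}))  # mount/unmount/etc. not handled in this sketch
        return 1

    if __name__ == "__main__":
        sys.exit(main())
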
Jul 6 23:21:59.493781 kubelet[2638]: E0706 23:21:59.493754 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.493781 kubelet[2638]: W0706 23:21:59.493772 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.493896 kubelet[2638]: E0706 23:21:59.493787 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.496161 kubelet[2638]: E0706 23:21:59.496130 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.496161 kubelet[2638]: W0706 23:21:59.496150 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.496261 kubelet[2638]: E0706 23:21:59.496168 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.575077 containerd[1523]: time="2025-07-06T23:21:59.574993016Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-66654bd6bd-kxk6n,Uid:50915574-4290-4e31-a012-92be81cdeac6,Namespace:calico-system,Attempt:0,} returns sandbox id \"df0eb0e380bac2847a84538ce07a2dc25948dc4d2053c69dc6f72132b758154c\"" Jul 6 23:21:59.578783 containerd[1523]: time="2025-07-06T23:21:59.578723091Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 6 23:21:59.594632 kubelet[2638]: E0706 23:21:59.593630 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zdbvm" podUID="28330471-fba4-44a1-96d6-3512652f7a80" Jul 6 23:21:59.644749 containerd[1523]: time="2025-07-06T23:21:59.644706390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9x4qn,Uid:29326b10-e934-4521-b93d-655c64f76e87,Namespace:calico-system,Attempt:0,}" Jul 6 23:21:59.670846 containerd[1523]: time="2025-07-06T23:21:59.670690683Z" level=info msg="connecting to shim 5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e" address="unix:///run/containerd/s/684acc92396be81d94b1917f4a26341bd9c5dee00690b53513bd92e8feaa74e1" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:21:59.676838 kubelet[2638]: E0706 23:21:59.676804 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.677400 kubelet[2638]: W0706 23:21:59.677195 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.677400 kubelet[2638]: E0706 23:21:59.677230 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:21:59.708046 kubelet[2638]: I0706 23:21:59.704181 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/28330471-fba4-44a1-96d6-3512652f7a80-kubelet-dir\") pod \"csi-node-driver-zdbvm\" (UID: \"28330471-fba4-44a1-96d6-3512652f7a80\") " pod="calico-system/csi-node-driver-zdbvm" Jul 6 23:21:59.708046 kubelet[2638]: E0706 23:21:59.705364 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.708046 kubelet[2638]: W0706 23:21:59.705727 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.708046 kubelet[2638]: E0706 23:21:59.705743 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.708046 kubelet[2638]: I0706 23:21:59.705764 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/28330471-fba4-44a1-96d6-3512652f7a80-registration-dir\") pod \"csi-node-driver-zdbvm\" (UID: \"28330471-fba4-44a1-96d6-3512652f7a80\") " pod="calico-system/csi-node-driver-zdbvm" Jul 6 23:21:59.708046 kubelet[2638]: E0706 23:21:59.706727 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.708323 kubelet[2638]: W0706 23:21:59.706744 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.708323 kubelet[2638]: E0706 23:21:59.706964 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.708323 kubelet[2638]: I0706 23:21:59.707001 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dcc9\" (UniqueName: \"kubernetes.io/projected/28330471-fba4-44a1-96d6-3512652f7a80-kube-api-access-7dcc9\") pod \"csi-node-driver-zdbvm\" (UID: \"28330471-fba4-44a1-96d6-3512652f7a80\") " pod="calico-system/csi-node-driver-zdbvm" Jul 6 23:21:59.708323 kubelet[2638]: E0706 23:21:59.708103 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.708323 kubelet[2638]: W0706 23:21:59.708118 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.708323 kubelet[2638]: E0706 23:21:59.708186 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:21:59.709845 kubelet[2638]: E0706 23:21:59.709820 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.709845 kubelet[2638]: W0706 23:21:59.709839 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.709845 kubelet[2638]: E0706 23:21:59.709884 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.710545 kubelet[2638]: E0706 23:21:59.710315 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.710545 kubelet[2638]: W0706 23:21:59.710327 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.710545 kubelet[2638]: E0706 23:21:59.710389 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.710545 kubelet[2638]: I0706 23:21:59.710426 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/28330471-fba4-44a1-96d6-3512652f7a80-socket-dir\") pod \"csi-node-driver-zdbvm\" (UID: \"28330471-fba4-44a1-96d6-3512652f7a80\") " pod="calico-system/csi-node-driver-zdbvm" Jul 6 23:21:59.711659 kubelet[2638]: E0706 23:21:59.711091 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.711659 kubelet[2638]: W0706 23:21:59.711114 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.711659 kubelet[2638]: E0706 23:21:59.711152 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.712388 kubelet[2638]: E0706 23:21:59.712360 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.712388 kubelet[2638]: W0706 23:21:59.712383 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.712556 kubelet[2638]: E0706 23:21:59.712401 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:21:59.714132 kubelet[2638]: E0706 23:21:59.714104 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.714132 kubelet[2638]: W0706 23:21:59.714127 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.714234 kubelet[2638]: E0706 23:21:59.714150 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.714234 kubelet[2638]: I0706 23:21:59.714179 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/28330471-fba4-44a1-96d6-3512652f7a80-varrun\") pod \"csi-node-driver-zdbvm\" (UID: \"28330471-fba4-44a1-96d6-3512652f7a80\") " pod="calico-system/csi-node-driver-zdbvm" Jul 6 23:21:59.714600 kubelet[2638]: E0706 23:21:59.714410 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.714600 kubelet[2638]: W0706 23:21:59.714427 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.714600 kubelet[2638]: E0706 23:21:59.714439 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.715300 kubelet[2638]: E0706 23:21:59.715113 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.715300 kubelet[2638]: W0706 23:21:59.715226 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.715300 kubelet[2638]: E0706 23:21:59.715244 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.715959 kubelet[2638]: E0706 23:21:59.715865 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.715959 kubelet[2638]: W0706 23:21:59.715883 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.715959 kubelet[2638]: E0706 23:21:59.715896 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:21:59.717224 kubelet[2638]: E0706 23:21:59.717121 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.717224 kubelet[2638]: W0706 23:21:59.717139 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.717224 kubelet[2638]: E0706 23:21:59.717159 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.718863 kubelet[2638]: E0706 23:21:59.718238 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.718863 kubelet[2638]: W0706 23:21:59.718598 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.718863 kubelet[2638]: E0706 23:21:59.718615 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.720512 kubelet[2638]: E0706 23:21:59.720394 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.720512 kubelet[2638]: W0706 23:21:59.720412 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.720512 kubelet[2638]: E0706 23:21:59.720426 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.736900 systemd[1]: Started cri-containerd-5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e.scope - libcontainer container 5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e. Jul 6 23:21:59.810586 containerd[1523]: time="2025-07-06T23:21:59.810541798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9x4qn,Uid:29326b10-e934-4521-b93d-655c64f76e87,Namespace:calico-system,Attempt:0,} returns sandbox id \"5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e\"" Jul 6 23:21:59.816281 kubelet[2638]: E0706 23:21:59.816248 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.816281 kubelet[2638]: W0706 23:21:59.816273 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.816444 kubelet[2638]: E0706 23:21:59.816294 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:21:59.822118 kubelet[2638]: E0706 23:21:59.822102 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.822191 kubelet[2638]: W0706 23:21:59.822178 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.822268 kubelet[2638]: E0706 23:21:59.822255 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.822607 kubelet[2638]: E0706 23:21:59.822564 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.822607 kubelet[2638]: W0706 23:21:59.822597 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.822672 kubelet[2638]: E0706 23:21:59.822615 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.822856 kubelet[2638]: E0706 23:21:59.822843 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.822890 kubelet[2638]: W0706 23:21:59.822856 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.822890 kubelet[2638]: E0706 23:21:59.822875 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.823336 kubelet[2638]: E0706 23:21:59.823307 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.823336 kubelet[2638]: W0706 23:21:59.823334 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.823413 kubelet[2638]: E0706 23:21:59.823349 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:21:59.833636 kubelet[2638]: E0706 23:21:59.833610 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:21:59.833636 kubelet[2638]: W0706 23:21:59.833633 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:21:59.833745 kubelet[2638]: E0706 23:21:59.833662 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:00.499188 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1640587798.mount: Deactivated successfully. 
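The probe cycle stops being logged once an executable exists at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds and answers init with valid JSON. The following is only a minimal sketch assuming the standard FlexVolume call convention (operation name as the first argument, JSON status document on stdout); it is not the actual nodeagent~uds driver.

// Hypothetical stand-in for the missing FlexVolume driver binary.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	op := ""
	if len(os.Args) > 1 {
		op = os.Args[1]
	}

	var out driverStatus
	switch op {
	case "init":
		// Report success and advertise that attach/detach is not implemented,
		// so the kubelet will not route attach operations to this driver.
		out = driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}}
	default:
		// Every other operation is left unimplemented in this sketch.
		out = driverStatus{Status: "Not supported", Message: fmt.Sprintf("operation %q not implemented", op)}
	}

	_ = json.NewEncoder(os.Stdout).Encode(out)
}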
Jul 6 23:22:00.968460 kubelet[2638]: E0706 23:22:00.968376 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zdbvm" podUID="28330471-fba4-44a1-96d6-3512652f7a80" Jul 6 23:22:01.253899 containerd[1523]: time="2025-07-06T23:22:01.253784214Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:01.257392 containerd[1523]: time="2025-07-06T23:22:01.257303933Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Jul 6 23:22:01.257930 containerd[1523]: time="2025-07-06T23:22:01.257903816Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:01.265477 containerd[1523]: time="2025-07-06T23:22:01.265422104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:01.265845 containerd[1523]: time="2025-07-06T23:22:01.265814091Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 1.687039864s" Jul 6 23:22:01.265845 containerd[1523]: time="2025-07-06T23:22:01.265841578Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Jul 6 23:22:01.266980 containerd[1523]: time="2025-07-06T23:22:01.266923593Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 6 23:22:01.284954 containerd[1523]: time="2025-07-06T23:22:01.284903049Z" level=info msg="CreateContainer within sandbox \"df0eb0e380bac2847a84538ce07a2dc25948dc4d2053c69dc6f72132b758154c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 6 23:22:01.297703 containerd[1523]: time="2025-07-06T23:22:01.296734711Z" level=info msg="Container 91e318b7a436b96a229eca987dfb1106ab2c90a1e9b6d8a63180d6457843e341: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:01.304833 containerd[1523]: time="2025-07-06T23:22:01.304759897Z" level=info msg="CreateContainer within sandbox \"df0eb0e380bac2847a84538ce07a2dc25948dc4d2053c69dc6f72132b758154c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"91e318b7a436b96a229eca987dfb1106ab2c90a1e9b6d8a63180d6457843e341\"" Jul 6 23:22:01.305488 containerd[1523]: time="2025-07-06T23:22:01.305448324Z" level=info msg="StartContainer for \"91e318b7a436b96a229eca987dfb1106ab2c90a1e9b6d8a63180d6457843e341\"" Jul 6 23:22:01.306881 containerd[1523]: time="2025-07-06T23:22:01.306856068Z" level=info msg="connecting to shim 91e318b7a436b96a229eca987dfb1106ab2c90a1e9b6d8a63180d6457843e341" address="unix:///run/containerd/s/cebf1ed33c001c9516131a8da91962d4524e73f4bd9f221b4fd54d01cd33b389" protocol=ttrpc version=3 Jul 6 23:22:01.330241 systemd[1]: Started 
cri-containerd-91e318b7a436b96a229eca987dfb1106ab2c90a1e9b6d8a63180d6457843e341.scope - libcontainer container 91e318b7a436b96a229eca987dfb1106ab2c90a1e9b6d8a63180d6457843e341. Jul 6 23:22:01.375191 containerd[1523]: time="2025-07-06T23:22:01.375147826Z" level=info msg="StartContainer for \"91e318b7a436b96a229eca987dfb1106ab2c90a1e9b6d8a63180d6457843e341\" returns successfully" Jul 6 23:22:02.072437 kubelet[2638]: I0706 23:22:02.072366 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-66654bd6bd-kxk6n" podStartSLOduration=1.382675125 podStartE2EDuration="3.072347552s" podCreationTimestamp="2025-07-06 23:21:59 +0000 UTC" firstStartedPulling="2025-07-06 23:21:59.577088522 +0000 UTC m=+18.714806663" lastFinishedPulling="2025-07-06 23:22:01.266760949 +0000 UTC m=+20.404479090" observedRunningTime="2025-07-06 23:22:02.07165005 +0000 UTC m=+21.209368191" watchObservedRunningTime="2025-07-06 23:22:02.072347552 +0000 UTC m=+21.210065693"
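The "Observed pod startup duration" entry above reports two figures whose relationship can be checked from its own timestamps: podStartE2EDuration is the gap from podCreationTimestamp to watchObservedRunningTime, and podStartSLOduration appears to be that gap minus the image-pull window (firstStartedPulling to lastFinishedPulling). A small sketch with the values copied from the entry:

package main

import "fmt"

func main() {
	// Offsets in seconds from podCreationTimestamp (2025-07-06 23:21:59 UTC),
	// copied from the "Observed pod startup duration" entry above.
	firstStartedPulling := 0.577088522  // 23:21:59.577088522
	lastFinishedPulling := 2.266760949  // 23:22:01.266760949
	watchObservedRunning := 3.072347552 // 23:22:02.072347552

	e2e := watchObservedRunning                              // podStartE2EDuration
	slo := e2e - (lastFinishedPulling - firstStartedPulling) // image-pull time excluded

	fmt.Printf("podStartE2EDuration ~ %.9fs\n", e2e) // ~ 3.072347552s
	fmt.Printf("podStartSLOduration ~ %.9f\n", slo)  // ~ 1.382675125, up to rounding
}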
Jul 6 23:22:02.158079 kubelet[2638]: E0706 23:22:02.158043 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.158132 kubelet[2638]: W0706 23:22:02.158077 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.158160 kubelet[2638]: E0706 23:22:02.158138 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jul 6 23:22:02.158325 kubelet[2638]: E0706 23:22:02.158310 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.158360 kubelet[2638]: W0706 23:22:02.158325 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.158422 kubelet[2638]: E0706 23:22:02.158403 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.158791 kubelet[2638]: E0706 23:22:02.158764 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.158791 kubelet[2638]: W0706 23:22:02.158781 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.158853 kubelet[2638]: E0706 23:22:02.158802 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.159127 kubelet[2638]: E0706 23:22:02.159107 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.159229 kubelet[2638]: W0706 23:22:02.159212 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.159329 kubelet[2638]: E0706 23:22:02.159315 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.159673 kubelet[2638]: E0706 23:22:02.159647 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.159673 kubelet[2638]: W0706 23:22:02.159671 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.159751 kubelet[2638]: E0706 23:22:02.159693 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.159880 kubelet[2638]: E0706 23:22:02.159868 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.159880 kubelet[2638]: W0706 23:22:02.159879 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.159950 kubelet[2638]: E0706 23:22:02.159892 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:22:02.160104 kubelet[2638]: E0706 23:22:02.160091 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.160104 kubelet[2638]: W0706 23:22:02.160103 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.160280 kubelet[2638]: E0706 23:22:02.160118 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.160438 kubelet[2638]: E0706 23:22:02.160420 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.160725 kubelet[2638]: W0706 23:22:02.160525 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.160725 kubelet[2638]: E0706 23:22:02.160556 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.167116 kubelet[2638]: E0706 23:22:02.167085 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.167116 kubelet[2638]: W0706 23:22:02.167108 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.167273 kubelet[2638]: E0706 23:22:02.167184 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.167431 kubelet[2638]: E0706 23:22:02.167411 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.167476 kubelet[2638]: W0706 23:22:02.167429 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.167476 kubelet[2638]: E0706 23:22:02.167444 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:22:02.167918 kubelet[2638]: E0706 23:22:02.167897 2638 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:22:02.167964 kubelet[2638]: W0706 23:22:02.167927 2638 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:22:02.167964 kubelet[2638]: E0706 23:22:02.167945 2638 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:22:02.311172 containerd[1523]: time="2025-07-06T23:22:02.311113553Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:02.312084 containerd[1523]: time="2025-07-06T23:22:02.312056398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Jul 6 23:22:02.312944 containerd[1523]: time="2025-07-06T23:22:02.312899778Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:02.314926 containerd[1523]: time="2025-07-06T23:22:02.314687243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:02.315599 containerd[1523]: time="2025-07-06T23:22:02.315554348Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.048552254s" Jul 6 23:22:02.315599 containerd[1523]: time="2025-07-06T23:22:02.315593759Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Jul 6 23:22:02.319352 containerd[1523]: time="2025-07-06T23:22:02.319311606Z" level=info msg="CreateContainer within sandbox \"5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 6 23:22:02.329118 containerd[1523]: time="2025-07-06T23:22:02.328524163Z" level=info msg="Container ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:02.357807 containerd[1523]: time="2025-07-06T23:22:02.357744845Z" level=info msg="CreateContainer within sandbox \"5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1\"" Jul 6 23:22:02.358357 containerd[1523]: time="2025-07-06T23:22:02.358299069Z" level=info msg="StartContainer for \"ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1\"" Jul 6 23:22:02.361078 containerd[1523]: time="2025-07-06T23:22:02.360982088Z" level=info msg="connecting to shim ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1" address="unix:///run/containerd/s/684acc92396be81d94b1917f4a26341bd9c5dee00690b53513bd92e8feaa74e1" protocol=ttrpc version=3 Jul 6 23:22:02.410250 systemd[1]: Started cri-containerd-ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1.scope - libcontainer container ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1. 
Jul 6 23:22:02.448001 containerd[1523]: time="2025-07-06T23:22:02.447892660Z" level=info msg="StartContainer for \"ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1\" returns successfully" Jul 6 23:22:02.493990 systemd[1]: cri-containerd-ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1.scope: Deactivated successfully. Jul 6 23:22:02.494338 systemd[1]: cri-containerd-ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1.scope: Consumed 64ms CPU time, 6.3M memory peak, 4.1M written to disk. Jul 6 23:22:02.512103 containerd[1523]: time="2025-07-06T23:22:02.512042070Z" level=info msg="received exit event container_id:\"ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1\" id:\"ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1\" pid:3344 exited_at:{seconds:1751844122 nanos:505776240}" Jul 6 23:22:02.515938 containerd[1523]: time="2025-07-06T23:22:02.515878308Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1\" id:\"ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1\" pid:3344 exited_at:{seconds:1751844122 nanos:505776240}" Jul 6 23:22:02.547748 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ac939227a3f3589713653930de2b003563bf69d8acef111dfe5901323f4189c1-rootfs.mount: Deactivated successfully. Jul 6 23:22:02.968149 kubelet[2638]: E0706 23:22:02.967950 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zdbvm" podUID="28330471-fba4-44a1-96d6-3512652f7a80" Jul 6 23:22:03.063169 kubelet[2638]: I0706 23:22:03.063138 2638 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:22:03.064905 containerd[1523]: time="2025-07-06T23:22:03.064865073Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 6 23:22:04.969179 kubelet[2638]: E0706 23:22:04.968741 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zdbvm" podUID="28330471-fba4-44a1-96d6-3512652f7a80" Jul 6 23:22:06.568723 containerd[1523]: time="2025-07-06T23:22:06.568676967Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:06.570035 containerd[1523]: time="2025-07-06T23:22:06.569667143Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Jul 6 23:22:06.571674 containerd[1523]: time="2025-07-06T23:22:06.571621011Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:06.575310 containerd[1523]: time="2025-07-06T23:22:06.575269008Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:06.576030 containerd[1523]: time="2025-07-06T23:22:06.575714706Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id 
\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 3.510806823s" Jul 6 23:22:06.576030 containerd[1523]: time="2025-07-06T23:22:06.575752754Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Jul 6 23:22:06.579438 containerd[1523]: time="2025-07-06T23:22:06.579393670Z" level=info msg="CreateContainer within sandbox \"5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 6 23:22:06.590565 containerd[1523]: time="2025-07-06T23:22:06.590505380Z" level=info msg="Container e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:06.594352 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2362017319.mount: Deactivated successfully. Jul 6 23:22:06.605339 containerd[1523]: time="2025-07-06T23:22:06.605281851Z" level=info msg="CreateContainer within sandbox \"5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946\"" Jul 6 23:22:06.606076 containerd[1523]: time="2025-07-06T23:22:06.605886464Z" level=info msg="StartContainer for \"e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946\"" Jul 6 23:22:06.607463 containerd[1523]: time="2025-07-06T23:22:06.607429321Z" level=info msg="connecting to shim e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946" address="unix:///run/containerd/s/684acc92396be81d94b1917f4a26341bd9c5dee00690b53513bd92e8feaa74e1" protocol=ttrpc version=3 Jul 6 23:22:06.636263 systemd[1]: Started cri-containerd-e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946.scope - libcontainer container e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946. Jul 6 23:22:06.686704 containerd[1523]: time="2025-07-06T23:22:06.686659967Z" level=info msg="StartContainer for \"e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946\" returns successfully" Jul 6 23:22:06.968250 kubelet[2638]: E0706 23:22:06.968195 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zdbvm" podUID="28330471-fba4-44a1-96d6-3512652f7a80" Jul 6 23:22:07.445844 systemd[1]: cri-containerd-e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946.scope: Deactivated successfully. Jul 6 23:22:07.447079 systemd[1]: cri-containerd-e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946.scope: Consumed 521ms CPU time, 175.6M memory peak, 2.5M read from disk, 165.8M written to disk. 
Jul 6 23:22:07.447920 containerd[1523]: time="2025-07-06T23:22:07.447667126Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946\" id:\"e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946\" pid:3402 exited_at:{seconds:1751844127 nanos:446839712}" Jul 6 23:22:07.454808 containerd[1523]: time="2025-07-06T23:22:07.454747452Z" level=info msg="received exit event container_id:\"e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946\" id:\"e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946\" pid:3402 exited_at:{seconds:1751844127 nanos:446839712}" Jul 6 23:22:07.477142 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e08f8b0a2a8dd5dd36067d3033e6be019bb9b38af101742295a47e2d918b2946-rootfs.mount: Deactivated successfully. Jul 6 23:22:07.490878 kubelet[2638]: I0706 23:22:07.490349 2638 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jul 6 23:22:07.729702 systemd[1]: Created slice kubepods-burstable-poda52ef37d_ef25_43c3_9985_70f4f2b627aa.slice - libcontainer container kubepods-burstable-poda52ef37d_ef25_43c3_9985_70f4f2b627aa.slice. Jul 6 23:22:07.751252 systemd[1]: Created slice kubepods-burstable-pod3775b966_20b1_4599_afab_88aae053916c.slice - libcontainer container kubepods-burstable-pod3775b966_20b1_4599_afab_88aae053916c.slice. Jul 6 23:22:07.790833 systemd[1]: Created slice kubepods-besteffort-pod08864749_3109_4688_85c3_afcbdbdd39a1.slice - libcontainer container kubepods-besteffort-pod08864749_3109_4688_85c3_afcbdbdd39a1.slice. Jul 6 23:22:07.798816 kubelet[2638]: I0706 23:22:07.798782 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3775b966-20b1-4599-afab-88aae053916c-config-volume\") pod \"coredns-7c65d6cfc9-cgcpv\" (UID: \"3775b966-20b1-4599-afab-88aae053916c\") " pod="kube-system/coredns-7c65d6cfc9-cgcpv" Jul 6 23:22:07.799089 kubelet[2638]: I0706 23:22:07.799069 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a52ef37d-ef25-43c3-9985-70f4f2b627aa-config-volume\") pod \"coredns-7c65d6cfc9-7l8lq\" (UID: \"a52ef37d-ef25-43c3-9985-70f4f2b627aa\") " pod="kube-system/coredns-7c65d6cfc9-7l8lq" Jul 6 23:22:07.799268 kubelet[2638]: I0706 23:22:07.799175 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24kx\" (UniqueName: \"kubernetes.io/projected/a52ef37d-ef25-43c3-9985-70f4f2b627aa-kube-api-access-g24kx\") pod \"coredns-7c65d6cfc9-7l8lq\" (UID: \"a52ef37d-ef25-43c3-9985-70f4f2b627aa\") " pod="kube-system/coredns-7c65d6cfc9-7l8lq" Jul 6 23:22:07.799268 kubelet[2638]: I0706 23:22:07.799206 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bmmf\" (UniqueName: \"kubernetes.io/projected/3775b966-20b1-4599-afab-88aae053916c-kube-api-access-8bmmf\") pod \"coredns-7c65d6cfc9-cgcpv\" (UID: \"3775b966-20b1-4599-afab-88aae053916c\") " pod="kube-system/coredns-7c65d6cfc9-cgcpv" Jul 6 23:22:07.800067 systemd[1]: Created slice kubepods-besteffort-pod1ffb1977_7418_45ee_b1c3_d3d44838d522.slice - libcontainer container kubepods-besteffort-pod1ffb1977_7418_45ee_b1c3_d3d44838d522.slice. 
Jul 6 23:22:07.810256 systemd[1]: Created slice kubepods-besteffort-pod3f6ff87d_315d_41c2_bd3f_018458086797.slice - libcontainer container kubepods-besteffort-pod3f6ff87d_315d_41c2_bd3f_018458086797.slice. Jul 6 23:22:07.816146 systemd[1]: Created slice kubepods-besteffort-podf90c94a7_4d42_4be6_aae4_d8a8f872c0b6.slice - libcontainer container kubepods-besteffort-podf90c94a7_4d42_4be6_aae4_d8a8f872c0b6.slice. Jul 6 23:22:07.822509 systemd[1]: Created slice kubepods-besteffort-poddd2f3b40_ded4_4a5a_93a8_55bc4dbee958.slice - libcontainer container kubepods-besteffort-poddd2f3b40_ded4_4a5a_93a8_55bc4dbee958.slice. Jul 6 23:22:07.900276 kubelet[2638]: I0706 23:22:07.900190 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjhb\" (UniqueName: \"kubernetes.io/projected/1ffb1977-7418-45ee-b1c3-d3d44838d522-kube-api-access-lsjhb\") pod \"calico-apiserver-fb787ffc-8r9s6\" (UID: \"1ffb1977-7418-45ee-b1c3-d3d44838d522\") " pod="calico-apiserver/calico-apiserver-fb787ffc-8r9s6" Jul 6 23:22:07.900276 kubelet[2638]: I0706 23:22:07.900244 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44x7\" (UniqueName: \"kubernetes.io/projected/08864749-3109-4688-85c3-afcbdbdd39a1-kube-api-access-k44x7\") pod \"calico-kube-controllers-9fbbd5ffc-k79h8\" (UID: \"08864749-3109-4688-85c3-afcbdbdd39a1\") " pod="calico-system/calico-kube-controllers-9fbbd5ffc-k79h8" Jul 6 23:22:07.900276 kubelet[2638]: I0706 23:22:07.900274 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd2f3b40-ded4-4a5a-93a8-55bc4dbee958-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-6smwc\" (UID: \"dd2f3b40-ded4-4a5a-93a8-55bc4dbee958\") " pod="calico-system/goldmane-58fd7646b9-6smwc" Jul 6 23:22:07.900464 kubelet[2638]: I0706 23:22:07.900310 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcrwk\" (UniqueName: \"kubernetes.io/projected/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-kube-api-access-rcrwk\") pod \"whisker-58567f755d-k6nq5\" (UID: \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\") " pod="calico-system/whisker-58567f755d-k6nq5" Jul 6 23:22:07.900464 kubelet[2638]: I0706 23:22:07.900342 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd2f3b40-ded4-4a5a-93a8-55bc4dbee958-config\") pod \"goldmane-58fd7646b9-6smwc\" (UID: \"dd2f3b40-ded4-4a5a-93a8-55bc4dbee958\") " pod="calico-system/goldmane-58fd7646b9-6smwc" Jul 6 23:22:07.900464 kubelet[2638]: I0706 23:22:07.900361 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/dd2f3b40-ded4-4a5a-93a8-55bc4dbee958-goldmane-key-pair\") pod \"goldmane-58fd7646b9-6smwc\" (UID: \"dd2f3b40-ded4-4a5a-93a8-55bc4dbee958\") " pod="calico-system/goldmane-58fd7646b9-6smwc" Jul 6 23:22:07.900464 kubelet[2638]: I0706 23:22:07.900379 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdlt2\" (UniqueName: \"kubernetes.io/projected/dd2f3b40-ded4-4a5a-93a8-55bc4dbee958-kube-api-access-wdlt2\") pod \"goldmane-58fd7646b9-6smwc\" (UID: \"dd2f3b40-ded4-4a5a-93a8-55bc4dbee958\") " pod="calico-system/goldmane-58fd7646b9-6smwc" Jul 6 23:22:07.900464 
kubelet[2638]: I0706 23:22:07.900397 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7dr8\" (UniqueName: \"kubernetes.io/projected/3f6ff87d-315d-41c2-bd3f-018458086797-kube-api-access-n7dr8\") pod \"calico-apiserver-fb787ffc-krgms\" (UID: \"3f6ff87d-315d-41c2-bd3f-018458086797\") " pod="calico-apiserver/calico-apiserver-fb787ffc-krgms" Jul 6 23:22:07.900593 kubelet[2638]: I0706 23:22:07.900434 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3f6ff87d-315d-41c2-bd3f-018458086797-calico-apiserver-certs\") pod \"calico-apiserver-fb787ffc-krgms\" (UID: \"3f6ff87d-315d-41c2-bd3f-018458086797\") " pod="calico-apiserver/calico-apiserver-fb787ffc-krgms" Jul 6 23:22:07.900593 kubelet[2638]: I0706 23:22:07.900453 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-backend-key-pair\") pod \"whisker-58567f755d-k6nq5\" (UID: \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\") " pod="calico-system/whisker-58567f755d-k6nq5" Jul 6 23:22:07.900593 kubelet[2638]: I0706 23:22:07.900469 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-ca-bundle\") pod \"whisker-58567f755d-k6nq5\" (UID: \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\") " pod="calico-system/whisker-58567f755d-k6nq5" Jul 6 23:22:07.900593 kubelet[2638]: I0706 23:22:07.900487 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08864749-3109-4688-85c3-afcbdbdd39a1-tigera-ca-bundle\") pod \"calico-kube-controllers-9fbbd5ffc-k79h8\" (UID: \"08864749-3109-4688-85c3-afcbdbdd39a1\") " pod="calico-system/calico-kube-controllers-9fbbd5ffc-k79h8" Jul 6 23:22:07.900593 kubelet[2638]: I0706 23:22:07.900504 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1ffb1977-7418-45ee-b1c3-d3d44838d522-calico-apiserver-certs\") pod \"calico-apiserver-fb787ffc-8r9s6\" (UID: \"1ffb1977-7418-45ee-b1c3-d3d44838d522\") " pod="calico-apiserver/calico-apiserver-fb787ffc-8r9s6" Jul 6 23:22:08.046559 containerd[1523]: time="2025-07-06T23:22:08.046222330Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7l8lq,Uid:a52ef37d-ef25-43c3-9985-70f4f2b627aa,Namespace:kube-system,Attempt:0,}" Jul 6 23:22:08.104661 containerd[1523]: time="2025-07-06T23:22:08.096884904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cgcpv,Uid:3775b966-20b1-4599-afab-88aae053916c,Namespace:kube-system,Attempt:0,}" Jul 6 23:22:08.104661 containerd[1523]: time="2025-07-06T23:22:08.097145957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9fbbd5ffc-k79h8,Uid:08864749-3109-4688-85c3-afcbdbdd39a1,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:08.113455 containerd[1523]: time="2025-07-06T23:22:08.112928659Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-8r9s6,Uid:1ffb1977-7418-45ee-b1c3-d3d44838d522,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:22:08.113455 containerd[1523]: 
time="2025-07-06T23:22:08.113279170Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 6 23:22:08.116693 containerd[1523]: time="2025-07-06T23:22:08.114634163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-krgms,Uid:3f6ff87d-315d-41c2-bd3f-018458086797,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:22:08.129031 containerd[1523]: time="2025-07-06T23:22:08.124523157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58567f755d-k6nq5,Uid:f90c94a7-4d42-4be6-aae4-d8a8f872c0b6,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:08.133543 containerd[1523]: time="2025-07-06T23:22:08.129235467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-6smwc,Uid:dd2f3b40-ded4-4a5a-93a8-55bc4dbee958,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:08.618449 containerd[1523]: time="2025-07-06T23:22:08.618392966Z" level=error msg="Failed to destroy network for sandbox \"7d6f3c601d9113017948721a9af01d31299db6d19e072bf7aca0fcb5b6d188fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.621936 containerd[1523]: time="2025-07-06T23:22:08.621881870Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9fbbd5ffc-k79h8,Uid:08864749-3109-4688-85c3-afcbdbdd39a1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d6f3c601d9113017948721a9af01d31299db6d19e072bf7aca0fcb5b6d188fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.623032 containerd[1523]: time="2025-07-06T23:22:08.622967688Z" level=error msg="Failed to destroy network for sandbox \"ebf96939aa4fd75db84bda9d2b3f5b66fcf4b58a32f459aac9c9f1c2c9ae6c66\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.624444 containerd[1523]: time="2025-07-06T23:22:08.624391375Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-58567f755d-k6nq5,Uid:f90c94a7-4d42-4be6-aae4-d8a8f872c0b6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebf96939aa4fd75db84bda9d2b3f5b66fcf4b58a32f459aac9c9f1c2c9ae6c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.626446 kubelet[2638]: E0706 23:22:08.626374 2638 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d6f3c601d9113017948721a9af01d31299db6d19e072bf7aca0fcb5b6d188fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.626962 kubelet[2638]: E0706 23:22:08.626480 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d6f3c601d9113017948721a9af01d31299db6d19e072bf7aca0fcb5b6d188fa\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9fbbd5ffc-k79h8" Jul 6 23:22:08.626962 kubelet[2638]: E0706 23:22:08.626501 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d6f3c601d9113017948721a9af01d31299db6d19e072bf7aca0fcb5b6d188fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-9fbbd5ffc-k79h8" Jul 6 23:22:08.626962 kubelet[2638]: E0706 23:22:08.626550 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-9fbbd5ffc-k79h8_calico-system(08864749-3109-4688-85c3-afcbdbdd39a1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-9fbbd5ffc-k79h8_calico-system(08864749-3109-4688-85c3-afcbdbdd39a1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d6f3c601d9113017948721a9af01d31299db6d19e072bf7aca0fcb5b6d188fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-9fbbd5ffc-k79h8" podUID="08864749-3109-4688-85c3-afcbdbdd39a1" Jul 6 23:22:08.627171 kubelet[2638]: E0706 23:22:08.627136 2638 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebf96939aa4fd75db84bda9d2b3f5b66fcf4b58a32f459aac9c9f1c2c9ae6c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.627208 kubelet[2638]: E0706 23:22:08.627187 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebf96939aa4fd75db84bda9d2b3f5b66fcf4b58a32f459aac9c9f1c2c9ae6c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58567f755d-k6nq5" Jul 6 23:22:08.627239 kubelet[2638]: E0706 23:22:08.627204 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebf96939aa4fd75db84bda9d2b3f5b66fcf4b58a32f459aac9c9f1c2c9ae6c66\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-58567f755d-k6nq5" Jul 6 23:22:08.627391 kubelet[2638]: E0706 23:22:08.627362 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-58567f755d-k6nq5_calico-system(f90c94a7-4d42-4be6-aae4-d8a8f872c0b6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-58567f755d-k6nq5_calico-system(f90c94a7-4d42-4be6-aae4-d8a8f872c0b6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ebf96939aa4fd75db84bda9d2b3f5b66fcf4b58a32f459aac9c9f1c2c9ae6c66\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-58567f755d-k6nq5" podUID="f90c94a7-4d42-4be6-aae4-d8a8f872c0b6" Jul 6 23:22:08.633719 containerd[1523]: time="2025-07-06T23:22:08.633553023Z" level=error msg="Failed to destroy network for sandbox \"869f84249d8f4dad29cb4a63575fa1417c34e31af629682b0f36f9070cf0eaf6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.636507 containerd[1523]: time="2025-07-06T23:22:08.636445246Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7l8lq,Uid:a52ef37d-ef25-43c3-9985-70f4f2b627aa,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"869f84249d8f4dad29cb4a63575fa1417c34e31af629682b0f36f9070cf0eaf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.636779 kubelet[2638]: E0706 23:22:08.636713 2638 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"869f84249d8f4dad29cb4a63575fa1417c34e31af629682b0f36f9070cf0eaf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.636831 kubelet[2638]: E0706 23:22:08.636778 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"869f84249d8f4dad29cb4a63575fa1417c34e31af629682b0f36f9070cf0eaf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7l8lq" Jul 6 23:22:08.636831 kubelet[2638]: E0706 23:22:08.636798 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"869f84249d8f4dad29cb4a63575fa1417c34e31af629682b0f36f9070cf0eaf6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-7l8lq" Jul 6 23:22:08.636888 kubelet[2638]: E0706 23:22:08.636839 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-7l8lq_kube-system(a52ef37d-ef25-43c3-9985-70f4f2b627aa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-7l8lq_kube-system(a52ef37d-ef25-43c3-9985-70f4f2b627aa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"869f84249d8f4dad29cb4a63575fa1417c34e31af629682b0f36f9070cf0eaf6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-7l8lq" podUID="a52ef37d-ef25-43c3-9985-70f4f2b627aa" Jul 6 23:22:08.638864 containerd[1523]: time="2025-07-06T23:22:08.638755871Z" level=error msg="Failed to destroy network for sandbox 
\"b4dcaf73f791f78185e02b6db25399e4eb25261960c15b3a9eb8ed2468e4a5fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.640600 containerd[1523]: time="2025-07-06T23:22:08.640546913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-krgms,Uid:3f6ff87d-315d-41c2-bd3f-018458086797,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4dcaf73f791f78185e02b6db25399e4eb25261960c15b3a9eb8ed2468e4a5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.640963 kubelet[2638]: E0706 23:22:08.640791 2638 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4dcaf73f791f78185e02b6db25399e4eb25261960c15b3a9eb8ed2468e4a5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.640963 kubelet[2638]: E0706 23:22:08.640848 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4dcaf73f791f78185e02b6db25399e4eb25261960c15b3a9eb8ed2468e4a5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fb787ffc-krgms" Jul 6 23:22:08.640963 kubelet[2638]: E0706 23:22:08.640868 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4dcaf73f791f78185e02b6db25399e4eb25261960c15b3a9eb8ed2468e4a5fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fb787ffc-krgms" Jul 6 23:22:08.641227 kubelet[2638]: E0706 23:22:08.640905 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-fb787ffc-krgms_calico-apiserver(3f6ff87d-315d-41c2-bd3f-018458086797)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-fb787ffc-krgms_calico-apiserver(3f6ff87d-315d-41c2-bd3f-018458086797)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b4dcaf73f791f78185e02b6db25399e4eb25261960c15b3a9eb8ed2468e4a5fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-fb787ffc-krgms" podUID="3f6ff87d-315d-41c2-bd3f-018458086797" Jul 6 23:22:08.644576 containerd[1523]: time="2025-07-06T23:22:08.644496069Z" level=error msg="Failed to destroy network for sandbox \"379bedfc16867d9ccf45dd525398370d6ac743e5463e05e48b481d02ac60306e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.647739 containerd[1523]: 
time="2025-07-06T23:22:08.647681591Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-6smwc,Uid:dd2f3b40-ded4-4a5a-93a8-55bc4dbee958,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"379bedfc16867d9ccf45dd525398370d6ac743e5463e05e48b481d02ac60306e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.647972 kubelet[2638]: E0706 23:22:08.647925 2638 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"379bedfc16867d9ccf45dd525398370d6ac743e5463e05e48b481d02ac60306e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.648104 kubelet[2638]: E0706 23:22:08.647988 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"379bedfc16867d9ccf45dd525398370d6ac743e5463e05e48b481d02ac60306e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-6smwc" Jul 6 23:22:08.648104 kubelet[2638]: E0706 23:22:08.648034 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"379bedfc16867d9ccf45dd525398370d6ac743e5463e05e48b481d02ac60306e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-6smwc" Jul 6 23:22:08.648104 kubelet[2638]: E0706 23:22:08.648084 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-6smwc_calico-system(dd2f3b40-ded4-4a5a-93a8-55bc4dbee958)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-6smwc_calico-system(dd2f3b40-ded4-4a5a-93a8-55bc4dbee958)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"379bedfc16867d9ccf45dd525398370d6ac743e5463e05e48b481d02ac60306e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-6smwc" podUID="dd2f3b40-ded4-4a5a-93a8-55bc4dbee958" Jul 6 23:22:08.652780 containerd[1523]: time="2025-07-06T23:22:08.652273757Z" level=error msg="Failed to destroy network for sandbox \"a4456318789a3d9070867c797aa92a0b52151503c2f7796b050f60ce9bfd948b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.653170 containerd[1523]: time="2025-07-06T23:22:08.653123528Z" level=error msg="Failed to destroy network for sandbox \"c46f4aff68f4cc880082d72e077e20aea204b50c51ae7e88d2486b6ad66535ef\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 
6 23:22:08.654716 containerd[1523]: time="2025-07-06T23:22:08.654676561Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-8r9s6,Uid:1ffb1977-7418-45ee-b1c3-d3d44838d522,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c46f4aff68f4cc880082d72e077e20aea204b50c51ae7e88d2486b6ad66535ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.655215 kubelet[2638]: E0706 23:22:08.655174 2638 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c46f4aff68f4cc880082d72e077e20aea204b50c51ae7e88d2486b6ad66535ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.655283 kubelet[2638]: E0706 23:22:08.655238 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c46f4aff68f4cc880082d72e077e20aea204b50c51ae7e88d2486b6ad66535ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fb787ffc-8r9s6" Jul 6 23:22:08.655283 kubelet[2638]: E0706 23:22:08.655261 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c46f4aff68f4cc880082d72e077e20aea204b50c51ae7e88d2486b6ad66535ef\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-fb787ffc-8r9s6" Jul 6 23:22:08.655354 kubelet[2638]: E0706 23:22:08.655308 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-fb787ffc-8r9s6_calico-apiserver(1ffb1977-7418-45ee-b1c3-d3d44838d522)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-fb787ffc-8r9s6_calico-apiserver(1ffb1977-7418-45ee-b1c3-d3d44838d522)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c46f4aff68f4cc880082d72e077e20aea204b50c51ae7e88d2486b6ad66535ef\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-fb787ffc-8r9s6" podUID="1ffb1977-7418-45ee-b1c3-d3d44838d522" Jul 6 23:22:08.658380 containerd[1523]: time="2025-07-06T23:22:08.658326217Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cgcpv,Uid:3775b966-20b1-4599-afab-88aae053916c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a4456318789a3d9070867c797aa92a0b52151503c2f7796b050f60ce9bfd948b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.658736 kubelet[2638]: E0706 23:22:08.658701 2638 log.go:32] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a4456318789a3d9070867c797aa92a0b52151503c2f7796b050f60ce9bfd948b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:08.658804 kubelet[2638]: E0706 23:22:08.658758 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a4456318789a3d9070867c797aa92a0b52151503c2f7796b050f60ce9bfd948b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cgcpv" Jul 6 23:22:08.658804 kubelet[2638]: E0706 23:22:08.658793 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a4456318789a3d9070867c797aa92a0b52151503c2f7796b050f60ce9bfd948b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cgcpv" Jul 6 23:22:08.658867 kubelet[2638]: E0706 23:22:08.658835 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-cgcpv_kube-system(3775b966-20b1-4599-afab-88aae053916c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-cgcpv_kube-system(3775b966-20b1-4599-afab-88aae053916c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a4456318789a3d9070867c797aa92a0b52151503c2f7796b050f60ce9bfd948b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-cgcpv" podUID="3775b966-20b1-4599-afab-88aae053916c" Jul 6 23:22:08.976424 systemd[1]: Created slice kubepods-besteffort-pod28330471_fba4_44a1_96d6_3512652f7a80.slice - libcontainer container kubepods-besteffort-pod28330471_fba4_44a1_96d6_3512652f7a80.slice. Jul 6 23:22:08.979965 containerd[1523]: time="2025-07-06T23:22:08.979926535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zdbvm,Uid:28330471-fba4-44a1-96d6-3512652f7a80,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:09.028668 containerd[1523]: time="2025-07-06T23:22:09.028606456Z" level=error msg="Failed to destroy network for sandbox \"403b2f658dcddd99687020ada87a1a8056d2eebbf53be310b233f325403f0948\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:09.030525 systemd[1]: run-netns-cni\x2dd3ce69c6\x2d5cc9\x2d4fe9\x2dfd47\x2dcc3e27691745.mount: Deactivated successfully. 
Jul 6 23:22:09.031996 containerd[1523]: time="2025-07-06T23:22:09.031816478Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zdbvm,Uid:28330471-fba4-44a1-96d6-3512652f7a80,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"403b2f658dcddd99687020ada87a1a8056d2eebbf53be310b233f325403f0948\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:09.032302 kubelet[2638]: E0706 23:22:09.032176 2638 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"403b2f658dcddd99687020ada87a1a8056d2eebbf53be310b233f325403f0948\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:22:09.032302 kubelet[2638]: E0706 23:22:09.032277 2638 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"403b2f658dcddd99687020ada87a1a8056d2eebbf53be310b233f325403f0948\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zdbvm" Jul 6 23:22:09.032302 kubelet[2638]: E0706 23:22:09.032300 2638 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"403b2f658dcddd99687020ada87a1a8056d2eebbf53be310b233f325403f0948\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zdbvm" Jul 6 23:22:09.033258 kubelet[2638]: E0706 23:22:09.032421 2638 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zdbvm_calico-system(28330471-fba4-44a1-96d6-3512652f7a80)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zdbvm_calico-system(28330471-fba4-44a1-96d6-3512652f7a80)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"403b2f658dcddd99687020ada87a1a8056d2eebbf53be310b233f325403f0948\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zdbvm" podUID="28330471-fba4-44a1-96d6-3512652f7a80" Jul 6 23:22:11.189041 kubelet[2638]: I0706 23:22:11.188794 2638 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:22:11.541953 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3054888827.mount: Deactivated successfully. 
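All three RunPodSandbox failures above trip over the same stat: the Calico CNI plugin will not program pod networking until /var/lib/calico/nodename exists, and that file only appears once the calico/node container is up on the host; per the entries that follow, its image is still being pulled at this point. A minimal sketch of that pre-flight check, assuming the plugin does nothing more here than read the node name from the file:

```go
package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename"

// determineNodename mirrors the check implied by the error above:
// if /var/lib/calico/nodename is missing, fail with a hint that
// calico/node has not run (or has not mounted /var/lib/calico/) yet.
func determineNodename() (string, error) {
	data, err := os.ReadFile(nodenameFile)
	if os.IsNotExist(err) {
		return "", fmt.Errorf("stat %s: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/", nodenameFile)
	}
	if err != nil {
		return "", err
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := determineNodename()
	if err != nil {
		fmt.Println("CNI ADD would fail:", err)
		return
	}
	fmt.Println("node:", name)
}
```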
Jul 6 23:22:11.652507 containerd[1523]: time="2025-07-06T23:22:11.652439643Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:11.653190 containerd[1523]: time="2025-07-06T23:22:11.653127526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Jul 6 23:22:11.654174 containerd[1523]: time="2025-07-06T23:22:11.654135708Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:11.656503 containerd[1523]: time="2025-07-06T23:22:11.656434401Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:11.657236 containerd[1523]: time="2025-07-06T23:22:11.657202779Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 3.543891283s" Jul 6 23:22:11.657286 containerd[1523]: time="2025-07-06T23:22:11.657242266Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Jul 6 23:22:11.705144 containerd[1523]: time="2025-07-06T23:22:11.705093951Z" level=info msg="CreateContainer within sandbox \"5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 6 23:22:11.715382 containerd[1523]: time="2025-07-06T23:22:11.714746286Z" level=info msg="Container b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:11.732475 containerd[1523]: time="2025-07-06T23:22:11.732421144Z" level=info msg="CreateContainer within sandbox \"5e6dfcf9c75fd02185da227b5aa4e77d40b90a722c42f94288d7636a078edf8e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\"" Jul 6 23:22:11.734421 containerd[1523]: time="2025-07-06T23:22:11.732991487Z" level=info msg="StartContainer for \"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\"" Jul 6 23:22:11.738425 containerd[1523]: time="2025-07-06T23:22:11.738348010Z" level=info msg="connecting to shim b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6" address="unix:///run/containerd/s/684acc92396be81d94b1917f4a26341bd9c5dee00690b53513bd92e8feaa74e1" protocol=ttrpc version=3 Jul 6 23:22:11.762258 systemd[1]: Started cri-containerd-b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6.scope - libcontainer container b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6. Jul 6 23:22:11.815974 containerd[1523]: time="2025-07-06T23:22:11.814416328Z" level=info msg="StartContainer for \"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\" returns successfully" Jul 6 23:22:12.106674 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 6 23:22:12.106846 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Jul 6 23:22:12.261697 kubelet[2638]: I0706 23:22:12.260426 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9x4qn" podStartSLOduration=1.416801463 podStartE2EDuration="13.260404626s" podCreationTimestamp="2025-07-06 23:21:59 +0000 UTC" firstStartedPulling="2025-07-06 23:21:59.814467052 +0000 UTC m=+18.952185153" lastFinishedPulling="2025-07-06 23:22:11.658070175 +0000 UTC m=+30.795788316" observedRunningTime="2025-07-06 23:22:12.146424779 +0000 UTC m=+31.284142920" watchObservedRunningTime="2025-07-06 23:22:12.260404626 +0000 UTC m=+31.398122727" Jul 6 23:22:12.347482 kubelet[2638]: I0706 23:22:12.347440 2638 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcrwk\" (UniqueName: \"kubernetes.io/projected/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-kube-api-access-rcrwk\") pod \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\" (UID: \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\") " Jul 6 23:22:12.347709 kubelet[2638]: I0706 23:22:12.347691 2638 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-backend-key-pair\") pod \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\" (UID: \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\") " Jul 6 23:22:12.347853 kubelet[2638]: I0706 23:22:12.347839 2638 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-ca-bundle\") pod \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\" (UID: \"f90c94a7-4d42-4be6-aae4-d8a8f872c0b6\") " Jul 6 23:22:12.352314 kubelet[2638]: I0706 23:22:12.352238 2638 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "f90c94a7-4d42-4be6-aae4-d8a8f872c0b6" (UID: "f90c94a7-4d42-4be6-aae4-d8a8f872c0b6"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 6 23:22:12.356071 kubelet[2638]: I0706 23:22:12.355966 2638 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "f90c94a7-4d42-4be6-aae4-d8a8f872c0b6" (UID: "f90c94a7-4d42-4be6-aae4-d8a8f872c0b6"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 6 23:22:12.356534 kubelet[2638]: I0706 23:22:12.356478 2638 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-kube-api-access-rcrwk" (OuterVolumeSpecName: "kube-api-access-rcrwk") pod "f90c94a7-4d42-4be6-aae4-d8a8f872c0b6" (UID: "f90c94a7-4d42-4be6-aae4-d8a8f872c0b6"). InnerVolumeSpecName "kube-api-access-rcrwk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 6 23:22:12.448734 kubelet[2638]: I0706 23:22:12.448656 2638 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" Jul 6 23:22:12.448734 kubelet[2638]: I0706 23:22:12.448699 2638 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcrwk\" (UniqueName: \"kubernetes.io/projected/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-kube-api-access-rcrwk\") on node \"localhost\" DevicePath \"\"" Jul 6 23:22:12.448734 kubelet[2638]: I0706 23:22:12.448709 2638 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" Jul 6 23:22:12.542906 systemd[1]: var-lib-kubelet-pods-f90c94a7\x2d4d42\x2d4be6\x2daae4\x2dd8a8f872c0b6-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drcrwk.mount: Deactivated successfully. Jul 6 23:22:12.543007 systemd[1]: var-lib-kubelet-pods-f90c94a7\x2d4d42\x2d4be6\x2daae4\x2dd8a8f872c0b6-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 6 23:22:12.976464 systemd[1]: Removed slice kubepods-besteffort-podf90c94a7_4d42_4be6_aae4_d8a8f872c0b6.slice - libcontainer container kubepods-besteffort-podf90c94a7_4d42_4be6_aae4_d8a8f872c0b6.slice. Jul 6 23:22:13.118272 kubelet[2638]: I0706 23:22:13.118234 2638 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:22:13.210436 systemd[1]: Created slice kubepods-besteffort-pod101ae86d_321f_4c4a_ba4d_37e4973ad678.slice - libcontainer container kubepods-besteffort-pod101ae86d_321f_4c4a_ba4d_37e4973ad678.slice. 
Jul 6 23:22:13.254353 kubelet[2638]: I0706 23:22:13.254219 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/101ae86d-321f-4c4a-ba4d-37e4973ad678-whisker-backend-key-pair\") pod \"whisker-78559d498-qj8dq\" (UID: \"101ae86d-321f-4c4a-ba4d-37e4973ad678\") " pod="calico-system/whisker-78559d498-qj8dq" Jul 6 23:22:13.254353 kubelet[2638]: I0706 23:22:13.254272 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/101ae86d-321f-4c4a-ba4d-37e4973ad678-whisker-ca-bundle\") pod \"whisker-78559d498-qj8dq\" (UID: \"101ae86d-321f-4c4a-ba4d-37e4973ad678\") " pod="calico-system/whisker-78559d498-qj8dq" Jul 6 23:22:13.254353 kubelet[2638]: I0706 23:22:13.254307 2638 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mg8x\" (UniqueName: \"kubernetes.io/projected/101ae86d-321f-4c4a-ba4d-37e4973ad678-kube-api-access-6mg8x\") pod \"whisker-78559d498-qj8dq\" (UID: \"101ae86d-321f-4c4a-ba4d-37e4973ad678\") " pod="calico-system/whisker-78559d498-qj8dq" Jul 6 23:22:13.513735 containerd[1523]: time="2025-07-06T23:22:13.513610226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78559d498-qj8dq,Uid:101ae86d-321f-4c4a-ba4d-37e4973ad678,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:13.933094 systemd-networkd[1453]: calieb427817edc: Link UP Jul 6 23:22:13.933912 systemd-networkd[1453]: calieb427817edc: Gained carrier Jul 6 23:22:13.956760 containerd[1523]: 2025-07-06 23:22:13.539 [INFO][3785] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:22:13.956760 containerd[1523]: 2025-07-06 23:22:13.606 [INFO][3785] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--78559d498--qj8dq-eth0 whisker-78559d498- calico-system 101ae86d-321f-4c4a-ba4d-37e4973ad678 860 0 2025-07-06 23:22:13 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:78559d498 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-78559d498-qj8dq eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calieb427817edc [] [] }} ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-" Jul 6 23:22:13.956760 containerd[1523]: 2025-07-06 23:22:13.607 [INFO][3785] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-eth0" Jul 6 23:22:13.956760 containerd[1523]: 2025-07-06 23:22:13.850 [INFO][3814] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" HandleID="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Workload="localhost-k8s-whisker--78559d498--qj8dq-eth0" Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.850 [INFO][3814] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" 
HandleID="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Workload="localhost-k8s-whisker--78559d498--qj8dq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000518f10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-78559d498-qj8dq", "timestamp":"2025-07-06 23:22:13.850294044 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.850 [INFO][3814] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.850 [INFO][3814] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.851 [INFO][3814] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.878 [INFO][3814] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" host="localhost" Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.885 [INFO][3814] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.892 [INFO][3814] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.895 [INFO][3814] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.898 [INFO][3814] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:13.957342 containerd[1523]: 2025-07-06 23:22:13.898 [INFO][3814] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" host="localhost" Jul 6 23:22:13.958374 containerd[1523]: 2025-07-06 23:22:13.901 [INFO][3814] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f Jul 6 23:22:13.958374 containerd[1523]: 2025-07-06 23:22:13.907 [INFO][3814] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" host="localhost" Jul 6 23:22:13.958374 containerd[1523]: 2025-07-06 23:22:13.914 [INFO][3814] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" host="localhost" Jul 6 23:22:13.958374 containerd[1523]: 2025-07-06 23:22:13.914 [INFO][3814] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" host="localhost" Jul 6 23:22:13.958374 containerd[1523]: 2025-07-06 23:22:13.914 [INFO][3814] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:22:13.958374 containerd[1523]: 2025-07-06 23:22:13.914 [INFO][3814] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" HandleID="k8s-pod-network.701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Workload="localhost-k8s-whisker--78559d498--qj8dq-eth0" Jul 6 23:22:13.958490 containerd[1523]: 2025-07-06 23:22:13.919 [INFO][3785] cni-plugin/k8s.go 418: Populated endpoint ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78559d498--qj8dq-eth0", GenerateName:"whisker-78559d498-", Namespace:"calico-system", SelfLink:"", UID:"101ae86d-321f-4c4a-ba4d-37e4973ad678", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 22, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78559d498", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-78559d498-qj8dq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calieb427817edc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:13.958490 containerd[1523]: 2025-07-06 23:22:13.920 [INFO][3785] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-eth0" Jul 6 23:22:13.958574 containerd[1523]: 2025-07-06 23:22:13.921 [INFO][3785] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb427817edc ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-eth0" Jul 6 23:22:13.958574 containerd[1523]: 2025-07-06 23:22:13.934 [INFO][3785] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-eth0" Jul 6 23:22:13.958615 containerd[1523]: 2025-07-06 23:22:13.938 [INFO][3785] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--78559d498--qj8dq-eth0", GenerateName:"whisker-78559d498-", Namespace:"calico-system", SelfLink:"", UID:"101ae86d-321f-4c4a-ba4d-37e4973ad678", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 22, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"78559d498", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f", Pod:"whisker-78559d498-qj8dq", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calieb427817edc", MAC:"2a:7d:5a:98:ee:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:13.958666 containerd[1523]: 2025-07-06 23:22:13.951 [INFO][3785] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" Namespace="calico-system" Pod="whisker-78559d498-qj8dq" WorkloadEndpoint="localhost-k8s-whisker--78559d498--qj8dq-eth0" Jul 6 23:22:13.980955 containerd[1523]: time="2025-07-06T23:22:13.980891273Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\" id:\"cff0e36657376bde46b9b7d89c48840bb64ecac7fbe7dbbaed2ba6d1aff15caa\" pid:3920 exit_status:1 exited_at:{seconds:1751844133 nanos:980532212}" Jul 6 23:22:14.091669 containerd[1523]: time="2025-07-06T23:22:14.091572579Z" level=info msg="connecting to shim 701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f" address="unix:///run/containerd/s/870422dd0c2aea82178f7c4f3beefaa8baedf248938d49e34d2f85a345efec74" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:14.135757 containerd[1523]: time="2025-07-06T23:22:14.135696480Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\" id:\"9d177a6e65e9fe3d03fc473ae493aca3f3a89aafc3055e282497a9a4474c5b79\" pid:3983 exit_status:1 exited_at:{seconds:1751844134 nanos:135324060}" Jul 6 23:22:14.142072 systemd[1]: Started cri-containerd-701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f.scope - libcontainer container 701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f. 
Jul 6 23:22:14.158201 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:14.193635 containerd[1523]: time="2025-07-06T23:22:14.193512197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-78559d498-qj8dq,Uid:101ae86d-321f-4c4a-ba4d-37e4973ad678,Namespace:calico-system,Attempt:0,} returns sandbox id \"701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f\"" Jul 6 23:22:14.195918 containerd[1523]: time="2025-07-06T23:22:14.195877180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 6 23:22:14.238117 systemd-networkd[1453]: vxlan.calico: Link UP Jul 6 23:22:14.238126 systemd-networkd[1453]: vxlan.calico: Gained carrier Jul 6 23:22:14.241380 containerd[1523]: time="2025-07-06T23:22:14.241338898Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\" id:\"af9d996fb3e755ccc6f297c0a91ff55ca4a036b9ac50c4b6a370d50ac1106c0c\" pid:4060 exit_status:1 exited_at:{seconds:1751844134 nanos:239185429}" Jul 6 23:22:14.971431 kubelet[2638]: I0706 23:22:14.971240 2638 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f90c94a7-4d42-4be6-aae4-d8a8f872c0b6" path="/var/lib/kubelet/pods/f90c94a7-4d42-4be6-aae4-d8a8f872c0b6/volumes" Jul 6 23:22:15.001278 systemd-networkd[1453]: calieb427817edc: Gained IPv6LL Jul 6 23:22:15.214692 containerd[1523]: time="2025-07-06T23:22:15.214637735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Jul 6 23:22:15.219054 containerd[1523]: time="2025-07-06T23:22:15.218943850Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.023024383s" Jul 6 23:22:15.219054 containerd[1523]: time="2025-07-06T23:22:15.219030663Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Jul 6 23:22:15.225600 containerd[1523]: time="2025-07-06T23:22:15.225471032Z" level=info msg="CreateContainer within sandbox \"701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 6 23:22:15.236596 containerd[1523]: time="2025-07-06T23:22:15.236536364Z" level=info msg="Container 5a2cce5352c32d89a5327f95b079682417bc71517d58cadfeaa622ec8847c3b7: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:15.239509 containerd[1523]: time="2025-07-06T23:22:15.239460782Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:15.240195 containerd[1523]: time="2025-07-06T23:22:15.240154451Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:15.240765 containerd[1523]: time="2025-07-06T23:22:15.240728141Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" 
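For scale, the whisker pull above moved 4,605,614 bytes in the 1.023s containerd reports for the PullImage call, i.e. roughly 4.4 MiB at about 4.3 MiB/s; a trivial check of that arithmetic:

```go
package main

import "fmt"

func main() {
	// Numbers copied from the whisker pull entries above: bytes read at
	// "stop pulling image" and the duration reported for PullImage.
	const bytesRead = 4605614
	const seconds = 1.023024383
	fmt.Printf("~%.1f MiB fetched at ~%.1f MiB/s\n",
		bytesRead/1048576.0, bytesRead/1048576.0/seconds)
}
```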
Jul 6 23:22:15.249563 containerd[1523]: time="2025-07-06T23:22:15.249504635Z" level=info msg="CreateContainer within sandbox \"701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"5a2cce5352c32d89a5327f95b079682417bc71517d58cadfeaa622ec8847c3b7\"" Jul 6 23:22:15.251306 containerd[1523]: time="2025-07-06T23:22:15.251269071Z" level=info msg="StartContainer for \"5a2cce5352c32d89a5327f95b079682417bc71517d58cadfeaa622ec8847c3b7\"" Jul 6 23:22:15.252685 containerd[1523]: time="2025-07-06T23:22:15.252650328Z" level=info msg="connecting to shim 5a2cce5352c32d89a5327f95b079682417bc71517d58cadfeaa622ec8847c3b7" address="unix:///run/containerd/s/870422dd0c2aea82178f7c4f3beefaa8baedf248938d49e34d2f85a345efec74" protocol=ttrpc version=3 Jul 6 23:22:15.275234 systemd[1]: Started cri-containerd-5a2cce5352c32d89a5327f95b079682417bc71517d58cadfeaa622ec8847c3b7.scope - libcontainer container 5a2cce5352c32d89a5327f95b079682417bc71517d58cadfeaa622ec8847c3b7. Jul 6 23:22:15.312107 containerd[1523]: time="2025-07-06T23:22:15.312065031Z" level=info msg="StartContainer for \"5a2cce5352c32d89a5327f95b079682417bc71517d58cadfeaa622ec8847c3b7\" returns successfully" Jul 6 23:22:15.315339 containerd[1523]: time="2025-07-06T23:22:15.314675800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 6 23:22:16.217485 systemd-networkd[1453]: vxlan.calico: Gained IPv6LL Jul 6 23:22:16.820499 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2778547149.mount: Deactivated successfully. Jul 6 23:22:16.907694 containerd[1523]: time="2025-07-06T23:22:16.907643945Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:16.908657 containerd[1523]: time="2025-07-06T23:22:16.908476191Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Jul 6 23:22:16.909477 containerd[1523]: time="2025-07-06T23:22:16.909432776Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:16.911785 containerd[1523]: time="2025-07-06T23:22:16.911633790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:16.913446 containerd[1523]: time="2025-07-06T23:22:16.913405179Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 1.598689452s" Jul 6 23:22:16.917052 containerd[1523]: time="2025-07-06T23:22:16.916129312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Jul 6 23:22:16.922958 containerd[1523]: time="2025-07-06T23:22:16.922902779Z" level=info msg="CreateContainer within sandbox \"701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f\" for container 
&ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 6 23:22:16.959951 containerd[1523]: time="2025-07-06T23:22:16.959090868Z" level=info msg="Container 7f03db47ef7ff78a1be0a8deb3cf267ab5925922b9e8a152aa5ef96f63e74c93: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:16.973666 containerd[1523]: time="2025-07-06T23:22:16.973620231Z" level=info msg="CreateContainer within sandbox \"701df0c3f98bc3cdf4f095db43de4a8ddfc8888403a043f65c2795866316c52f\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"7f03db47ef7ff78a1be0a8deb3cf267ab5925922b9e8a152aa5ef96f63e74c93\"" Jul 6 23:22:16.974419 containerd[1523]: time="2025-07-06T23:22:16.974353022Z" level=info msg="StartContainer for \"7f03db47ef7ff78a1be0a8deb3cf267ab5925922b9e8a152aa5ef96f63e74c93\"" Jul 6 23:22:16.975622 containerd[1523]: time="2025-07-06T23:22:16.975588090Z" level=info msg="connecting to shim 7f03db47ef7ff78a1be0a8deb3cf267ab5925922b9e8a152aa5ef96f63e74c93" address="unix:///run/containerd/s/870422dd0c2aea82178f7c4f3beefaa8baedf248938d49e34d2f85a345efec74" protocol=ttrpc version=3 Jul 6 23:22:17.003735 systemd[1]: Started cri-containerd-7f03db47ef7ff78a1be0a8deb3cf267ab5925922b9e8a152aa5ef96f63e74c93.scope - libcontainer container 7f03db47ef7ff78a1be0a8deb3cf267ab5925922b9e8a152aa5ef96f63e74c93. Jul 6 23:22:17.058178 containerd[1523]: time="2025-07-06T23:22:17.058137865Z" level=info msg="StartContainer for \"7f03db47ef7ff78a1be0a8deb3cf267ab5925922b9e8a152aa5ef96f63e74c93\" returns successfully" Jul 6 23:22:20.969261 containerd[1523]: time="2025-07-06T23:22:20.969171041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7l8lq,Uid:a52ef37d-ef25-43c3-9985-70f4f2b627aa,Namespace:kube-system,Attempt:0,}" Jul 6 23:22:20.978225 containerd[1523]: time="2025-07-06T23:22:20.977206484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-krgms,Uid:3f6ff87d-315d-41c2-bd3f-018458086797,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:22:20.978225 containerd[1523]: time="2025-07-06T23:22:20.977798204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-8r9s6,Uid:1ffb1977-7418-45ee-b1c3-d3d44838d522,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:22:20.980905 containerd[1523]: time="2025-07-06T23:22:20.979990060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zdbvm,Uid:28330471-fba4-44a1-96d6-3512652f7a80,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:21.191410 systemd-networkd[1453]: cali9de86ee2d08: Link UP Jul 6 23:22:21.194202 systemd-networkd[1453]: cali9de86ee2d08: Gained carrier Jul 6 23:22:21.207481 kubelet[2638]: I0706 23:22:21.207256 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-78559d498-qj8dq" podStartSLOduration=5.484561705 podStartE2EDuration="8.20723845s" podCreationTimestamp="2025-07-06 23:22:13 +0000 UTC" firstStartedPulling="2025-07-06 23:22:14.195421666 +0000 UTC m=+33.333139807" lastFinishedPulling="2025-07-06 23:22:16.918098411 +0000 UTC m=+36.055816552" observedRunningTime="2025-07-06 23:22:17.161658647 +0000 UTC m=+36.299376828" watchObservedRunningTime="2025-07-06 23:22:21.20723845 +0000 UTC m=+40.344956591" Jul 6 23:22:21.217132 containerd[1523]: 2025-07-06 23:22:21.053 [INFO][4233] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0 coredns-7c65d6cfc9- kube-system a52ef37d-ef25-43c3-9985-70f4f2b627aa 781 
0 2025-07-06 23:21:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-7l8lq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9de86ee2d08 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-" Jul 6 23:22:21.217132 containerd[1523]: 2025-07-06 23:22:21.054 [INFO][4233] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" Jul 6 23:22:21.217132 containerd[1523]: 2025-07-06 23:22:21.115 [INFO][4292] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" HandleID="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Workload="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.116 [INFO][4292] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" HandleID="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Workload="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000483df0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-7l8lq", "timestamp":"2025-07-06 23:22:21.115611581 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.116 [INFO][4292] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.116 [INFO][4292] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.116 [INFO][4292] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.136 [INFO][4292] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" host="localhost" Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.146 [INFO][4292] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.152 [INFO][4292] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.155 [INFO][4292] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.158 [INFO][4292] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.217373 containerd[1523]: 2025-07-06 23:22:21.158 [INFO][4292] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" host="localhost" Jul 6 23:22:21.217570 containerd[1523]: 2025-07-06 23:22:21.160 [INFO][4292] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26 Jul 6 23:22:21.217570 containerd[1523]: 2025-07-06 23:22:21.166 [INFO][4292] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" host="localhost" Jul 6 23:22:21.217570 containerd[1523]: 2025-07-06 23:22:21.174 [INFO][4292] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" host="localhost" Jul 6 23:22:21.217570 containerd[1523]: 2025-07-06 23:22:21.174 [INFO][4292] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" host="localhost" Jul 6 23:22:21.217570 containerd[1523]: 2025-07-06 23:22:21.174 [INFO][4292] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
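The coredns endpoint being set up here declares three named ports, listed in decimal in the plugin.go 340 entry above (dns UDP 53, dns-tcp TCP 53, metrics TCP 9153); in the WorkloadEndpoint dump that follows they reappear as Port:0x35 and Port:0x23c1, most likely because the dump uses Go-syntax formatting, which renders unsigned integers in hexadecimal. They are the same ports either way:

```go
package main

import "fmt"

func main() {
	// 0x35 and 0x23c1 from the endpoint dump are just 53 and 9153.
	fmt.Println(0x35, 0x23c1)         // 53 9153
	fmt.Printf("%#x %#x\n", 53, 9153) // 0x35 0x23c1
}
```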
Jul 6 23:22:21.217570 containerd[1523]: 2025-07-06 23:22:21.174 [INFO][4292] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" HandleID="k8s-pod-network.b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Workload="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" Jul 6 23:22:21.217747 containerd[1523]: 2025-07-06 23:22:21.184 [INFO][4233] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a52ef37d-ef25-43c3-9985-70f4f2b627aa", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-7l8lq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9de86ee2d08", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.217819 containerd[1523]: 2025-07-06 23:22:21.184 [INFO][4233] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" Jul 6 23:22:21.217819 containerd[1523]: 2025-07-06 23:22:21.184 [INFO][4233] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9de86ee2d08 ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" Jul 6 23:22:21.217819 containerd[1523]: 2025-07-06 23:22:21.191 [INFO][4233] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" Jul 6 23:22:21.217882 
containerd[1523]: 2025-07-06 23:22:21.191 [INFO][4233] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a52ef37d-ef25-43c3-9985-70f4f2b627aa", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26", Pod:"coredns-7c65d6cfc9-7l8lq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9de86ee2d08", MAC:"9a:9b:ef:13:bb:0d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.217882 containerd[1523]: 2025-07-06 23:22:21.212 [INFO][4233] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" Namespace="kube-system" Pod="coredns-7c65d6cfc9-7l8lq" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--7l8lq-eth0" Jul 6 23:22:21.314252 systemd-networkd[1453]: cali891b1e61b15: Link UP Jul 6 23:22:21.316273 systemd-networkd[1453]: cali891b1e61b15: Gained carrier Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.080 [INFO][4259] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--zdbvm-eth0 csi-node-driver- calico-system 28330471-fba4-44a1-96d6-3512652f7a80 667 0 2025-07-06 23:21:59 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-zdbvm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali891b1e61b15 [] [] }} ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" 
Pod="csi-node-driver-zdbvm" WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.080 [INFO][4259] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" Pod="csi-node-driver-zdbvm" WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-eth0" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.126 [INFO][4301] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" HandleID="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Workload="localhost-k8s-csi--node--driver--zdbvm-eth0" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.126 [INFO][4301] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" HandleID="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Workload="localhost-k8s-csi--node--driver--zdbvm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400042d580), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-zdbvm", "timestamp":"2025-07-06 23:22:21.126124041 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.126 [INFO][4301] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.174 [INFO][4301] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.174 [INFO][4301] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.237 [INFO][4301] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.244 [INFO][4301] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.253 [INFO][4301] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.272 [INFO][4301] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.275 [INFO][4301] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.276 [INFO][4301] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.277 [INFO][4301] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1 Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.292 [INFO][4301] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.302 [INFO][4301] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.302 [INFO][4301] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" host="localhost" Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.302 [INFO][4301] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:22:21.338584 containerd[1523]: 2025-07-06 23:22:21.302 [INFO][4301] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" HandleID="k8s-pod-network.8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Workload="localhost-k8s-csi--node--driver--zdbvm-eth0" Jul 6 23:22:21.339843 containerd[1523]: 2025-07-06 23:22:21.308 [INFO][4259] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" Pod="csi-node-driver-zdbvm" WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--zdbvm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"28330471-fba4-44a1-96d6-3512652f7a80", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-zdbvm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali891b1e61b15", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.339843 containerd[1523]: 2025-07-06 23:22:21.308 [INFO][4259] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" Pod="csi-node-driver-zdbvm" WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-eth0" Jul 6 23:22:21.339843 containerd[1523]: 2025-07-06 23:22:21.308 [INFO][4259] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali891b1e61b15 ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" Pod="csi-node-driver-zdbvm" WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-eth0" Jul 6 23:22:21.339843 containerd[1523]: 2025-07-06 23:22:21.318 [INFO][4259] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" Pod="csi-node-driver-zdbvm" WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-eth0" Jul 6 23:22:21.339843 containerd[1523]: 2025-07-06 23:22:21.320 [INFO][4259] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" Pod="csi-node-driver-zdbvm" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--zdbvm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"28330471-fba4-44a1-96d6-3512652f7a80", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1", Pod:"csi-node-driver-zdbvm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali891b1e61b15", MAC:"3a:67:e7:c9:62:b9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.339843 containerd[1523]: 2025-07-06 23:22:21.335 [INFO][4259] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" Namespace="calico-system" Pod="csi-node-driver-zdbvm" WorkloadEndpoint="localhost-k8s-csi--node--driver--zdbvm-eth0" Jul 6 23:22:21.343394 containerd[1523]: time="2025-07-06T23:22:21.343149812Z" level=info msg="connecting to shim b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26" address="unix:///run/containerd/s/37a9e734cc78cafed6cca1e8f6c85880ae5dc5849ab48236806afc4464bc451a" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:21.380377 containerd[1523]: time="2025-07-06T23:22:21.380333413Z" level=info msg="connecting to shim 8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1" address="unix:///run/containerd/s/63720a1f59265466596b0cf852d5726347ad554bd56fed493dad0fc879db4517" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:21.386575 systemd[1]: Started cri-containerd-b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26.scope - libcontainer container b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26. Jul 6 23:22:21.401209 systemd-networkd[1453]: cali36f94ee5d65: Link UP Jul 6 23:22:21.402479 systemd-networkd[1453]: cali36f94ee5d65: Gained carrier Jul 6 23:22:21.412271 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:21.431341 systemd[1]: Started cri-containerd-8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1.scope - libcontainer container 8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1. 
Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.083 [INFO][4245] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0 calico-apiserver-fb787ffc- calico-apiserver 3f6ff87d-315d-41c2-bd3f-018458086797 789 0 2025-07-06 23:21:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:fb787ffc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-fb787ffc-krgms eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali36f94ee5d65 [] [] }} ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.083 [INFO][4245] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.126 [INFO][4303] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" HandleID="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Workload="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.126 [INFO][4303] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" HandleID="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Workload="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ab490), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-fb787ffc-krgms", "timestamp":"2025-07-06 23:22:21.126554578 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.127 [INFO][4303] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.302 [INFO][4303] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.302 [INFO][4303] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.338 [INFO][4303] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.346 [INFO][4303] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.360 [INFO][4303] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.365 [INFO][4303] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.368 [INFO][4303] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.368 [INFO][4303] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.372 [INFO][4303] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7 Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.381 [INFO][4303] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.390 [INFO][4303] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.390 [INFO][4303] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" host="localhost" Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.391 [INFO][4303] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:22:21.435137 containerd[1523]: 2025-07-06 23:22:21.391 [INFO][4303] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" HandleID="k8s-pod-network.40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Workload="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" Jul 6 23:22:21.436262 containerd[1523]: 2025-07-06 23:22:21.396 [INFO][4245] cni-plugin/k8s.go 418: Populated endpoint ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0", GenerateName:"calico-apiserver-fb787ffc-", Namespace:"calico-apiserver", SelfLink:"", UID:"3f6ff87d-315d-41c2-bd3f-018458086797", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb787ffc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-fb787ffc-krgms", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali36f94ee5d65", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.436262 containerd[1523]: 2025-07-06 23:22:21.396 [INFO][4245] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" Jul 6 23:22:21.436262 containerd[1523]: 2025-07-06 23:22:21.396 [INFO][4245] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali36f94ee5d65 ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" Jul 6 23:22:21.436262 containerd[1523]: 2025-07-06 23:22:21.400 [INFO][4245] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" Jul 6 23:22:21.436262 containerd[1523]: 2025-07-06 23:22:21.402 [INFO][4245] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0", GenerateName:"calico-apiserver-fb787ffc-", Namespace:"calico-apiserver", SelfLink:"", UID:"3f6ff87d-315d-41c2-bd3f-018458086797", ResourceVersion:"789", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb787ffc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7", Pod:"calico-apiserver-fb787ffc-krgms", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali36f94ee5d65", MAC:"2a:46:32:29:20:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.436262 containerd[1523]: 2025-07-06 23:22:21.426 [INFO][4245] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-krgms" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--krgms-eth0" Jul 6 23:22:21.472303 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:21.486611 containerd[1523]: time="2025-07-06T23:22:21.485237825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-7l8lq,Uid:a52ef37d-ef25-43c3-9985-70f4f2b627aa,Namespace:kube-system,Attempt:0,} returns sandbox id \"b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26\"" Jul 6 23:22:21.497161 containerd[1523]: time="2025-07-06T23:22:21.495332670Z" level=info msg="CreateContainer within sandbox \"b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 6 23:22:21.514734 containerd[1523]: time="2025-07-06T23:22:21.514049967Z" level=info msg="connecting to shim 40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7" address="unix:///run/containerd/s/1788f5d9a7b3f709b698adb9f9c5cef95869a7bc47d5c7545ce8a5fcda22008d" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:21.522467 containerd[1523]: time="2025-07-06T23:22:21.522073620Z" level=info msg="Container d6d5f22a761bd454f947e85b6cc23bdd14e866c43de5ab8c668e6c66a8b519d4: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:21.539207 containerd[1523]: time="2025-07-06T23:22:21.539153582Z" level=info msg="CreateContainer within sandbox 
\"b383a50d06a67dbf69500f9bc375d8fa0c075213775b47d68e11772fcd21ad26\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d6d5f22a761bd454f947e85b6cc23bdd14e866c43de5ab8c668e6c66a8b519d4\"" Jul 6 23:22:21.539970 containerd[1523]: time="2025-07-06T23:22:21.539790186Z" level=info msg="StartContainer for \"d6d5f22a761bd454f947e85b6cc23bdd14e866c43de5ab8c668e6c66a8b519d4\"" Jul 6 23:22:21.543334 containerd[1523]: time="2025-07-06T23:22:21.543285205Z" level=info msg="connecting to shim d6d5f22a761bd454f947e85b6cc23bdd14e866c43de5ab8c668e6c66a8b519d4" address="unix:///run/containerd/s/37a9e734cc78cafed6cca1e8f6c85880ae5dc5849ab48236806afc4464bc451a" protocol=ttrpc version=3 Jul 6 23:22:21.543477 containerd[1523]: time="2025-07-06T23:22:21.543397620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zdbvm,Uid:28330471-fba4-44a1-96d6-3512652f7a80,Namespace:calico-system,Attempt:0,} returns sandbox id \"8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1\"" Jul 6 23:22:21.547256 containerd[1523]: time="2025-07-06T23:22:21.547120348Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 6 23:22:21.549376 systemd-networkd[1453]: cali39d933a1c40: Link UP Jul 6 23:22:21.550148 systemd-networkd[1453]: cali39d933a1c40: Gained carrier Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.090 [INFO][4274] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0 calico-apiserver-fb787ffc- calico-apiserver 1ffb1977-7418-45ee-b1c3-d3d44838d522 791 0 2025-07-06 23:21:55 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:fb787ffc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-fb787ffc-8r9s6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali39d933a1c40 [] [] }} ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.091 [INFO][4274] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.135 [INFO][4312] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" HandleID="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Workload="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.135 [INFO][4312] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" HandleID="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Workload="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001114f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", 
"pod":"calico-apiserver-fb787ffc-8r9s6", "timestamp":"2025-07-06 23:22:21.135314168 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.135 [INFO][4312] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.391 [INFO][4312] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.391 [INFO][4312] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.440 [INFO][4312] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.448 [INFO][4312] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.469 [INFO][4312] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.489 [INFO][4312] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.497 [INFO][4312] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.497 [INFO][4312] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.512 [INFO][4312] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6 Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.526 [INFO][4312] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.535 [INFO][4312] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.535 [INFO][4312] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" host="localhost" Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.535 [INFO][4312] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:22:21.570364 containerd[1523]: 2025-07-06 23:22:21.535 [INFO][4312] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" HandleID="k8s-pod-network.50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Workload="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" Jul 6 23:22:21.571353 containerd[1523]: 2025-07-06 23:22:21.543 [INFO][4274] cni-plugin/k8s.go 418: Populated endpoint ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0", GenerateName:"calico-apiserver-fb787ffc-", Namespace:"calico-apiserver", SelfLink:"", UID:"1ffb1977-7418-45ee-b1c3-d3d44838d522", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb787ffc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-fb787ffc-8r9s6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali39d933a1c40", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.571353 containerd[1523]: 2025-07-06 23:22:21.543 [INFO][4274] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" Jul 6 23:22:21.571353 containerd[1523]: 2025-07-06 23:22:21.543 [INFO][4274] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali39d933a1c40 ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" Jul 6 23:22:21.571353 containerd[1523]: 2025-07-06 23:22:21.551 [INFO][4274] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" Jul 6 23:22:21.571353 containerd[1523]: 2025-07-06 23:22:21.553 [INFO][4274] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0", GenerateName:"calico-apiserver-fb787ffc-", Namespace:"calico-apiserver", SelfLink:"", UID:"1ffb1977-7418-45ee-b1c3-d3d44838d522", ResourceVersion:"791", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 55, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"fb787ffc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6", Pod:"calico-apiserver-fb787ffc-8r9s6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali39d933a1c40", MAC:"da:29:30:b4:2a:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:21.571353 containerd[1523]: 2025-07-06 23:22:21.566 [INFO][4274] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" Namespace="calico-apiserver" Pod="calico-apiserver-fb787ffc-8r9s6" WorkloadEndpoint="localhost-k8s-calico--apiserver--fb787ffc--8r9s6-eth0" Jul 6 23:22:21.575382 systemd[1]: Started cri-containerd-d6d5f22a761bd454f947e85b6cc23bdd14e866c43de5ab8c668e6c66a8b519d4.scope - libcontainer container d6d5f22a761bd454f947e85b6cc23bdd14e866c43de5ab8c668e6c66a8b519d4. Jul 6 23:22:21.600385 systemd[1]: Started cri-containerd-40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7.scope - libcontainer container 40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7. Jul 6 23:22:21.602929 containerd[1523]: time="2025-07-06T23:22:21.602778735Z" level=info msg="connecting to shim 50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6" address="unix:///run/containerd/s/da0c734d8f6e20cb723f93cac12f8ad2a3bb4d58f6431f5bfd2361c87b6d4f70" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:21.619519 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:21.640303 systemd[1]: Started cri-containerd-50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6.scope - libcontainer container 50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6. 
Jul 6 23:22:21.665910 containerd[1523]: time="2025-07-06T23:22:21.665869177Z" level=info msg="StartContainer for \"d6d5f22a761bd454f947e85b6cc23bdd14e866c43de5ab8c668e6c66a8b519d4\" returns successfully" Jul 6 23:22:21.674997 containerd[1523]: time="2025-07-06T23:22:21.674936807Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-krgms,Uid:3f6ff87d-315d-41c2-bd3f-018458086797,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7\"" Jul 6 23:22:21.709362 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:21.736652 containerd[1523]: time="2025-07-06T23:22:21.736483647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-fb787ffc-8r9s6,Uid:1ffb1977-7418-45ee-b1c3-d3d44838d522,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6\"" Jul 6 23:22:21.860113 systemd[1]: Started sshd@7-10.0.0.40:22-10.0.0.1:48766.service - OpenSSH per-connection server daemon (10.0.0.1:48766). Jul 6 23:22:21.929106 sshd[4579]: Accepted publickey for core from 10.0.0.1 port 48766 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:21.930447 sshd-session[4579]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:21.940636 systemd-logind[1505]: New session 8 of user core. Jul 6 23:22:21.947604 systemd[1]: Started session-8.scope - Session 8 of User core. Jul 6 23:22:21.969336 containerd[1523]: time="2025-07-06T23:22:21.969295570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cgcpv,Uid:3775b966-20b1-4599-afab-88aae053916c,Namespace:kube-system,Attempt:0,}" Jul 6 23:22:22.105705 systemd-networkd[1453]: calif30b51a8971: Link UP Jul 6 23:22:22.106909 systemd-networkd[1453]: calif30b51a8971: Gained carrier Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.016 [INFO][4588] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0 coredns-7c65d6cfc9- kube-system 3775b966-20b1-4599-afab-88aae053916c 784 0 2025-07-06 23:21:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7c65d6cfc9-cgcpv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif30b51a8971 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.016 [INFO][4588] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.049 [INFO][4613] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" HandleID="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" 
Workload="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.050 [INFO][4613] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" HandleID="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Workload="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c810), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7c65d6cfc9-cgcpv", "timestamp":"2025-07-06 23:22:22.049790134 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.050 [INFO][4613] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.051 [INFO][4613] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.051 [INFO][4613] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.061 [INFO][4613] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.068 [INFO][4613] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.076 [INFO][4613] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.079 [INFO][4613] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.082 [INFO][4613] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.082 [INFO][4613] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.084 [INFO][4613] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93 Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.091 [INFO][4613] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.100 [INFO][4613] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.100 [INFO][4613] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" host="localhost" Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.100 [INFO][4613] ipam/ipam_plugin.go 374: Released 
host-wide IPAM lock. Jul 6 23:22:22.122975 containerd[1523]: 2025-07-06 23:22:22.100 [INFO][4613] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" HandleID="k8s-pod-network.6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Workload="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" Jul 6 23:22:22.123684 containerd[1523]: 2025-07-06 23:22:22.103 [INFO][4588] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"3775b966-20b1-4599-afab-88aae053916c", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7c65d6cfc9-cgcpv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif30b51a8971", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:22.123684 containerd[1523]: 2025-07-06 23:22:22.103 [INFO][4588] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" Jul 6 23:22:22.123684 containerd[1523]: 2025-07-06 23:22:22.103 [INFO][4588] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif30b51a8971 ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" Jul 6 23:22:22.123684 containerd[1523]: 2025-07-06 23:22:22.108 [INFO][4588] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" Jul 6 
23:22:22.123684 containerd[1523]: 2025-07-06 23:22:22.108 [INFO][4588] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"3775b966-20b1-4599-afab-88aae053916c", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93", Pod:"coredns-7c65d6cfc9-cgcpv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif30b51a8971", MAC:"62:7e:f3:c9:99:fb", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:22.123684 containerd[1523]: 2025-07-06 23:22:22.118 [INFO][4588] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cgcpv" WorkloadEndpoint="localhost-k8s-coredns--7c65d6cfc9--cgcpv-eth0" Jul 6 23:22:22.154034 containerd[1523]: time="2025-07-06T23:22:22.153612256Z" level=info msg="connecting to shim 6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93" address="unix:///run/containerd/s/a28739e87b165fa3f16d7c69074229d0fc66814dd2f872c6ce0303926e1bf6f4" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:22.204185 systemd[1]: Started cri-containerd-6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93.scope - libcontainer container 6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93. 
Jul 6 23:22:22.206536 kubelet[2638]: I0706 23:22:22.206460 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-7l8lq" podStartSLOduration=36.206436293 podStartE2EDuration="36.206436293s" podCreationTimestamp="2025-07-06 23:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:22:22.180102405 +0000 UTC m=+41.317820546" watchObservedRunningTime="2025-07-06 23:22:22.206436293 +0000 UTC m=+41.344154434" Jul 6 23:22:22.236938 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:22.270533 containerd[1523]: time="2025-07-06T23:22:22.270464365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cgcpv,Uid:3775b966-20b1-4599-afab-88aae053916c,Namespace:kube-system,Attempt:0,} returns sandbox id \"6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93\"" Jul 6 23:22:22.279125 containerd[1523]: time="2025-07-06T23:22:22.279056624Z" level=info msg="CreateContainer within sandbox \"6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 6 23:22:22.292678 containerd[1523]: time="2025-07-06T23:22:22.292632361Z" level=info msg="Container 4a29967d93f53311b5b4487b2ce27e3d82f55829504b991826c1e4069ccca699: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:22.302136 sshd[4586]: Connection closed by 10.0.0.1 port 48766 Jul 6 23:22:22.302719 containerd[1523]: time="2025-07-06T23:22:22.302248111Z" level=info msg="CreateContainer within sandbox \"6ae33388393483ca68675c3656d4b40c75fc77185fe10aa777fac49e816c0e93\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4a29967d93f53311b5b4487b2ce27e3d82f55829504b991826c1e4069ccca699\"" Jul 6 23:22:22.303035 containerd[1523]: time="2025-07-06T23:22:22.302967483Z" level=info msg="StartContainer for \"4a29967d93f53311b5b4487b2ce27e3d82f55829504b991826c1e4069ccca699\"" Jul 6 23:22:22.303816 sshd-session[4579]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:22.304570 containerd[1523]: time="2025-07-06T23:22:22.304520521Z" level=info msg="connecting to shim 4a29967d93f53311b5b4487b2ce27e3d82f55829504b991826c1e4069ccca699" address="unix:///run/containerd/s/a28739e87b165fa3f16d7c69074229d0fc66814dd2f872c6ce0303926e1bf6f4" protocol=ttrpc version=3 Jul 6 23:22:22.312994 systemd[1]: sshd@7-10.0.0.40:22-10.0.0.1:48766.service: Deactivated successfully. Jul 6 23:22:22.318770 systemd[1]: session-8.scope: Deactivated successfully. Jul 6 23:22:22.321071 systemd-logind[1505]: Session 8 logged out. Waiting for processes to exit. Jul 6 23:22:22.325504 systemd-logind[1505]: Removed session 8. Jul 6 23:22:22.349322 systemd[1]: Started cri-containerd-4a29967d93f53311b5b4487b2ce27e3d82f55829504b991826c1e4069ccca699.scope - libcontainer container 4a29967d93f53311b5b4487b2ce27e3d82f55829504b991826c1e4069ccca699. 
Jul 6 23:22:22.379161 containerd[1523]: time="2025-07-06T23:22:22.378992209Z" level=info msg="StartContainer for \"4a29967d93f53311b5b4487b2ce27e3d82f55829504b991826c1e4069ccca699\" returns successfully" Jul 6 23:22:22.525381 containerd[1523]: time="2025-07-06T23:22:22.525297486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:22.526346 containerd[1523]: time="2025-07-06T23:22:22.526309175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Jul 6 23:22:22.527643 containerd[1523]: time="2025-07-06T23:22:22.527606701Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:22.529566 containerd[1523]: time="2025-07-06T23:22:22.529490582Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:22.530124 containerd[1523]: time="2025-07-06T23:22:22.530091939Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 982.929145ms" Jul 6 23:22:22.530197 containerd[1523]: time="2025-07-06T23:22:22.530123303Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Jul 6 23:22:22.531848 containerd[1523]: time="2025-07-06T23:22:22.531820800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 6 23:22:22.533087 containerd[1523]: time="2025-07-06T23:22:22.532777683Z" level=info msg="CreateContainer within sandbox \"8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 6 23:22:22.549339 containerd[1523]: time="2025-07-06T23:22:22.549283554Z" level=info msg="Container d33ca648f7cd6ac68a10625f92b3780ca1253691f24bde7cb5ca478b53b173d7: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:22.559164 containerd[1523]: time="2025-07-06T23:22:22.559106731Z" level=info msg="CreateContainer within sandbox \"8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"d33ca648f7cd6ac68a10625f92b3780ca1253691f24bde7cb5ca478b53b173d7\"" Jul 6 23:22:22.559717 containerd[1523]: time="2025-07-06T23:22:22.559692606Z" level=info msg="StartContainer for \"d33ca648f7cd6ac68a10625f92b3780ca1253691f24bde7cb5ca478b53b173d7\"" Jul 6 23:22:22.561434 containerd[1523]: time="2025-07-06T23:22:22.561390983Z" level=info msg="connecting to shim d33ca648f7cd6ac68a10625f92b3780ca1253691f24bde7cb5ca478b53b173d7" address="unix:///run/containerd/s/63720a1f59265466596b0cf852d5726347ad554bd56fed493dad0fc879db4517" protocol=ttrpc version=3 Jul 6 23:22:22.588316 systemd[1]: Started cri-containerd-d33ca648f7cd6ac68a10625f92b3780ca1253691f24bde7cb5ca478b53b173d7.scope - libcontainer container d33ca648f7cd6ac68a10625f92b3780ca1253691f24bde7cb5ca478b53b173d7. 
Jul 6 23:22:22.617149 systemd-networkd[1453]: cali36f94ee5d65: Gained IPv6LL Jul 6 23:22:22.628981 containerd[1523]: time="2025-07-06T23:22:22.628833251Z" level=info msg="StartContainer for \"d33ca648f7cd6ac68a10625f92b3780ca1253691f24bde7cb5ca478b53b173d7\" returns successfully" Jul 6 23:22:22.937195 systemd-networkd[1453]: cali39d933a1c40: Gained IPv6LL Jul 6 23:22:22.968755 containerd[1523]: time="2025-07-06T23:22:22.968685648Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9fbbd5ffc-k79h8,Uid:08864749-3109-4688-85c3-afcbdbdd39a1,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:22.969223 containerd[1523]: time="2025-07-06T23:22:22.968690809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-6smwc,Uid:dd2f3b40-ded4-4a5a-93a8-55bc4dbee958,Namespace:calico-system,Attempt:0,}" Jul 6 23:22:23.110875 systemd-networkd[1453]: cali5524807e688: Link UP Jul 6 23:22:23.111584 systemd-networkd[1453]: cali5524807e688: Gained carrier Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.023 [INFO][4753] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0 calico-kube-controllers-9fbbd5ffc- calico-system 08864749-3109-4688-85c3-afcbdbdd39a1 788 0 2025-07-06 23:21:59 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:9fbbd5ffc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-9fbbd5ffc-k79h8 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali5524807e688 [] [] }} ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.023 [INFO][4753] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.057 [INFO][4780] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" HandleID="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Workload="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.058 [INFO][4780] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" HandleID="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Workload="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a0e70), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-9fbbd5ffc-k79h8", "timestamp":"2025-07-06 23:22:23.057825154 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.058 [INFO][4780] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.058 [INFO][4780] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.058 [INFO][4780] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.068 [INFO][4780] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.077 [INFO][4780] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.082 [INFO][4780] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.085 [INFO][4780] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.088 [INFO][4780] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.089 [INFO][4780] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.090 [INFO][4780] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2 Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.095 [INFO][4780] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.102 [INFO][4780] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.103 [INFO][4780] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" host="localhost" Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.103 [INFO][4780] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:22:23.126124 containerd[1523]: 2025-07-06 23:22:23.103 [INFO][4780] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" HandleID="k8s-pod-network.a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Workload="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" Jul 6 23:22:23.126913 containerd[1523]: 2025-07-06 23:22:23.108 [INFO][4753] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0", GenerateName:"calico-kube-controllers-9fbbd5ffc-", Namespace:"calico-system", SelfLink:"", UID:"08864749-3109-4688-85c3-afcbdbdd39a1", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9fbbd5ffc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-9fbbd5ffc-k79h8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5524807e688", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:23.126913 containerd[1523]: 2025-07-06 23:22:23.108 [INFO][4753] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" Jul 6 23:22:23.126913 containerd[1523]: 2025-07-06 23:22:23.108 [INFO][4753] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5524807e688 ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" Jul 6 23:22:23.126913 containerd[1523]: 2025-07-06 23:22:23.111 [INFO][4753] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" Jul 6 23:22:23.126913 containerd[1523]: 2025-07-06 23:22:23.112 [INFO][4753] cni-plugin/k8s.go 446: Added Mac, interface name, and 
active container ID to endpoint ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0", GenerateName:"calico-kube-controllers-9fbbd5ffc-", Namespace:"calico-system", SelfLink:"", UID:"08864749-3109-4688-85c3-afcbdbdd39a1", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"9fbbd5ffc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2", Pod:"calico-kube-controllers-9fbbd5ffc-k79h8", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali5524807e688", MAC:"f2:d5:c6:95:4c:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:23.126913 containerd[1523]: 2025-07-06 23:22:23.122 [INFO][4753] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" Namespace="calico-system" Pod="calico-kube-controllers-9fbbd5ffc-k79h8" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--9fbbd5ffc--k79h8-eth0" Jul 6 23:22:23.129169 systemd-networkd[1453]: cali891b1e61b15: Gained IPv6LL Jul 6 23:22:23.149805 containerd[1523]: time="2025-07-06T23:22:23.149753706Z" level=info msg="connecting to shim a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2" address="unix:///run/containerd/s/bf044d49aa170545593c7e06b7294702fbc450a0f06117c833a45b812eda13ff" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:23.185742 kubelet[2638]: I0706 23:22:23.185670 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-cgcpv" podStartSLOduration=37.185649706 podStartE2EDuration="37.185649706s" podCreationTimestamp="2025-07-06 23:21:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:22:23.185324065 +0000 UTC m=+42.323042246" watchObservedRunningTime="2025-07-06 23:22:23.185649706 +0000 UTC m=+42.323367847" Jul 6 23:22:23.188546 systemd[1]: Started cri-containerd-a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2.scope - libcontainer container a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2. 
Jul 6 23:22:23.208595 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:23.242632 systemd-networkd[1453]: calif0935c5f54e: Link UP Jul 6 23:22:23.242919 systemd-networkd[1453]: calif0935c5f54e: Gained carrier Jul 6 23:22:23.257331 systemd-networkd[1453]: cali9de86ee2d08: Gained IPv6LL Jul 6 23:22:23.263677 containerd[1523]: time="2025-07-06T23:22:23.263638358Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-9fbbd5ffc-k79h8,Uid:08864749-3109-4688-85c3-afcbdbdd39a1,Namespace:calico-system,Attempt:0,} returns sandbox id \"a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2\"" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.033 [INFO][4765] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--58fd7646b9--6smwc-eth0 goldmane-58fd7646b9- calico-system dd2f3b40-ded4-4a5a-93a8-55bc4dbee958 790 0 2025-07-06 23:21:59 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-58fd7646b9-6smwc eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calif0935c5f54e [] [] }} ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.033 [INFO][4765] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.080 [INFO][4788] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" HandleID="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Workload="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.080 [INFO][4788] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" HandleID="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Workload="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c36c0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-58fd7646b9-6smwc", "timestamp":"2025-07-06 23:22:23.080144579 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.080 [INFO][4788] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.103 [INFO][4788] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.103 [INFO][4788] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.170 [INFO][4788] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.185 [INFO][4788] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.198 [INFO][4788] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.201 [INFO][4788] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.209 [INFO][4788] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.209 [INFO][4788] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.211 [INFO][4788] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767 Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.219 [INFO][4788] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.231 [INFO][4788] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.231 [INFO][4788] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" host="localhost" Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.231 [INFO][4788] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:22:23.269034 containerd[1523]: 2025-07-06 23:22:23.231 [INFO][4788] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" HandleID="k8s-pod-network.7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Workload="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" Jul 6 23:22:23.270807 containerd[1523]: 2025-07-06 23:22:23.234 [INFO][4765] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--58fd7646b9--6smwc-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"dd2f3b40-ded4-4a5a-93a8-55bc4dbee958", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-58fd7646b9-6smwc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif0935c5f54e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:23.270807 containerd[1523]: 2025-07-06 23:22:23.234 [INFO][4765] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" Jul 6 23:22:23.270807 containerd[1523]: 2025-07-06 23:22:23.234 [INFO][4765] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif0935c5f54e ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" Jul 6 23:22:23.270807 containerd[1523]: 2025-07-06 23:22:23.243 [INFO][4765] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" Jul 6 23:22:23.270807 containerd[1523]: 2025-07-06 23:22:23.244 [INFO][4765] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--58fd7646b9--6smwc-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"dd2f3b40-ded4-4a5a-93a8-55bc4dbee958", ResourceVersion:"790", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 21, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767", Pod:"goldmane-58fd7646b9-6smwc", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calif0935c5f54e", MAC:"72:d7:11:6b:e8:b9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:22:23.270807 containerd[1523]: 2025-07-06 23:22:23.261 [INFO][4765] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" Namespace="calico-system" Pod="goldmane-58fd7646b9-6smwc" WorkloadEndpoint="localhost-k8s-goldmane--58fd7646b9--6smwc-eth0" Jul 6 23:22:23.305788 containerd[1523]: time="2025-07-06T23:22:23.305740172Z" level=info msg="connecting to shim 7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767" address="unix:///run/containerd/s/5de26ba25cb9bd4969a2ac53e5ab9ba0cc506f815b12d28c144733b282c3f84d" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:22:23.344746 systemd[1]: Started cri-containerd-7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767.scope - libcontainer container 7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767. 
Jul 6 23:22:23.363190 systemd-resolved[1353]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jul 6 23:22:23.385147 systemd-networkd[1453]: calif30b51a8971: Gained IPv6LL Jul 6 23:22:23.397417 containerd[1523]: time="2025-07-06T23:22:23.397374208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-6smwc,Uid:dd2f3b40-ded4-4a5a-93a8-55bc4dbee958,Namespace:calico-system,Attempt:0,} returns sandbox id \"7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767\"" Jul 6 23:22:24.153431 systemd-networkd[1453]: cali5524807e688: Gained IPv6LL Jul 6 23:22:24.313577 containerd[1523]: time="2025-07-06T23:22:24.313531540Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:24.314353 containerd[1523]: time="2025-07-06T23:22:24.314315035Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Jul 6 23:22:24.315313 containerd[1523]: time="2025-07-06T23:22:24.315285874Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:24.317361 containerd[1523]: time="2025-07-06T23:22:24.317298519Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:24.318170 containerd[1523]: time="2025-07-06T23:22:24.318043650Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 1.786193405s" Jul 6 23:22:24.318170 containerd[1523]: time="2025-07-06T23:22:24.318079534Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 6 23:22:24.319710 containerd[1523]: time="2025-07-06T23:22:24.319645645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 6 23:22:24.321251 containerd[1523]: time="2025-07-06T23:22:24.321218436Z" level=info msg="CreateContainer within sandbox \"40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 6 23:22:24.329297 containerd[1523]: time="2025-07-06T23:22:24.329239094Z" level=info msg="Container 2982f4a095bed037e13188fc48c0ad08eee00dee50486c38c42f708d05b6d1c8: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:24.336824 containerd[1523]: time="2025-07-06T23:22:24.336776092Z" level=info msg="CreateContainer within sandbox \"40514732c81cff1f34a6c0a65837cae2191e7294cae7c4f73ca399bbea3c89b7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2982f4a095bed037e13188fc48c0ad08eee00dee50486c38c42f708d05b6d1c8\"" Jul 6 23:22:24.337661 containerd[1523]: time="2025-07-06T23:22:24.337621235Z" level=info msg="StartContainer for \"2982f4a095bed037e13188fc48c0ad08eee00dee50486c38c42f708d05b6d1c8\"" Jul 6 23:22:24.339207 containerd[1523]: time="2025-07-06T23:22:24.339053210Z" level=info msg="connecting to shim 
2982f4a095bed037e13188fc48c0ad08eee00dee50486c38c42f708d05b6d1c8" address="unix:///run/containerd/s/1788f5d9a7b3f709b698adb9f9c5cef95869a7bc47d5c7545ce8a5fcda22008d" protocol=ttrpc version=3 Jul 6 23:22:24.345159 systemd-networkd[1453]: calif0935c5f54e: Gained IPv6LL Jul 6 23:22:24.360502 systemd[1]: Started cri-containerd-2982f4a095bed037e13188fc48c0ad08eee00dee50486c38c42f708d05b6d1c8.scope - libcontainer container 2982f4a095bed037e13188fc48c0ad08eee00dee50486c38c42f708d05b6d1c8. Jul 6 23:22:24.410731 containerd[1523]: time="2025-07-06T23:22:24.407420541Z" level=info msg="StartContainer for \"2982f4a095bed037e13188fc48c0ad08eee00dee50486c38c42f708d05b6d1c8\" returns successfully" Jul 6 23:22:24.559346 containerd[1523]: time="2025-07-06T23:22:24.558633486Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:24.559605 containerd[1523]: time="2025-07-06T23:22:24.559558079Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 6 23:22:24.561585 containerd[1523]: time="2025-07-06T23:22:24.561550602Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 241.860271ms" Jul 6 23:22:24.561585 containerd[1523]: time="2025-07-06T23:22:24.561587606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 6 23:22:24.563741 containerd[1523]: time="2025-07-06T23:22:24.563668780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 6 23:22:24.566287 containerd[1523]: time="2025-07-06T23:22:24.566236093Z" level=info msg="CreateContainer within sandbox \"50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 6 23:22:24.575337 containerd[1523]: time="2025-07-06T23:22:24.574707605Z" level=info msg="Container d0f1f67256167d786e83e4423bbc191ef60b190e4a001f654450d8c8b9eaa1b6: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:24.587999 containerd[1523]: time="2025-07-06T23:22:24.587942378Z" level=info msg="CreateContainer within sandbox \"50255b8002ad13edea16ab07786056bc661864ac97f76d23801f687d25abe6a6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d0f1f67256167d786e83e4423bbc191ef60b190e4a001f654450d8c8b9eaa1b6\"" Jul 6 23:22:24.588889 containerd[1523]: time="2025-07-06T23:22:24.588650464Z" level=info msg="StartContainer for \"d0f1f67256167d786e83e4423bbc191ef60b190e4a001f654450d8c8b9eaa1b6\"" Jul 6 23:22:24.589998 containerd[1523]: time="2025-07-06T23:22:24.589968345Z" level=info msg="connecting to shim d0f1f67256167d786e83e4423bbc191ef60b190e4a001f654450d8c8b9eaa1b6" address="unix:///run/containerd/s/da0c734d8f6e20cb723f93cac12f8ad2a3bb4d58f6431f5bfd2361c87b6d4f70" protocol=ttrpc version=3 Jul 6 23:22:24.612239 systemd[1]: Started cri-containerd-d0f1f67256167d786e83e4423bbc191ef60b190e4a001f654450d8c8b9eaa1b6.scope - libcontainer container d0f1f67256167d786e83e4423bbc191ef60b190e4a001f654450d8c8b9eaa1b6. 
Jul 6 23:22:24.655895 containerd[1523]: time="2025-07-06T23:22:24.655288504Z" level=info msg="StartContainer for \"d0f1f67256167d786e83e4423bbc191ef60b190e4a001f654450d8c8b9eaa1b6\" returns successfully" Jul 6 23:22:25.219734 kubelet[2638]: I0706 23:22:25.218971 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-fb787ffc-8r9s6" podStartSLOduration=27.395510305 podStartE2EDuration="30.218951348s" podCreationTimestamp="2025-07-06 23:21:55 +0000 UTC" firstStartedPulling="2025-07-06 23:22:21.740047115 +0000 UTC m=+40.877765256" lastFinishedPulling="2025-07-06 23:22:24.563488158 +0000 UTC m=+43.701206299" observedRunningTime="2025-07-06 23:22:25.218248704 +0000 UTC m=+44.355966845" watchObservedRunningTime="2025-07-06 23:22:25.218951348 +0000 UTC m=+44.356669489" Jul 6 23:22:25.240809 kubelet[2638]: I0706 23:22:25.240290 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-fb787ffc-krgms" podStartSLOduration=27.62118985 podStartE2EDuration="30.240237403s" podCreationTimestamp="2025-07-06 23:21:55 +0000 UTC" firstStartedPulling="2025-07-06 23:22:21.70017096 +0000 UTC m=+40.837889101" lastFinishedPulling="2025-07-06 23:22:24.319218513 +0000 UTC m=+43.456936654" observedRunningTime="2025-07-06 23:22:25.239514037 +0000 UTC m=+44.377232178" watchObservedRunningTime="2025-07-06 23:22:25.240237403 +0000 UTC m=+44.377955504" Jul 6 23:22:26.124670 containerd[1523]: time="2025-07-06T23:22:26.124021501Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:26.126220 containerd[1523]: time="2025-07-06T23:22:26.126083861Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366" Jul 6 23:22:26.127030 containerd[1523]: time="2025-07-06T23:22:26.126939721Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:26.131949 containerd[1523]: time="2025-07-06T23:22:26.131025117Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:26.132938 containerd[1523]: time="2025-07-06T23:22:26.132895815Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 1.568859071s" Jul 6 23:22:26.132938 containerd[1523]: time="2025-07-06T23:22:26.132938140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\"" Jul 6 23:22:26.134515 containerd[1523]: time="2025-07-06T23:22:26.134442035Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 6 23:22:26.136397 containerd[1523]: time="2025-07-06T23:22:26.136361859Z" level=info msg="CreateContainer within sandbox \"8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1\" for 
container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 6 23:22:26.153078 containerd[1523]: time="2025-07-06T23:22:26.147355380Z" level=info msg="Container 9d7b25ac06b8d82cc19950cd145a555c0a4c3ee399668d0132882deacfca62a1: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:26.163898 containerd[1523]: time="2025-07-06T23:22:26.163762211Z" level=info msg="CreateContainer within sandbox \"8f8908cc103cf8188aa17a2a5e43cf99ceb3b997b20aaaa810ed1e558c3502b1\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9d7b25ac06b8d82cc19950cd145a555c0a4c3ee399668d0132882deacfca62a1\"" Jul 6 23:22:26.164856 containerd[1523]: time="2025-07-06T23:22:26.164815294Z" level=info msg="StartContainer for \"9d7b25ac06b8d82cc19950cd145a555c0a4c3ee399668d0132882deacfca62a1\"" Jul 6 23:22:26.166798 containerd[1523]: time="2025-07-06T23:22:26.166755920Z" level=info msg="connecting to shim 9d7b25ac06b8d82cc19950cd145a555c0a4c3ee399668d0132882deacfca62a1" address="unix:///run/containerd/s/63720a1f59265466596b0cf852d5726347ad554bd56fed493dad0fc879db4517" protocol=ttrpc version=3 Jul 6 23:22:26.189327 kubelet[2638]: I0706 23:22:26.189290 2638 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:22:26.189969 kubelet[2638]: I0706 23:22:26.189527 2638 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:22:26.209405 systemd[1]: Started cri-containerd-9d7b25ac06b8d82cc19950cd145a555c0a4c3ee399668d0132882deacfca62a1.scope - libcontainer container 9d7b25ac06b8d82cc19950cd145a555c0a4c3ee399668d0132882deacfca62a1. Jul 6 23:22:26.281593 containerd[1523]: time="2025-07-06T23:22:26.281546015Z" level=info msg="StartContainer for \"9d7b25ac06b8d82cc19950cd145a555c0a4c3ee399668d0132882deacfca62a1\" returns successfully" Jul 6 23:22:27.086954 kubelet[2638]: I0706 23:22:27.086863 2638 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 6 23:22:27.086954 kubelet[2638]: I0706 23:22:27.086927 2638 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 6 23:22:27.209184 kubelet[2638]: I0706 23:22:27.209124 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zdbvm" podStartSLOduration=23.620951415 podStartE2EDuration="28.209104583s" podCreationTimestamp="2025-07-06 23:21:59 +0000 UTC" firstStartedPulling="2025-07-06 23:22:21.545529019 +0000 UTC m=+40.683247160" lastFinishedPulling="2025-07-06 23:22:26.133682227 +0000 UTC m=+45.271400328" observedRunningTime="2025-07-06 23:22:27.208485832 +0000 UTC m=+46.346203933" watchObservedRunningTime="2025-07-06 23:22:27.209104583 +0000 UTC m=+46.346822684" Jul 6 23:22:27.318918 systemd[1]: Started sshd@8-10.0.0.40:22-10.0.0.1:60244.service - OpenSSH per-connection server daemon (10.0.0.1:60244). Jul 6 23:22:27.403504 sshd[5050]: Accepted publickey for core from 10.0.0.1 port 60244 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:27.405461 sshd-session[5050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:27.413934 systemd-logind[1505]: New session 9 of user core. Jul 6 23:22:27.421289 systemd[1]: Started session-9.scope - Session 9 of User core. 
Jul 6 23:22:27.729581 sshd[5052]: Connection closed by 10.0.0.1 port 60244 Jul 6 23:22:27.729811 sshd-session[5050]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:27.735836 systemd[1]: sshd@8-10.0.0.40:22-10.0.0.1:60244.service: Deactivated successfully. Jul 6 23:22:27.739135 systemd[1]: session-9.scope: Deactivated successfully. Jul 6 23:22:27.745127 systemd-logind[1505]: Session 9 logged out. Waiting for processes to exit. Jul 6 23:22:27.748643 systemd-logind[1505]: Removed session 9. Jul 6 23:22:28.551672 containerd[1523]: time="2025-07-06T23:22:28.551613979Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:28.553051 containerd[1523]: time="2025-07-06T23:22:28.552408748Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Jul 6 23:22:28.554029 containerd[1523]: time="2025-07-06T23:22:28.553925798Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:28.556357 containerd[1523]: time="2025-07-06T23:22:28.556286302Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:28.556818 containerd[1523]: time="2025-07-06T23:22:28.556776917Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 2.422259833s" Jul 6 23:22:28.556818 containerd[1523]: time="2025-07-06T23:22:28.556816201Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Jul 6 23:22:28.558289 containerd[1523]: time="2025-07-06T23:22:28.558264203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 6 23:22:28.571669 containerd[1523]: time="2025-07-06T23:22:28.570854771Z" level=info msg="CreateContainer within sandbox \"a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 6 23:22:28.579091 containerd[1523]: time="2025-07-06T23:22:28.579045647Z" level=info msg="Container 506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:28.588392 containerd[1523]: time="2025-07-06T23:22:28.588335846Z" level=info msg="CreateContainer within sandbox \"a5cfa6aa09e27619cb922fbcc79f0cd91aad68f13dbb89636f98378c63b7bcf2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050\"" Jul 6 23:22:28.589128 containerd[1523]: time="2025-07-06T23:22:28.589089090Z" level=info msg="StartContainer for \"506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050\"" Jul 6 23:22:28.592052 containerd[1523]: time="2025-07-06T23:22:28.591955930Z" level=info msg="connecting to shim 
506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050" address="unix:///run/containerd/s/bf044d49aa170545593c7e06b7294702fbc450a0f06117c833a45b812eda13ff" protocol=ttrpc version=3 Jul 6 23:22:28.618246 systemd[1]: Started cri-containerd-506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050.scope - libcontainer container 506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050. Jul 6 23:22:28.805550 containerd[1523]: time="2025-07-06T23:22:28.805332430Z" level=info msg="StartContainer for \"506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050\" returns successfully" Jul 6 23:22:29.220591 kubelet[2638]: I0706 23:22:29.220412 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-9fbbd5ffc-k79h8" podStartSLOduration=24.9294039 podStartE2EDuration="30.220393414s" podCreationTimestamp="2025-07-06 23:21:59 +0000 UTC" firstStartedPulling="2025-07-06 23:22:23.267047224 +0000 UTC m=+42.404765365" lastFinishedPulling="2025-07-06 23:22:28.558036738 +0000 UTC m=+47.695754879" observedRunningTime="2025-07-06 23:22:29.219305415 +0000 UTC m=+48.357023556" watchObservedRunningTime="2025-07-06 23:22:29.220393414 +0000 UTC m=+48.358111555" Jul 6 23:22:29.251431 containerd[1523]: time="2025-07-06T23:22:29.251378293Z" level=info msg="TaskExit event in podsandbox handler container_id:\"506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050\" id:\"95fc5f498df166a71c8ceb89fa2934fed4872fe5c4cdf994e791339bf5764fb7\" pid:5130 exited_at:{seconds:1751844149 nanos:250832953}" Jul 6 23:22:31.312801 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3878931145.mount: Deactivated successfully. Jul 6 23:22:31.818547 containerd[1523]: time="2025-07-06T23:22:31.818492597Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:31.819414 containerd[1523]: time="2025-07-06T23:22:31.819293322Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Jul 6 23:22:31.821042 containerd[1523]: time="2025-07-06T23:22:31.820853487Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:31.822998 containerd[1523]: time="2025-07-06T23:22:31.822956430Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:22:31.823730 containerd[1523]: time="2025-07-06T23:22:31.823683747Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 3.265388021s" Jul 6 23:22:31.823730 containerd[1523]: time="2025-07-06T23:22:31.823724871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Jul 6 23:22:31.827320 containerd[1523]: time="2025-07-06T23:22:31.827279047Z" level=info msg="CreateContainer within sandbox 
\"7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 6 23:22:31.838067 containerd[1523]: time="2025-07-06T23:22:31.835625330Z" level=info msg="Container d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:22:31.847074 containerd[1523]: time="2025-07-06T23:22:31.846993293Z" level=info msg="CreateContainer within sandbox \"7c00331cdc54d45e6677413001371ce2d4a4b7b400a4c484cff6b3cc33b79767\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\"" Jul 6 23:22:31.847616 containerd[1523]: time="2025-07-06T23:22:31.847579235Z" level=info msg="StartContainer for \"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\"" Jul 6 23:22:31.849223 containerd[1523]: time="2025-07-06T23:22:31.849187725Z" level=info msg="connecting to shim d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec" address="unix:///run/containerd/s/5de26ba25cb9bd4969a2ac53e5ab9ba0cc506f815b12d28c144733b282c3f84d" protocol=ttrpc version=3 Jul 6 23:22:31.883237 systemd[1]: Started cri-containerd-d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec.scope - libcontainer container d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec. Jul 6 23:22:31.932967 containerd[1523]: time="2025-07-06T23:22:31.932911025Z" level=info msg="StartContainer for \"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\" returns successfully" Jul 6 23:22:32.333676 containerd[1523]: time="2025-07-06T23:22:32.333609804Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\" id:\"901fffb8c11940d203ac2c5029fb23caf5bf6f3b732eec44d8ee0e41c6832bc3\" pid:5200 exit_status:1 exited_at:{seconds:1751844152 nanos:326996835}" Jul 6 23:22:32.748636 systemd[1]: Started sshd@9-10.0.0.40:22-10.0.0.1:54070.service - OpenSSH per-connection server daemon (10.0.0.1:54070). Jul 6 23:22:32.821962 sshd[5216]: Accepted publickey for core from 10.0.0.1 port 54070 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:32.824088 sshd-session[5216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:32.828599 systemd-logind[1505]: New session 10 of user core. Jul 6 23:22:32.839243 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 6 23:22:33.108943 sshd[5218]: Connection closed by 10.0.0.1 port 54070 Jul 6 23:22:33.109590 sshd-session[5216]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:33.118728 systemd[1]: sshd@9-10.0.0.40:22-10.0.0.1:54070.service: Deactivated successfully. Jul 6 23:22:33.121126 systemd[1]: session-10.scope: Deactivated successfully. Jul 6 23:22:33.121982 systemd-logind[1505]: Session 10 logged out. Waiting for processes to exit. Jul 6 23:22:33.127613 systemd[1]: Started sshd@10-10.0.0.40:22-10.0.0.1:54074.service - OpenSSH per-connection server daemon (10.0.0.1:54074). Jul 6 23:22:33.128720 systemd-logind[1505]: Removed session 10. Jul 6 23:22:33.196813 sshd[5232]: Accepted publickey for core from 10.0.0.1 port 54074 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:33.198388 sshd-session[5232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:33.204535 systemd-logind[1505]: New session 11 of user core. 
Jul 6 23:22:33.214245 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 6 23:22:33.288326 containerd[1523]: time="2025-07-06T23:22:33.288272279Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\" id:\"e0fcdc0e71b3351ecd0048c777c4f4711cc219e84d613390cfb58ee293d50809\" pid:5248 exit_status:1 exited_at:{seconds:1751844153 nanos:287928804}" Jul 6 23:22:33.445346 sshd[5234]: Connection closed by 10.0.0.1 port 54074 Jul 6 23:22:33.446191 sshd-session[5232]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:33.456612 systemd[1]: sshd@10-10.0.0.40:22-10.0.0.1:54074.service: Deactivated successfully. Jul 6 23:22:33.463615 systemd[1]: session-11.scope: Deactivated successfully. Jul 6 23:22:33.466464 systemd-logind[1505]: Session 11 logged out. Waiting for processes to exit. Jul 6 23:22:33.473263 systemd[1]: Started sshd@11-10.0.0.40:22-10.0.0.1:54086.service - OpenSSH per-connection server daemon (10.0.0.1:54086). Jul 6 23:22:33.475189 systemd-logind[1505]: Removed session 11. Jul 6 23:22:33.523811 sshd[5271]: Accepted publickey for core from 10.0.0.1 port 54086 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:33.525357 sshd-session[5271]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:33.531271 systemd-logind[1505]: New session 12 of user core. Jul 6 23:22:33.542224 systemd[1]: Started session-12.scope - Session 12 of User core. Jul 6 23:22:33.691702 sshd[5273]: Connection closed by 10.0.0.1 port 54086 Jul 6 23:22:33.692277 sshd-session[5271]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:33.695885 systemd[1]: sshd@11-10.0.0.40:22-10.0.0.1:54086.service: Deactivated successfully. Jul 6 23:22:33.698963 systemd[1]: session-12.scope: Deactivated successfully. Jul 6 23:22:33.699966 systemd-logind[1505]: Session 12 logged out. Waiting for processes to exit. Jul 6 23:22:33.701421 systemd-logind[1505]: Removed session 12. Jul 6 23:22:38.710359 systemd[1]: Started sshd@12-10.0.0.40:22-10.0.0.1:54098.service - OpenSSH per-connection server daemon (10.0.0.1:54098). Jul 6 23:22:38.772060 sshd[5298]: Accepted publickey for core from 10.0.0.1 port 54098 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:38.773431 sshd-session[5298]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:38.779670 systemd-logind[1505]: New session 13 of user core. Jul 6 23:22:38.785240 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 6 23:22:38.923851 sshd[5300]: Connection closed by 10.0.0.1 port 54098 Jul 6 23:22:38.924221 sshd-session[5298]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:38.928680 systemd[1]: sshd@12-10.0.0.40:22-10.0.0.1:54098.service: Deactivated successfully. Jul 6 23:22:38.932244 systemd[1]: session-13.scope: Deactivated successfully. Jul 6 23:22:38.933450 systemd-logind[1505]: Session 13 logged out. Waiting for processes to exit. Jul 6 23:22:38.936293 systemd-logind[1505]: Removed session 13. 
Jul 6 23:22:43.718696 containerd[1523]: time="2025-07-06T23:22:43.718646861Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\" id:\"35be691d710d3b928d8b9387a26494b799c2402a703448c87e8e8432a5cfc4af\" pid:5329 exited_at:{seconds:1751844163 nanos:718233624}" Jul 6 23:22:43.739266 kubelet[2638]: I0706 23:22:43.739190 2638 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-6smwc" podStartSLOduration=36.313295416 podStartE2EDuration="44.739170482s" podCreationTimestamp="2025-07-06 23:21:59 +0000 UTC" firstStartedPulling="2025-07-06 23:22:23.398794545 +0000 UTC m=+42.536512686" lastFinishedPulling="2025-07-06 23:22:31.824669611 +0000 UTC m=+50.962387752" observedRunningTime="2025-07-06 23:22:32.231515379 +0000 UTC m=+51.369233520" watchObservedRunningTime="2025-07-06 23:22:43.739170482 +0000 UTC m=+62.876888623" Jul 6 23:22:43.942664 systemd[1]: Started sshd@13-10.0.0.40:22-10.0.0.1:46246.service - OpenSSH per-connection server daemon (10.0.0.1:46246). Jul 6 23:22:44.003058 sshd[5342]: Accepted publickey for core from 10.0.0.1 port 46246 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:44.005609 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:44.014361 systemd-logind[1505]: New session 14 of user core. Jul 6 23:22:44.024287 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 6 23:22:44.038836 kubelet[2638]: I0706 23:22:44.038662 2638 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:22:44.201084 sshd[5344]: Connection closed by 10.0.0.1 port 46246 Jul 6 23:22:44.201477 sshd-session[5342]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:44.205694 systemd[1]: sshd@13-10.0.0.40:22-10.0.0.1:46246.service: Deactivated successfully. Jul 6 23:22:44.207859 systemd[1]: session-14.scope: Deactivated successfully. Jul 6 23:22:44.208759 systemd-logind[1505]: Session 14 logged out. Waiting for processes to exit. Jul 6 23:22:44.209986 systemd-logind[1505]: Removed session 14. Jul 6 23:22:45.349983 containerd[1523]: time="2025-07-06T23:22:45.349901719Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\" id:\"b1d74d48f69e63d5c4fe5dfee198ceb948604745c548a4374bf5ad5da93018d6\" pid:5373 exited_at:{seconds:1751844165 nanos:349542487}" Jul 6 23:22:48.420761 containerd[1523]: time="2025-07-06T23:22:48.420718915Z" level=info msg="TaskExit event in podsandbox handler container_id:\"506e9fba5267574d019144d9eab5fc289c7022d67927b544044695c3bf0aa050\" id:\"b5458ba44a98c3e5872ca12d45df6f05c948ef32e14a11feba8a93ec731e9cdd\" pid:5399 exited_at:{seconds:1751844168 nanos:420394786}" Jul 6 23:22:49.214198 systemd[1]: Started sshd@14-10.0.0.40:22-10.0.0.1:46258.service - OpenSSH per-connection server daemon (10.0.0.1:46258). Jul 6 23:22:49.290721 sshd[5410]: Accepted publickey for core from 10.0.0.1 port 46258 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:49.292503 sshd-session[5410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:49.298191 systemd-logind[1505]: New session 15 of user core. Jul 6 23:22:49.309234 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jul 6 23:22:49.505432 sshd[5412]: Connection closed by 10.0.0.1 port 46258 Jul 6 23:22:49.506397 sshd-session[5410]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:49.511343 systemd-logind[1505]: Session 15 logged out. Waiting for processes to exit. Jul 6 23:22:49.511733 systemd[1]: sshd@14-10.0.0.40:22-10.0.0.1:46258.service: Deactivated successfully. Jul 6 23:22:49.515188 systemd[1]: session-15.scope: Deactivated successfully. Jul 6 23:22:49.517498 systemd-logind[1505]: Removed session 15. Jul 6 23:22:54.521321 systemd[1]: Started sshd@15-10.0.0.40:22-10.0.0.1:41172.service - OpenSSH per-connection server daemon (10.0.0.1:41172). Jul 6 23:22:54.609841 sshd[5432]: Accepted publickey for core from 10.0.0.1 port 41172 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:54.612037 sshd-session[5432]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:54.616882 systemd-logind[1505]: New session 16 of user core. Jul 6 23:22:54.630273 systemd[1]: Started session-16.scope - Session 16 of User core. Jul 6 23:22:54.842916 sshd[5434]: Connection closed by 10.0.0.1 port 41172 Jul 6 23:22:54.841987 sshd-session[5432]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:54.852751 systemd[1]: sshd@15-10.0.0.40:22-10.0.0.1:41172.service: Deactivated successfully. Jul 6 23:22:54.855099 systemd[1]: session-16.scope: Deactivated successfully. Jul 6 23:22:54.856170 systemd-logind[1505]: Session 16 logged out. Waiting for processes to exit. Jul 6 23:22:54.859402 systemd[1]: Started sshd@16-10.0.0.40:22-10.0.0.1:41188.service - OpenSSH per-connection server daemon (10.0.0.1:41188). Jul 6 23:22:54.861423 systemd-logind[1505]: Removed session 16. Jul 6 23:22:54.922653 sshd[5448]: Accepted publickey for core from 10.0.0.1 port 41188 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:54.924886 sshd-session[5448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:54.933186 systemd-logind[1505]: New session 17 of user core. Jul 6 23:22:54.946327 systemd[1]: Started session-17.scope - Session 17 of User core. Jul 6 23:22:55.221053 sshd[5450]: Connection closed by 10.0.0.1 port 41188 Jul 6 23:22:55.222732 sshd-session[5448]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:55.239549 systemd[1]: sshd@16-10.0.0.40:22-10.0.0.1:41188.service: Deactivated successfully. Jul 6 23:22:55.242419 systemd[1]: session-17.scope: Deactivated successfully. Jul 6 23:22:55.244992 systemd-logind[1505]: Session 17 logged out. Waiting for processes to exit. Jul 6 23:22:55.248903 systemd-logind[1505]: Removed session 17. Jul 6 23:22:55.252509 systemd[1]: Started sshd@17-10.0.0.40:22-10.0.0.1:41200.service - OpenSSH per-connection server daemon (10.0.0.1:41200). Jul 6 23:22:55.320068 sshd[5461]: Accepted publickey for core from 10.0.0.1 port 41200 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:55.321158 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:55.325421 systemd-logind[1505]: New session 18 of user core. Jul 6 23:22:55.335251 systemd[1]: Started session-18.scope - Session 18 of User core. Jul 6 23:22:57.288400 sshd[5463]: Connection closed by 10.0.0.1 port 41200 Jul 6 23:22:57.290287 sshd-session[5461]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:57.300326 systemd[1]: sshd@17-10.0.0.40:22-10.0.0.1:41200.service: Deactivated successfully. 
Jul 6 23:22:57.303071 systemd[1]: session-18.scope: Deactivated successfully. Jul 6 23:22:57.303319 systemd[1]: session-18.scope: Consumed 590ms CPU time, 73.2M memory peak. Jul 6 23:22:57.304999 systemd-logind[1505]: Session 18 logged out. Waiting for processes to exit. Jul 6 23:22:57.311478 systemd[1]: Started sshd@18-10.0.0.40:22-10.0.0.1:41220.service - OpenSSH per-connection server daemon (10.0.0.1:41220). Jul 6 23:22:57.314741 systemd-logind[1505]: Removed session 18. Jul 6 23:22:57.377542 sshd[5480]: Accepted publickey for core from 10.0.0.1 port 41220 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:57.380936 sshd-session[5480]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:57.385713 systemd-logind[1505]: New session 19 of user core. Jul 6 23:22:57.394446 systemd[1]: Started session-19.scope - Session 19 of User core. Jul 6 23:22:57.709881 sshd[5485]: Connection closed by 10.0.0.1 port 41220 Jul 6 23:22:57.710827 sshd-session[5480]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:57.721418 systemd[1]: sshd@18-10.0.0.40:22-10.0.0.1:41220.service: Deactivated successfully. Jul 6 23:22:57.723568 systemd[1]: session-19.scope: Deactivated successfully. Jul 6 23:22:57.725988 systemd-logind[1505]: Session 19 logged out. Waiting for processes to exit. Jul 6 23:22:57.729537 systemd[1]: Started sshd@19-10.0.0.40:22-10.0.0.1:41226.service - OpenSSH per-connection server daemon (10.0.0.1:41226). Jul 6 23:22:57.730834 systemd-logind[1505]: Removed session 19. Jul 6 23:22:57.789817 sshd[5496]: Accepted publickey for core from 10.0.0.1 port 41226 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:22:57.791916 sshd-session[5496]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:22:57.799511 systemd-logind[1505]: New session 20 of user core. Jul 6 23:22:57.807238 systemd[1]: Started session-20.scope - Session 20 of User core. Jul 6 23:22:57.986005 sshd[5498]: Connection closed by 10.0.0.1 port 41226 Jul 6 23:22:57.986986 sshd-session[5496]: pam_unix(sshd:session): session closed for user core Jul 6 23:22:57.990644 systemd-logind[1505]: Session 20 logged out. Waiting for processes to exit. Jul 6 23:22:57.991335 systemd[1]: sshd@19-10.0.0.40:22-10.0.0.1:41226.service: Deactivated successfully. Jul 6 23:22:57.995231 systemd[1]: session-20.scope: Deactivated successfully. Jul 6 23:22:57.998400 systemd-logind[1505]: Removed session 20. Jul 6 23:23:01.485734 kubelet[2638]: I0706 23:23:01.485195 2638 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:23:03.006037 systemd[1]: Started sshd@20-10.0.0.40:22-10.0.0.1:59422.service - OpenSSH per-connection server daemon (10.0.0.1:59422). Jul 6 23:23:03.060334 sshd[5516]: Accepted publickey for core from 10.0.0.1 port 59422 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:23:03.061904 sshd-session[5516]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:23:03.066431 systemd-logind[1505]: New session 21 of user core. Jul 6 23:23:03.077258 systemd[1]: Started session-21.scope - Session 21 of User core. Jul 6 23:23:03.209305 sshd[5518]: Connection closed by 10.0.0.1 port 59422 Jul 6 23:23:03.210265 sshd-session[5516]: pam_unix(sshd:session): session closed for user core Jul 6 23:23:03.214584 systemd[1]: sshd@20-10.0.0.40:22-10.0.0.1:59422.service: Deactivated successfully. 
Jul 6 23:23:03.217963 systemd[1]: session-21.scope: Deactivated successfully. Jul 6 23:23:03.220319 systemd-logind[1505]: Session 21 logged out. Waiting for processes to exit. Jul 6 23:23:03.221852 systemd-logind[1505]: Removed session 21. Jul 6 23:23:08.225349 systemd[1]: Started sshd@21-10.0.0.40:22-10.0.0.1:59434.service - OpenSSH per-connection server daemon (10.0.0.1:59434). Jul 6 23:23:08.286821 sshd[5533]: Accepted publickey for core from 10.0.0.1 port 59434 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:23:08.288620 sshd-session[5533]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:23:08.294633 systemd-logind[1505]: New session 22 of user core. Jul 6 23:23:08.299222 systemd[1]: Started session-22.scope - Session 22 of User core. Jul 6 23:23:08.448731 sshd[5535]: Connection closed by 10.0.0.1 port 59434 Jul 6 23:23:08.449244 sshd-session[5533]: pam_unix(sshd:session): session closed for user core Jul 6 23:23:08.453686 systemd[1]: sshd@21-10.0.0.40:22-10.0.0.1:59434.service: Deactivated successfully. Jul 6 23:23:08.456759 systemd[1]: session-22.scope: Deactivated successfully. Jul 6 23:23:08.458184 systemd-logind[1505]: Session 22 logged out. Waiting for processes to exit. Jul 6 23:23:08.459725 systemd-logind[1505]: Removed session 22. Jul 6 23:23:11.162964 containerd[1523]: time="2025-07-06T23:23:11.162902472Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\" id:\"c3ee6a3b4dd50f068f9c8f04e8a82cacd6c80a1ea3456eb0ab0466b1878639b7\" pid:5560 exited_at:{seconds:1751844191 nanos:162383401}" Jul 6 23:23:13.472396 systemd[1]: Started sshd@22-10.0.0.40:22-10.0.0.1:57300.service - OpenSSH per-connection server daemon (10.0.0.1:57300). Jul 6 23:23:13.531604 sshd[5574]: Accepted publickey for core from 10.0.0.1 port 57300 ssh2: RSA SHA256:jyTvj9WiqpnTWeC15mq15pBzt3VkG8C4RFcxi7WEalo Jul 6 23:23:13.533357 sshd-session[5574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:23:13.539140 systemd-logind[1505]: New session 23 of user core. Jul 6 23:23:13.551275 systemd[1]: Started session-23.scope - Session 23 of User core. Jul 6 23:23:13.690056 sshd[5576]: Connection closed by 10.0.0.1 port 57300 Jul 6 23:23:13.690401 sshd-session[5574]: pam_unix(sshd:session): session closed for user core Jul 6 23:23:13.694990 systemd[1]: sshd@22-10.0.0.40:22-10.0.0.1:57300.service: Deactivated successfully. Jul 6 23:23:13.697593 systemd[1]: session-23.scope: Deactivated successfully. Jul 6 23:23:13.700459 systemd-logind[1505]: Session 23 logged out. Waiting for processes to exit. Jul 6 23:23:13.703226 systemd-logind[1505]: Removed session 23. Jul 6 23:23:13.719452 containerd[1523]: time="2025-07-06T23:23:13.719381961Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1eb33e6f7852b4b593b9f6302f239eb680e88b5c9ed06ad603c045af33ff2f6\" id:\"5c4a5e17468f30d4b1b6124f3e67820d48c935a74c7c5e75204441bdf5f2b485\" pid:5597 exited_at:{seconds:1751844193 nanos:719046925}" Jul 6 23:23:15.356441 containerd[1523]: time="2025-07-06T23:23:15.356286002Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d55daa4573ace6ab52cb44998edac4d338f52ac095b42c75d36785ae121648ec\" id:\"ca49507278d2c800defe746b66cedf389b6503a5e01e3c95b2d82caf812d587a\" pid:5626 exited_at:{seconds:1751844195 nanos:355897285}"