Mar 25 01:10:01.914332 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Mar 25 01:10:01.914352 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Mon Mar 24 23:39:14 -00 2025 Mar 25 01:10:01.914362 kernel: KASLR enabled Mar 25 01:10:01.914368 kernel: efi: EFI v2.7 by EDK II Mar 25 01:10:01.914374 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdbbae018 ACPI 2.0=0xd9b43018 RNG=0xd9b43a18 MEMRESERVE=0xd9b40218 Mar 25 01:10:01.914379 kernel: random: crng init done Mar 25 01:10:01.914386 kernel: secureboot: Secure boot disabled Mar 25 01:10:01.914392 kernel: ACPI: Early table checksum verification disabled Mar 25 01:10:01.914398 kernel: ACPI: RSDP 0x00000000D9B43018 000024 (v02 BOCHS ) Mar 25 01:10:01.914405 kernel: ACPI: XSDT 0x00000000D9B43F18 000064 (v01 BOCHS BXPC 00000001 01000013) Mar 25 01:10:01.914411 kernel: ACPI: FACP 0x00000000D9B43B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914417 kernel: ACPI: DSDT 0x00000000D9B41018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914422 kernel: ACPI: APIC 0x00000000D9B43C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914428 kernel: ACPI: PPTT 0x00000000D9B43098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914435 kernel: ACPI: GTDT 0x00000000D9B43818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914443 kernel: ACPI: MCFG 0x00000000D9B43A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914449 kernel: ACPI: SPCR 0x00000000D9B43918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914456 kernel: ACPI: DBG2 0x00000000D9B43998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914462 kernel: ACPI: IORT 0x00000000D9B43198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:10:01.914468 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Mar 25 01:10:01.914474 kernel: NUMA: Failed to initialise from firmware Mar 25 01:10:01.914482 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Mar 25 01:10:01.914492 kernel: NUMA: NODE_DATA [mem 0xdc959800-0xdc95efff] Mar 25 01:10:01.914501 kernel: Zone ranges: Mar 25 01:10:01.914507 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Mar 25 01:10:01.914515 kernel: DMA32 empty Mar 25 01:10:01.914521 kernel: Normal empty Mar 25 01:10:01.914527 kernel: Movable zone start for each node Mar 25 01:10:01.914534 kernel: Early memory node ranges Mar 25 01:10:01.914540 kernel: node 0: [mem 0x0000000040000000-0x00000000d967ffff] Mar 25 01:10:01.914546 kernel: node 0: [mem 0x00000000d9680000-0x00000000d968ffff] Mar 25 01:10:01.914552 kernel: node 0: [mem 0x00000000d9690000-0x00000000d976ffff] Mar 25 01:10:01.914559 kernel: node 0: [mem 0x00000000d9770000-0x00000000d9b3ffff] Mar 25 01:10:01.914565 kernel: node 0: [mem 0x00000000d9b40000-0x00000000dce1ffff] Mar 25 01:10:01.914571 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff] Mar 25 01:10:01.914577 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff] Mar 25 01:10:01.914583 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff] Mar 25 01:10:01.914590 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Mar 25 01:10:01.914596 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] Mar 25 01:10:01.914603 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Mar 25 01:10:01.914611 kernel: psci: 
probing for conduit method from ACPI. Mar 25 01:10:01.914640 kernel: psci: PSCIv1.1 detected in firmware. Mar 25 01:10:01.914647 kernel: psci: Using standard PSCI v0.2 function IDs Mar 25 01:10:01.914655 kernel: psci: Trusted OS migration not required Mar 25 01:10:01.914662 kernel: psci: SMC Calling Convention v1.1 Mar 25 01:10:01.914668 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Mar 25 01:10:01.914675 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 25 01:10:01.914681 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 25 01:10:01.914688 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Mar 25 01:10:01.914694 kernel: Detected PIPT I-cache on CPU0 Mar 25 01:10:01.914707 kernel: CPU features: detected: GIC system register CPU interface Mar 25 01:10:01.914714 kernel: CPU features: detected: Hardware dirty bit management Mar 25 01:10:01.914721 kernel: CPU features: detected: Spectre-v4 Mar 25 01:10:01.914729 kernel: CPU features: detected: Spectre-BHB Mar 25 01:10:01.914736 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 25 01:10:01.914742 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 25 01:10:01.914749 kernel: CPU features: detected: ARM erratum 1418040 Mar 25 01:10:01.914755 kernel: CPU features: detected: SSBS not fully self-synchronizing Mar 25 01:10:01.914762 kernel: alternatives: applying boot alternatives Mar 25 01:10:01.914769 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:10:01.914776 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 25 01:10:01.914783 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 25 01:10:01.914789 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 25 01:10:01.914796 kernel: Fallback order for Node 0: 0 Mar 25 01:10:01.914803 kernel: Built 1 zonelists, mobility grouping on. Total pages: 633024 Mar 25 01:10:01.914812 kernel: Policy zone: DMA Mar 25 01:10:01.914823 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 25 01:10:01.914831 kernel: software IO TLB: area num 4. Mar 25 01:10:01.914837 kernel: software IO TLB: mapped [mem 0x00000000d2e00000-0x00000000d6e00000] (64MB) Mar 25 01:10:01.914844 kernel: Memory: 2387416K/2572288K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38464K init, 897K bss, 184872K reserved, 0K cma-reserved) Mar 25 01:10:01.914851 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Mar 25 01:10:01.914858 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 25 01:10:01.914865 kernel: rcu: RCU event tracing is enabled. Mar 25 01:10:01.914872 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Mar 25 01:10:01.914878 kernel: Trampoline variant of Tasks RCU enabled. Mar 25 01:10:01.914885 kernel: Tracing variant of Tasks RCU enabled. Mar 25 01:10:01.914893 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
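
The kernel command line logged above is what tells the Flatcar initrd how to assemble the system: root=LABEL=ROOT selects the root filesystem, mount.usr/verity.usr*/verity.usrhash describe the dm-verity-protected /usr partition, and flatcar.first_boot=detected marks a first boot (Ignition runs later in this log). A minimal sketch of pulling such parameters apart from /proc/cmdline follows; parse_cmdline is a made-up helper and shlex only approximates the kernel's own quoting rules.

# Minimal sketch: split a kernel command line of the shape logged above
# into bare flags and key=value parameters. Not part of Flatcar itself.
import shlex

def parse_cmdline(raw: str) -> dict:
    params = {}
    for token in shlex.split(raw.strip()):
        key, sep, value = token.partition("=")
        params[key] = value if sep else True   # bare flags become True
    return params

with open("/proc/cmdline") as f:
    cmdline = parse_cmdline(f.read())

print(cmdline.get("root"))            # e.g. "LABEL=ROOT"
print(cmdline.get("verity.usrhash"))  # root hash dm-verity checks /usr against
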
Mar 25 01:10:01.914900 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Mar 25 01:10:01.914906 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 25 01:10:01.914913 kernel: GICv3: 256 SPIs implemented Mar 25 01:10:01.914919 kernel: GICv3: 0 Extended SPIs implemented Mar 25 01:10:01.914926 kernel: Root IRQ handler: gic_handle_irq Mar 25 01:10:01.914932 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Mar 25 01:10:01.914938 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Mar 25 01:10:01.914945 kernel: ITS [mem 0x08080000-0x0809ffff] Mar 25 01:10:01.914952 kernel: ITS@0x0000000008080000: allocated 8192 Devices @400c0000 (indirect, esz 8, psz 64K, shr 1) Mar 25 01:10:01.914958 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @400d0000 (flat, esz 8, psz 64K, shr 1) Mar 25 01:10:01.914966 kernel: GICv3: using LPI property table @0x00000000400f0000 Mar 25 01:10:01.914972 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040100000 Mar 25 01:10:01.914979 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 25 01:10:01.914986 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:10:01.914992 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Mar 25 01:10:01.914999 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Mar 25 01:10:01.915006 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Mar 25 01:10:01.915012 kernel: arm-pv: using stolen time PV Mar 25 01:10:01.915019 kernel: Console: colour dummy device 80x25 Mar 25 01:10:01.915026 kernel: ACPI: Core revision 20230628 Mar 25 01:10:01.915033 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Mar 25 01:10:01.915041 kernel: pid_max: default: 32768 minimum: 301 Mar 25 01:10:01.915047 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 25 01:10:01.915054 kernel: landlock: Up and running. Mar 25 01:10:01.915060 kernel: SELinux: Initializing. Mar 25 01:10:01.915067 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:10:01.915074 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:10:01.915081 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 25 01:10:01.915088 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 25 01:10:01.915095 kernel: rcu: Hierarchical SRCU implementation. Mar 25 01:10:01.915103 kernel: rcu: Max phase no-delay instances is 400. Mar 25 01:10:01.915109 kernel: Platform MSI: ITS@0x8080000 domain created Mar 25 01:10:01.915116 kernel: PCI/MSI: ITS@0x8080000 domain created Mar 25 01:10:01.915122 kernel: Remapping and enabling EFI services. Mar 25 01:10:01.915129 kernel: smp: Bringing up secondary CPUs ... 
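
A quick sanity check of the timer figures above: with the architected timer running at 25.00 MHz, one tick is 40 ns, which matches the sched_clock resolution printed, and the skipped delay-loop calibration derives its numbers from the same frequency. The relations below are my reading of how those lines are produced, not something stated in the log.

# Back-of-envelope check of the 25 MHz timer lines above. The BogoMIPS
# and lpj relations are assumptions about the kernel's calibration math.
timer_hz = 25_000_000                    # "cp15 timer(s) running at 25.00MHz"

resolution_ns = 1e9 / timer_hz           # 40.0 -> "resolution 40ns"
bogomips = timer_hz / 500_000            # 50.0 -> "50.00 BogoMIPS"
config_hz = timer_hz // 25_000           # lpj=25000 would imply CONFIG_HZ=1000

print(resolution_ns, bogomips, config_hz)
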
Mar 25 01:10:01.915136 kernel: Detected PIPT I-cache on CPU1 Mar 25 01:10:01.915143 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Mar 25 01:10:01.915149 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040110000 Mar 25 01:10:01.915156 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:10:01.915164 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Mar 25 01:10:01.915171 kernel: Detected PIPT I-cache on CPU2 Mar 25 01:10:01.915182 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Mar 25 01:10:01.915190 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040120000 Mar 25 01:10:01.915198 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:10:01.915205 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Mar 25 01:10:01.915212 kernel: Detected PIPT I-cache on CPU3 Mar 25 01:10:01.915218 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Mar 25 01:10:01.915226 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040130000 Mar 25 01:10:01.915234 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:10:01.915241 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Mar 25 01:10:01.915248 kernel: smp: Brought up 1 node, 4 CPUs Mar 25 01:10:01.915255 kernel: SMP: Total of 4 processors activated. Mar 25 01:10:01.915262 kernel: CPU features: detected: 32-bit EL0 Support Mar 25 01:10:01.915269 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Mar 25 01:10:01.915276 kernel: CPU features: detected: Common not Private translations Mar 25 01:10:01.915286 kernel: CPU features: detected: CRC32 instructions Mar 25 01:10:01.915294 kernel: CPU features: detected: Enhanced Virtualization Traps Mar 25 01:10:01.915301 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Mar 25 01:10:01.915308 kernel: CPU features: detected: LSE atomic instructions Mar 25 01:10:01.915315 kernel: CPU features: detected: Privileged Access Never Mar 25 01:10:01.915322 kernel: CPU features: detected: RAS Extension Support Mar 25 01:10:01.915329 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Mar 25 01:10:01.915337 kernel: CPU: All CPU(s) started at EL1 Mar 25 01:10:01.915344 kernel: alternatives: applying system-wide alternatives Mar 25 01:10:01.915350 kernel: devtmpfs: initialized Mar 25 01:10:01.915358 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 25 01:10:01.915367 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Mar 25 01:10:01.915374 kernel: pinctrl core: initialized pinctrl subsystem Mar 25 01:10:01.915380 kernel: SMBIOS 3.0.0 present. 
Mar 25 01:10:01.915387 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 Mar 25 01:10:01.915394 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 25 01:10:01.915401 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 25 01:10:01.915408 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 25 01:10:01.915416 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 25 01:10:01.915424 kernel: audit: initializing netlink subsys (disabled) Mar 25 01:10:01.915431 kernel: audit: type=2000 audit(0.018:1): state=initialized audit_enabled=0 res=1 Mar 25 01:10:01.915438 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 25 01:10:01.915445 kernel: cpuidle: using governor menu Mar 25 01:10:01.915452 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 25 01:10:01.915459 kernel: ASID allocator initialised with 32768 entries Mar 25 01:10:01.915466 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 25 01:10:01.915474 kernel: Serial: AMBA PL011 UART driver Mar 25 01:10:01.915481 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Mar 25 01:10:01.915489 kernel: Modules: 0 pages in range for non-PLT usage Mar 25 01:10:01.915496 kernel: Modules: 509248 pages in range for PLT usage Mar 25 01:10:01.915503 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 25 01:10:01.915510 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 25 01:10:01.915517 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 25 01:10:01.915524 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 25 01:10:01.915531 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 25 01:10:01.915538 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 25 01:10:01.915545 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Mar 25 01:10:01.915554 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 25 01:10:01.915561 kernel: ACPI: Added _OSI(Module Device) Mar 25 01:10:01.915568 kernel: ACPI: Added _OSI(Processor Device) Mar 25 01:10:01.915575 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 25 01:10:01.915581 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 25 01:10:01.915588 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 25 01:10:01.915597 kernel: ACPI: Interpreter enabled Mar 25 01:10:01.915606 kernel: ACPI: Using GIC for interrupt routing Mar 25 01:10:01.915619 kernel: ACPI: MCFG table detected, 1 entries Mar 25 01:10:01.915628 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Mar 25 01:10:01.915636 kernel: printk: console [ttyAMA0] enabled Mar 25 01:10:01.915644 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Mar 25 01:10:01.915786 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 25 01:10:01.915862 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Mar 25 01:10:01.915927 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 25 01:10:01.915990 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Mar 25 01:10:01.916055 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Mar 25 01:10:01.916067 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Mar 25 01:10:01.916074 
kernel: PCI host bridge to bus 0000:00 Mar 25 01:10:01.916146 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Mar 25 01:10:01.916218 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 25 01:10:01.916276 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Mar 25 01:10:01.916335 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Mar 25 01:10:01.916415 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 Mar 25 01:10:01.916494 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 Mar 25 01:10:01.916562 kernel: pci 0000:00:01.0: reg 0x10: [io 0x0000-0x001f] Mar 25 01:10:01.916642 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x10000000-0x10000fff] Mar 25 01:10:01.916717 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref] Mar 25 01:10:01.916787 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref] Mar 25 01:10:01.916852 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x10000000-0x10000fff] Mar 25 01:10:01.916921 kernel: pci 0000:00:01.0: BAR 0: assigned [io 0x1000-0x101f] Mar 25 01:10:01.916980 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Mar 25 01:10:01.917039 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 25 01:10:01.917097 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Mar 25 01:10:01.917106 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 25 01:10:01.917113 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 25 01:10:01.917120 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 25 01:10:01.917127 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 25 01:10:01.917136 kernel: iommu: Default domain type: Translated Mar 25 01:10:01.917143 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 25 01:10:01.917150 kernel: efivars: Registered efivars operations Mar 25 01:10:01.917157 kernel: vgaarb: loaded Mar 25 01:10:01.917163 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 25 01:10:01.917170 kernel: VFS: Disk quotas dquot_6.6.0 Mar 25 01:10:01.917177 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 25 01:10:01.917184 kernel: pnp: PnP ACPI init Mar 25 01:10:01.917258 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 25 01:10:01.917270 kernel: pnp: PnP ACPI: found 1 devices Mar 25 01:10:01.917277 kernel: NET: Registered PF_INET protocol family Mar 25 01:10:01.917284 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 25 01:10:01.917291 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 25 01:10:01.917298 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 25 01:10:01.917305 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 25 01:10:01.917312 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 25 01:10:01.917319 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 25 01:10:01.917327 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:10:01.917334 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:10:01.917341 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 25 01:10:01.917347 kernel: PCI: CLS 0 bytes, default 64 Mar 25 01:10:01.917354 kernel: kvm [1]: HYP mode not available 
Mar 25 01:10:01.917361 kernel: Initialise system trusted keyrings Mar 25 01:10:01.917368 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 25 01:10:01.917375 kernel: Key type asymmetric registered Mar 25 01:10:01.917381 kernel: Asymmetric key parser 'x509' registered Mar 25 01:10:01.917390 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 25 01:10:01.917397 kernel: io scheduler mq-deadline registered Mar 25 01:10:01.917403 kernel: io scheduler kyber registered Mar 25 01:10:01.917410 kernel: io scheduler bfq registered Mar 25 01:10:01.917417 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 25 01:10:01.917424 kernel: ACPI: button: Power Button [PWRB] Mar 25 01:10:01.917431 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 25 01:10:01.917496 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Mar 25 01:10:01.917505 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 01:10:01.917514 kernel: thunder_xcv, ver 1.0 Mar 25 01:10:01.917521 kernel: thunder_bgx, ver 1.0 Mar 25 01:10:01.917527 kernel: nicpf, ver 1.0 Mar 25 01:10:01.917534 kernel: nicvf, ver 1.0 Mar 25 01:10:01.917605 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 25 01:10:01.917692 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-25T01:10:01 UTC (1742865001) Mar 25 01:10:01.917709 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 25 01:10:01.917717 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Mar 25 01:10:01.917724 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 25 01:10:01.917735 kernel: watchdog: Hard watchdog permanently disabled Mar 25 01:10:01.917742 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:10:01.917748 kernel: Segment Routing with IPv6 Mar 25 01:10:01.917755 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:10:01.917762 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:10:01.917769 kernel: Key type dns_resolver registered Mar 25 01:10:01.917776 kernel: registered taskstats version 1 Mar 25 01:10:01.917783 kernel: Loading compiled-in X.509 certificates Mar 25 01:10:01.917790 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: ed4ababe871f0afac8b4236504477de11a6baf07' Mar 25 01:10:01.917798 kernel: Key type .fscrypt registered Mar 25 01:10:01.917805 kernel: Key type fscrypt-provisioning registered Mar 25 01:10:01.917812 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 25 01:10:01.917819 kernel: ima: Allocated hash algorithm: sha1 Mar 25 01:10:01.917825 kernel: ima: No architecture policies found Mar 25 01:10:01.917832 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 25 01:10:01.917839 kernel: clk: Disabling unused clocks Mar 25 01:10:01.917846 kernel: Freeing unused kernel memory: 38464K Mar 25 01:10:01.917854 kernel: Run /init as init process Mar 25 01:10:01.917861 kernel: with arguments: Mar 25 01:10:01.917867 kernel: /init Mar 25 01:10:01.917874 kernel: with environment: Mar 25 01:10:01.917880 kernel: HOME=/ Mar 25 01:10:01.917887 kernel: TERM=linux Mar 25 01:10:01.917894 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 01:10:01.917902 systemd[1]: Successfully made /usr/ read-only. 
Mar 25 01:10:01.917911 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:10:01.917921 systemd[1]: Detected virtualization kvm. Mar 25 01:10:01.917928 systemd[1]: Detected architecture arm64. Mar 25 01:10:01.917935 systemd[1]: Running in initrd. Mar 25 01:10:01.917942 systemd[1]: No hostname configured, using default hostname. Mar 25 01:10:01.917950 systemd[1]: Hostname set to . Mar 25 01:10:01.917957 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:10:01.917965 systemd[1]: Queued start job for default target initrd.target. Mar 25 01:10:01.917974 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:10:01.917981 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:10:01.917989 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 01:10:01.917996 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:10:01.918004 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 01:10:01.918012 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 01:10:01.918020 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 01:10:01.918030 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 01:10:01.918037 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:10:01.918045 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:10:01.918052 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:10:01.918059 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:10:01.918067 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:10:01.918074 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:10:01.918082 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:10:01.918089 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:10:01.918098 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 01:10:01.918106 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 01:10:01.918113 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:10:01.918120 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:10:01.918128 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:10:01.918135 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:10:01.918142 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 01:10:01.918150 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:10:01.918159 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 01:10:01.918166 systemd[1]: Starting systemd-fsck-usr.service... 
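
The oddly named device units above ("dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device" and friends) are escaped device paths: systemd turns "/" into "-" and hex-escapes characters that are not valid in unit names, so a literal "-" becomes "\x2d". The sketch below mirrors what "systemd-escape --path" does; leading-dot and root-path edge cases are ignored.

# Sketch of systemd's path-to-unit-name escaping, which explains the
# "\x2d" sequences in the device unit names above.
def escape_path(path: str) -> str:
    out = []
    for ch in path.strip("/"):
        if ch == "/":
            out.append("-")                      # path separators become "-"
        elif ch.isalnum() or ch in "_.":
            out.append(ch)                       # kept as-is
        else:
            out.append("\\x%02x" % ord(ch))      # e.g. "-" -> \x2d
    return "".join(out)

print(escape_path("/dev/disk/by-label/EFI-SYSTEM") + ".device")
# -> dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device
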
Mar 25 01:10:01.918173 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:10:01.918181 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:10:01.918188 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:10:01.918195 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:10:01.918203 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 01:10:01.918212 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 01:10:01.918220 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:10:01.918227 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:10:01.918251 systemd-journald[237]: Collecting audit messages is disabled. Mar 25 01:10:01.918271 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 01:10:01.918278 kernel: Bridge firewalling registered Mar 25 01:10:01.918286 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:10:01.918293 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:10:01.918302 systemd-journald[237]: Journal started Mar 25 01:10:01.918321 systemd-journald[237]: Runtime Journal (/run/log/journal/6f36baba03804dca84d67d8f31bca369) is 5.9M, max 47.3M, 41.4M free. Mar 25 01:10:01.895964 systemd-modules-load[238]: Inserted module 'overlay' Mar 25 01:10:01.913523 systemd-modules-load[238]: Inserted module 'br_netfilter' Mar 25 01:10:01.921117 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:10:01.920737 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:10:01.923375 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:10:01.925842 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:10:01.932316 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:10:01.938669 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:10:01.940456 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:10:01.941892 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:10:01.943341 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:10:01.946026 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 25 01:10:01.948728 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:10:01.965080 dracut-cmdline[279]: dracut-dracut-053 Mar 25 01:10:01.967432 dracut-cmdline[279]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:10:01.984116 systemd-resolved[280]: Positive Trust Anchors: Mar 25 01:10:01.984132 systemd-resolved[280]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:10:01.984163 systemd-resolved[280]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:10:01.988839 systemd-resolved[280]: Defaulting to hostname 'linux'. Mar 25 01:10:01.990715 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:10:01.991562 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:10:02.039648 kernel: SCSI subsystem initialized Mar 25 01:10:02.043631 kernel: Loading iSCSI transport class v2.0-870. Mar 25 01:10:02.050632 kernel: iscsi: registered transport (tcp) Mar 25 01:10:02.063639 kernel: iscsi: registered transport (qla4xxx) Mar 25 01:10:02.063652 kernel: QLogic iSCSI HBA Driver Mar 25 01:10:02.105812 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 25 01:10:02.107796 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 01:10:02.142634 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 01:10:02.142683 kernel: device-mapper: uevent: version 1.0.3 Mar 25 01:10:02.143825 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 01:10:02.188653 kernel: raid6: neonx8 gen() 15779 MB/s Mar 25 01:10:02.205639 kernel: raid6: neonx4 gen() 15804 MB/s Mar 25 01:10:02.222632 kernel: raid6: neonx2 gen() 13204 MB/s Mar 25 01:10:02.239632 kernel: raid6: neonx1 gen() 10507 MB/s Mar 25 01:10:02.256629 kernel: raid6: int64x8 gen() 6776 MB/s Mar 25 01:10:02.273627 kernel: raid6: int64x4 gen() 7344 MB/s Mar 25 01:10:02.290633 kernel: raid6: int64x2 gen() 6109 MB/s Mar 25 01:10:02.307632 kernel: raid6: int64x1 gen() 5061 MB/s Mar 25 01:10:02.307663 kernel: raid6: using algorithm neonx4 gen() 15804 MB/s Mar 25 01:10:02.324655 kernel: raid6: .... xor() 12380 MB/s, rmw enabled Mar 25 01:10:02.324713 kernel: raid6: using neon recovery algorithm Mar 25 01:10:02.329641 kernel: xor: measuring software checksum speed Mar 25 01:10:02.329673 kernel: 8regs : 19942 MB/sec Mar 25 01:10:02.330667 kernel: 32regs : 21699 MB/sec Mar 25 01:10:02.331648 kernel: arm64_neon : 25393 MB/sec Mar 25 01:10:02.331673 kernel: xor: using function: arm64_neon (25393 MB/sec) Mar 25 01:10:02.380906 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 01:10:02.392280 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:10:02.394513 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:10:02.425051 systemd-udevd[464]: Using default interface naming scheme 'v255'. Mar 25 01:10:02.428742 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:10:02.431179 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
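
The raid6/xor lines above are the kernel benchmarking its parity routines and picking the fastest implementation (neonx4 for RAID-6 syndrome generation, arm64_neon for plain XOR checksums). As an illustration of what the plain XOR routine computes, here is single-parity reconstruction in miniature; real RAID-6 additionally maintains a second, Galois-field syndrome, which this sketch does not attempt.

# Illustration only: XOR parity over equal-sized blocks, the operation the
# "xor:" benchmark above measures. Losing any one block is recoverable by
# XOR-ing the survivors with the parity block.
import os

def xor_blocks(blocks):
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, b in enumerate(block):
            parity[i] ^= b
    return bytes(parity)

data = [os.urandom(4096) for _ in range(3)]   # three data blocks
parity = xor_blocks(data)

recovered = xor_blocks([data[1], data[2], parity])
assert recovered == data[0]
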
Mar 25 01:10:02.451838 dracut-pre-trigger[467]: rd.md=0: removing MD RAID activation Mar 25 01:10:02.477024 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:10:02.478903 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:10:02.536268 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:10:02.538797 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 01:10:02.556154 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 25 01:10:02.557379 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:10:02.558697 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:10:02.560244 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:10:02.562347 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 25 01:10:02.581559 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:10:02.585008 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Mar 25 01:10:02.596913 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Mar 25 01:10:02.597021 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 25 01:10:02.597040 kernel: GPT:9289727 != 19775487 Mar 25 01:10:02.597049 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 25 01:10:02.597057 kernel: GPT:9289727 != 19775487 Mar 25 01:10:02.597065 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 25 01:10:02.597074 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:10:02.597634 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:10:02.597765 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:10:02.600710 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:10:02.602983 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:10:02.603125 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:10:02.605903 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:10:02.608488 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:10:02.618678 kernel: BTRFS: device fsid bf348154-9cb1-474d-801c-0e035a5758cf devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (520) Mar 25 01:10:02.621675 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (522) Mar 25 01:10:02.628652 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 25 01:10:02.629774 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:10:02.649033 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 25 01:10:02.654914 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 25 01:10:02.655833 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 25 01:10:02.664380 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:10:02.666185 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
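
The GPT warnings above mean the virtual disk was enlarged after the image was written: the backup GPT header recorded at LBA 9289727 should sit on the last sector of the now 19775488-sector disk (the disk-uuid.service run logged next rewrites the headers; tools such as GNU Parted, which the kernel suggests, or sgdisk can do the same). The numbers check out as follows.

# Arithmetic behind the "GPT:9289727 != 19775487" warning above.
sector = 512
total_sectors = 19_775_488            # "[vda] 19775488 512-byte logical blocks"
backup_header_lba = 9_289_727         # where the primary header points today

expected_lba = total_sectors - 1      # backup header belongs on the last LBA
gap_bytes = (expected_lba - backup_header_lba) * sector

print(expected_lba)                   # 19775487, matching the kernel message
print(gap_bytes / 2**30)              # 5.0 GiB between stale backup header and disk end
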
Mar 25 01:10:02.667736 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:10:02.684534 disk-uuid[552]: Primary Header is updated. Mar 25 01:10:02.684534 disk-uuid[552]: Secondary Entries is updated. Mar 25 01:10:02.684534 disk-uuid[552]: Secondary Header is updated. Mar 25 01:10:02.694669 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:10:02.699574 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:10:03.699496 disk-uuid[553]: The operation has completed successfully. Mar 25 01:10:03.700987 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:10:03.725723 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 25 01:10:03.725823 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 25 01:10:03.752466 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 25 01:10:03.766361 sh[573]: Success Mar 25 01:10:03.781667 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 25 01:10:03.812410 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 25 01:10:03.814767 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 25 01:10:03.834961 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 25 01:10:03.840276 kernel: BTRFS info (device dm-0): first mount of filesystem bf348154-9cb1-474d-801c-0e035a5758cf Mar 25 01:10:03.840303 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:10:03.840313 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 25 01:10:03.841669 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 25 01:10:03.841687 kernel: BTRFS info (device dm-0): using free space tree Mar 25 01:10:03.845961 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 25 01:10:03.846970 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 25 01:10:03.847587 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 25 01:10:03.850199 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 25 01:10:03.871334 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:10:03.871376 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:10:03.871393 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:10:03.874664 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:10:03.877651 kernel: BTRFS info (device vda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:10:03.880528 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 25 01:10:03.882377 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 25 01:10:03.945369 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:10:03.949876 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
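
verity-setup.service above is what turns the verity.usr* kernel parameters into /dev/mapper/usr: dm-verity checks each block of the /usr partition against a hash tree whose root must match verity.usrhash from the command line, using the accelerated "sha256-ce" implementation the kernel reports. The toy below only illustrates the root-hash idea; real dm-verity uses a salted, multi-level on-disk format produced by veritysetup, not this layout.

# Toy illustration of the dm-verity idea referenced above: hash every
# 4 KiB block, then hash the concatenated block hashes into a root.
# This is NOT the real dm-verity on-disk format (no salt, single level).
import hashlib

BLOCK = 4096

def toy_root_hash(image: bytes) -> str:
    leaves = [hashlib.sha256(image[i:i + BLOCK]).digest()
              for i in range(0, len(image), BLOCK)]
    return hashlib.sha256(b"".join(leaves)).hexdigest()

usr_image = b"\x00" * (8 * BLOCK)      # stand-in for the USR-A partition
print(toy_root_hash(usr_image))        # the analogue of verity.usrhash
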
Mar 25 01:10:03.983481 ignition[666]: Ignition 2.20.0 Mar 25 01:10:03.983491 ignition[666]: Stage: fetch-offline Mar 25 01:10:03.983519 ignition[666]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:10:03.983527 ignition[666]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:10:03.983701 ignition[666]: parsed url from cmdline: "" Mar 25 01:10:03.983710 ignition[666]: no config URL provided Mar 25 01:10:03.983715 ignition[666]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:10:03.983723 ignition[666]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:10:03.983745 ignition[666]: op(1): [started] loading QEMU firmware config module Mar 25 01:10:03.983749 ignition[666]: op(1): executing: "modprobe" "qemu_fw_cfg" Mar 25 01:10:03.991276 ignition[666]: op(1): [finished] loading QEMU firmware config module Mar 25 01:10:03.994827 systemd-networkd[756]: lo: Link UP Mar 25 01:10:03.994837 systemd-networkd[756]: lo: Gained carrier Mar 25 01:10:03.995629 systemd-networkd[756]: Enumeration completed Mar 25 01:10:03.995906 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:10:03.996046 systemd-networkd[756]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:10:03.996049 systemd-networkd[756]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:10:03.996845 systemd-networkd[756]: eth0: Link UP Mar 25 01:10:03.996848 systemd-networkd[756]: eth0: Gained carrier Mar 25 01:10:03.996854 systemd-networkd[756]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:10:03.998160 systemd[1]: Reached target network.target - Network. Mar 25 01:10:04.013665 systemd-networkd[756]: eth0: DHCPv4 address 10.0.0.25/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 25 01:10:04.037758 ignition[666]: parsing config with SHA512: 0614e0b4f713ed7721ef631b1ece83b0561c43a1c0d13497105c0da34d4ab7e3370a605b4979dededd3369d13a3a6a1d1f29ada6763f6401a9880d9ab11197f8 Mar 25 01:10:04.043808 unknown[666]: fetched base config from "system" Mar 25 01:10:04.043819 unknown[666]: fetched user config from "qemu" Mar 25 01:10:04.044204 ignition[666]: fetch-offline: fetch-offline passed Mar 25 01:10:04.044271 ignition[666]: Ignition finished successfully Mar 25 01:10:04.046187 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:10:04.047893 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 25 01:10:04.048641 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 25 01:10:04.071740 ignition[769]: Ignition 2.20.0 Mar 25 01:10:04.071751 ignition[769]: Stage: kargs Mar 25 01:10:04.071918 ignition[769]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:10:04.071937 ignition[769]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:10:04.072841 ignition[769]: kargs: kargs passed Mar 25 01:10:04.072886 ignition[769]: Ignition finished successfully Mar 25 01:10:04.074786 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 25 01:10:04.077105 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
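
In the fetch-offline stage above, Ignition found no config URL on the command line and none at /usr/lib/ignition/user.ign, so it loaded qemu_fw_cfg and fetched the user config from QEMU's firmware config device, logging the SHA512 of what it parsed. That digest can be reproduced from a copy of the same config for comparison; the filename below is a placeholder, not taken from the log.

# Reproduce the kind of digest Ignition logs ("parsing config with
# SHA512: ..."). "config.ign" stands in for a local copy of the config.
import hashlib

def config_digest(path: str) -> str:
    with open(path, "rb") as f:
        return hashlib.sha512(f.read()).hexdigest()

print(config_digest("config.ign"))
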
Mar 25 01:10:04.105477 ignition[777]: Ignition 2.20.0 Mar 25 01:10:04.105487 ignition[777]: Stage: disks Mar 25 01:10:04.105650 ignition[777]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:10:04.105659 ignition[777]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:10:04.108127 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 25 01:10:04.106458 ignition[777]: disks: disks passed Mar 25 01:10:04.109947 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 25 01:10:04.106499 ignition[777]: Ignition finished successfully Mar 25 01:10:04.111445 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 25 01:10:04.112963 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:10:04.114630 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:10:04.116020 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:10:04.118345 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 25 01:10:04.139198 systemd-fsck[788]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 25 01:10:04.142300 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 25 01:10:04.145731 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 25 01:10:04.197632 kernel: EXT4-fs (vda9): mounted filesystem a7a89271-ee7d-4bda-a834-705261d6cda9 r/w with ordered data mode. Quota mode: none. Mar 25 01:10:04.198197 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 25 01:10:04.199402 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 25 01:10:04.201502 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:10:04.203103 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 25 01:10:04.204102 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 25 01:10:04.204143 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 25 01:10:04.204165 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:10:04.210783 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 25 01:10:04.213231 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 25 01:10:04.217748 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (796) Mar 25 01:10:04.217767 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:10:04.217777 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:10:04.217786 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:10:04.219661 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:10:04.220174 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:10:04.260110 initrd-setup-root[822]: cut: /sysroot/etc/passwd: No such file or directory Mar 25 01:10:04.263356 initrd-setup-root[829]: cut: /sysroot/etc/group: No such file or directory Mar 25 01:10:04.267520 initrd-setup-root[836]: cut: /sysroot/etc/shadow: No such file or directory Mar 25 01:10:04.271159 initrd-setup-root[843]: cut: /sysroot/etc/gshadow: No such file or directory Mar 25 01:10:04.337721 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. 
Mar 25 01:10:04.340053 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 25 01:10:04.341604 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 25 01:10:04.358649 kernel: BTRFS info (device vda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:10:04.375800 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 25 01:10:04.385638 ignition[913]: INFO : Ignition 2.20.0 Mar 25 01:10:04.385638 ignition[913]: INFO : Stage: mount Mar 25 01:10:04.385638 ignition[913]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:10:04.385638 ignition[913]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:10:04.389338 ignition[913]: INFO : mount: mount passed Mar 25 01:10:04.389338 ignition[913]: INFO : Ignition finished successfully Mar 25 01:10:04.387892 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 25 01:10:04.390908 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 25 01:10:04.973754 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 25 01:10:04.975204 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:10:04.992978 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (926) Mar 25 01:10:04.993014 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:10:04.993025 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:10:04.993634 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:10:04.996642 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:10:04.997032 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:10:05.021777 ignition[944]: INFO : Ignition 2.20.0 Mar 25 01:10:05.021777 ignition[944]: INFO : Stage: files Mar 25 01:10:05.023003 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:10:05.023003 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:10:05.023003 ignition[944]: DEBUG : files: compiled without relabeling support, skipping Mar 25 01:10:05.025390 ignition[944]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 25 01:10:05.025390 ignition[944]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 25 01:10:05.028387 ignition[944]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 25 01:10:05.029424 ignition[944]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 25 01:10:05.029424 ignition[944]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 25 01:10:05.028910 unknown[944]: wrote ssh authorized keys file for user: core Mar 25 01:10:05.032177 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Mar 25 01:10:05.032177 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Mar 25 01:10:05.083974 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 25 01:10:05.663522 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Mar 25 01:10:05.663522 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file 
"/sysroot/home/core/install.sh" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:10:05.667290 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-arm64.raw: attempt #1 Mar 25 01:10:05.777857 systemd-networkd[756]: eth0: Gained IPv6LL Mar 25 01:10:06.079421 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 25 01:10:06.933486 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-arm64.raw" Mar 25 01:10:06.933486 ignition[944]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 25 01:10:06.936199 ignition[944]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:10:06.937699 ignition[944]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:10:06.937699 ignition[944]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 25 01:10:06.937699 ignition[944]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Mar 25 01:10:06.937699 ignition[944]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 25 01:10:06.937699 ignition[944]: INFO : files: op(d): op(e): [finished] writing unit 
"coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 25 01:10:06.937699 ignition[944]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Mar 25 01:10:06.937699 ignition[944]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Mar 25 01:10:06.953806 ignition[944]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Mar 25 01:10:06.956965 ignition[944]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Mar 25 01:10:06.959533 ignition[944]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Mar 25 01:10:06.959533 ignition[944]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Mar 25 01:10:06.959533 ignition[944]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Mar 25 01:10:06.959533 ignition[944]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:10:06.959533 ignition[944]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:10:06.959533 ignition[944]: INFO : files: files passed Mar 25 01:10:06.959533 ignition[944]: INFO : Ignition finished successfully Mar 25 01:10:06.959868 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 25 01:10:06.962545 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 25 01:10:06.966461 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 25 01:10:06.976865 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 25 01:10:06.976943 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 25 01:10:06.980108 initrd-setup-root-after-ignition[972]: grep: /sysroot/oem/oem-release: No such file or directory Mar 25 01:10:06.981436 initrd-setup-root-after-ignition[975]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:10:06.981436 initrd-setup-root-after-ignition[975]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:10:06.984975 initrd-setup-root-after-ignition[979]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:10:06.982682 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:10:06.983820 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 25 01:10:06.986281 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 25 01:10:07.005469 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 25 01:10:07.005560 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 25 01:10:07.007502 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 25 01:10:07.009081 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 01:10:07.010572 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 01:10:07.011223 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 01:10:07.027231 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
Mar 25 01:10:07.029608 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:10:07.053227 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:10:07.054524 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:10:07.056675 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:10:07.058480 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:10:07.058594 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:10:07.061118 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:10:07.063256 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:10:07.065005 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:10:07.066775 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:10:07.068891 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:10:07.070944 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:10:07.072873 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:10:07.074900 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:10:07.076947 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:10:07.078738 systemd[1]: Stopped target swap.target - Swaps. Mar 25 01:10:07.080405 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:10:07.080517 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:10:07.082960 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:10:07.085008 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:10:07.087021 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:10:07.090698 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:10:07.092119 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:10:07.092230 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:10:07.095241 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:10:07.095357 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:10:07.097528 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:10:07.099207 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:10:07.102665 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:10:07.104017 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:10:07.106243 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:10:07.107903 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:10:07.107981 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:10:07.109677 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:10:07.109768 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:10:07.111298 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:10:07.111405 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. 
Mar 25 01:10:07.113140 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:10:07.113237 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:10:07.115420 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 01:10:07.116317 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:10:07.116451 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:10:07.130462 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:10:07.131385 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:10:07.131539 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:10:07.133374 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:10:07.133509 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:10:07.144209 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:10:07.144531 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:10:07.153660 ignition[999]: INFO : Ignition 2.20.0 Mar 25 01:10:07.153660 ignition[999]: INFO : Stage: umount Mar 25 01:10:07.153660 ignition[999]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:10:07.153660 ignition[999]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:10:07.153660 ignition[999]: INFO : umount: umount passed Mar 25 01:10:07.153660 ignition[999]: INFO : Ignition finished successfully Mar 25 01:10:07.149910 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:10:07.150014 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:10:07.151798 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:10:07.152897 systemd[1]: Stopped target network.target - Network. Mar 25 01:10:07.154561 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:10:07.154641 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:10:07.159096 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:10:07.159151 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:10:07.161223 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:10:07.161266 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:10:07.163093 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:10:07.163135 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:10:07.165903 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:10:07.167462 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:10:07.169334 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:10:07.169417 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:10:07.171230 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:10:07.171311 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:10:07.177094 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:10:07.177195 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:10:07.181416 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:10:07.181592 systemd[1]: systemd-networkd.service: Deactivated successfully. 
Mar 25 01:10:07.181702 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:10:07.187562 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:10:07.188470 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:10:07.188513 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:10:07.190744 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:10:07.191779 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:10:07.191837 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:10:07.193732 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:10:07.193779 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:10:07.196286 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:10:07.196329 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:10:07.198290 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:10:07.198336 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:10:07.201186 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:10:07.205019 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:10:07.205075 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:10:07.222300 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:10:07.222431 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:10:07.224555 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:10:07.224655 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:10:07.226443 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:10:07.226504 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:10:07.228065 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:10:07.228096 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:10:07.229777 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:10:07.229826 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:10:07.232450 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:10:07.232497 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:10:07.235180 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:10:07.235226 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:10:07.237977 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:10:07.239140 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 01:10:07.239196 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:10:07.241935 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:10:07.241977 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:10:07.245479 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. 
Mar 25 01:10:07.245530 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:10:07.252095 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:10:07.252213 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:10:07.254123 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:10:07.256691 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:10:07.273893 systemd[1]: Switching root. Mar 25 01:10:07.304404 systemd-journald[237]: Journal stopped Mar 25 01:10:08.060304 systemd-journald[237]: Received SIGTERM from PID 1 (systemd). Mar 25 01:10:08.060361 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:10:08.060373 kernel: SELinux: policy capability open_perms=1 Mar 25 01:10:08.060382 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:10:08.060391 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:10:08.060400 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:10:08.060409 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:10:08.060418 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:10:08.060427 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:10:08.060441 kernel: audit: type=1403 audit(1742865007.442:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:10:08.060453 systemd[1]: Successfully loaded SELinux policy in 30.986ms. Mar 25 01:10:08.060472 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.539ms. Mar 25 01:10:08.060485 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:10:08.060496 systemd[1]: Detected virtualization kvm. Mar 25 01:10:08.060506 systemd[1]: Detected architecture arm64. Mar 25 01:10:08.060516 systemd[1]: Detected first boot. Mar 25 01:10:08.060526 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:10:08.060537 zram_generator::config[1048]: No configuration found. Mar 25 01:10:08.060550 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:10:08.060559 systemd[1]: Populated /etc with preset unit settings. Mar 25 01:10:08.060570 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:10:08.060580 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:10:08.060590 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:10:08.060600 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:10:08.060611 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:10:08.060635 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:10:08.060645 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:10:08.060658 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:10:08.060668 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:10:08.060678 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Mar 25 01:10:08.060689 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:10:08.060699 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 01:10:08.060709 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:10:08.060726 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:10:08.060739 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:10:08.060749 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:10:08.060762 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:10:08.060772 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:10:08.060782 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 25 01:10:08.060794 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:10:08.060804 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:10:08.060814 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:10:08.060824 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:10:08.060836 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:10:08.060846 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:10:08.060857 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:10:08.060867 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:10:08.060877 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:10:08.060887 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:10:08.060896 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:10:08.060906 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:10:08.060917 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:10:08.060927 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:10:08.060938 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:10:08.060949 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:10:08.060959 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:10:08.060970 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:10:08.060979 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:10:08.060989 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:10:08.060999 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:10:08.061011 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:10:08.061023 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:10:08.061033 systemd[1]: Reached target machines.target - Containers. Mar 25 01:10:08.061048 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Mar 25 01:10:08.061058 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:10:08.061068 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:10:08.061079 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:10:08.061088 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:10:08.061098 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:10:08.061109 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:10:08.061120 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 01:10:08.061130 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:10:08.061141 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:10:08.061151 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:10:08.061161 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:10:08.061171 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 01:10:08.061181 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:10:08.061191 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:10:08.061203 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:10:08.061213 kernel: loop: module loaded Mar 25 01:10:08.061222 kernel: fuse: init (API version 7.39) Mar 25 01:10:08.061233 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:10:08.061244 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:10:08.061254 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:10:08.061264 kernel: ACPI: bus type drm_connector registered Mar 25 01:10:08.061274 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:10:08.061284 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:10:08.061296 systemd[1]: verity-setup.service: Deactivated successfully. Mar 25 01:10:08.061306 systemd[1]: Stopped verity-setup.service. Mar 25 01:10:08.061316 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:10:08.061326 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:10:08.061335 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:10:08.061348 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 01:10:08.061358 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:10:08.061368 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:10:08.061378 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:10:08.061404 systemd-journald[1120]: Collecting audit messages is disabled. Mar 25 01:10:08.061426 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. 
Mar 25 01:10:08.061436 systemd-journald[1120]: Journal started Mar 25 01:10:08.061458 systemd-journald[1120]: Runtime Journal (/run/log/journal/6f36baba03804dca84d67d8f31bca369) is 5.9M, max 47.3M, 41.4M free. Mar 25 01:10:07.832291 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:10:07.845572 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 25 01:10:07.845951 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:10:08.064552 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:10:08.065249 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 01:10:08.065510 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:10:08.066993 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:10:08.067265 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:10:08.070758 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:10:08.071007 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:10:08.072320 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:10:08.072459 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:10:08.073988 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:10:08.074127 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:10:08.075411 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:10:08.075544 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:10:08.077100 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:10:08.078689 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:10:08.080200 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:10:08.081809 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:10:08.096063 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:10:08.098484 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 01:10:08.100601 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 01:10:08.101707 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:10:08.101747 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:10:08.103565 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 01:10:08.113414 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:10:08.115502 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 25 01:10:08.116611 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:10:08.117604 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:10:08.119456 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:10:08.120679 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Mar 25 01:10:08.123756 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 01:10:08.124807 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:10:08.129421 systemd-journald[1120]: Time spent on flushing to /var/log/journal/6f36baba03804dca84d67d8f31bca369 is 40.110ms for 865 entries. Mar 25 01:10:08.129421 systemd-journald[1120]: System Journal (/var/log/journal/6f36baba03804dca84d67d8f31bca369) is 8M, max 195.6M, 187.6M free. Mar 25 01:10:08.187865 systemd-journald[1120]: Received client request to flush runtime journal. Mar 25 01:10:08.187921 kernel: loop0: detected capacity change from 0 to 103832 Mar 25 01:10:08.187938 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:10:08.128357 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:10:08.132281 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:10:08.137752 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 01:10:08.142684 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:10:08.143891 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:10:08.145662 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:10:08.147042 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:10:08.149110 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 01:10:08.155982 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:10:08.159805 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:10:08.162699 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:10:08.164016 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:10:08.180978 udevadm[1177]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 25 01:10:08.186430 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:10:08.189503 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:10:08.190929 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:10:08.195070 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:10:08.215637 kernel: loop1: detected capacity change from 0 to 201592 Mar 25 01:10:08.236758 systemd-tmpfiles[1180]: ACLs are not supported, ignoring. Mar 25 01:10:08.236775 systemd-tmpfiles[1180]: ACLs are not supported, ignoring. Mar 25 01:10:08.243333 kernel: loop2: detected capacity change from 0 to 126448 Mar 25 01:10:08.242096 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:10:08.279643 kernel: loop3: detected capacity change from 0 to 103832 Mar 25 01:10:08.284627 kernel: loop4: detected capacity change from 0 to 201592 Mar 25 01:10:08.290634 kernel: loop5: detected capacity change from 0 to 126448 Mar 25 01:10:08.295869 (sd-merge)[1189]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Mar 25 01:10:08.296273 (sd-merge)[1189]: Merged extensions into '/usr'. 
Mar 25 01:10:08.299311 systemd[1]: Reload requested from client PID 1165 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:10:08.299445 systemd[1]: Reloading... Mar 25 01:10:08.348642 zram_generator::config[1213]: No configuration found. Mar 25 01:10:08.388484 ldconfig[1160]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 01:10:08.469075 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:10:08.518321 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:10:08.518862 systemd[1]: Reloading finished in 218 ms. Mar 25 01:10:08.537658 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 01:10:08.539173 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:10:08.558051 systemd[1]: Starting ensure-sysext.service... Mar 25 01:10:08.562238 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:10:08.575968 systemd[1]: Reload requested from client PID 1251 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:10:08.575985 systemd[1]: Reloading... Mar 25 01:10:08.579595 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:10:08.579874 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:10:08.580490 systemd-tmpfiles[1252]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:10:08.580755 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Mar 25 01:10:08.580812 systemd-tmpfiles[1252]: ACLs are not supported, ignoring. Mar 25 01:10:08.583249 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:10:08.583263 systemd-tmpfiles[1252]: Skipping /boot Mar 25 01:10:08.592159 systemd-tmpfiles[1252]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:10:08.592176 systemd-tmpfiles[1252]: Skipping /boot Mar 25 01:10:08.628646 zram_generator::config[1287]: No configuration found. Mar 25 01:10:08.700873 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:10:08.749754 systemd[1]: Reloading finished in 173 ms. Mar 25 01:10:08.760146 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:10:08.766672 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:10:08.782290 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:10:08.784228 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 25 01:10:08.786422 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 25 01:10:08.795409 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:10:08.807814 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:10:08.812584 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
Mar 25 01:10:08.823134 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 25 01:10:08.824628 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 25 01:10:08.826163 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 25 01:10:08.841866 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:10:08.843637 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:10:08.849223 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:10:08.854336 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:10:08.856501 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:10:08.856699 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:10:08.856727 systemd-udevd[1322]: Using default interface naming scheme 'v255'. Mar 25 01:10:08.857974 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 25 01:10:08.858764 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:10:08.861836 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:10:08.862016 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:10:08.872242 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:10:08.872418 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:10:08.874177 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:10:08.874321 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:10:08.877856 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 25 01:10:08.880765 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 25 01:10:08.885309 augenrules[1352]: No rules Mar 25 01:10:08.886804 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:10:08.887001 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:10:08.890141 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 25 01:10:08.891296 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:10:08.898268 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:10:08.899232 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:10:08.902659 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:10:08.906883 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:10:08.908927 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:10:08.912088 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Mar 25 01:10:08.913024 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:10:08.913070 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:10:08.914835 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:10:08.917978 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:10:08.919967 systemd[1]: Finished ensure-sysext.service. Mar 25 01:10:08.920875 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:10:08.921041 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:10:08.922224 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:10:08.922362 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:10:08.923759 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:10:08.923897 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:10:08.925052 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:10:08.925205 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:10:08.927198 augenrules[1375]: /sbin/augenrules: No change Mar 25 01:10:08.937091 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:10:08.937165 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:10:08.939242 augenrules[1411]: No rules Mar 25 01:10:08.939334 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 25 01:10:08.941034 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:10:08.941708 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:10:08.975752 systemd-resolved[1320]: Positive Trust Anchors: Mar 25 01:10:08.977505 systemd-resolved[1320]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:10:08.977540 systemd-resolved[1320]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:10:08.984590 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 25 01:10:08.986239 systemd-resolved[1320]: Defaulting to hostname 'linux'. Mar 25 01:10:08.991830 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:10:08.992805 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
Mar 25 01:10:09.013406 systemd-networkd[1395]: lo: Link UP Mar 25 01:10:09.013414 systemd-networkd[1395]: lo: Gained carrier Mar 25 01:10:09.016630 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1365) Mar 25 01:10:09.017026 systemd-networkd[1395]: Enumeration completed Mar 25 01:10:09.017117 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:10:09.017730 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:10:09.017739 systemd-networkd[1395]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:10:09.018140 systemd[1]: Reached target network.target - Network. Mar 25 01:10:09.018216 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:10:09.018238 systemd-networkd[1395]: eth0: Link UP Mar 25 01:10:09.018240 systemd-networkd[1395]: eth0: Gained carrier Mar 25 01:10:09.018247 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:10:09.022038 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 25 01:10:09.027778 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 25 01:10:09.031672 systemd-networkd[1395]: eth0: DHCPv4 address 10.0.0.25/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 25 01:10:09.049605 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 25 01:10:08.595089 systemd-timesyncd[1416]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 25 01:10:08.604621 systemd-journald[1120]: Time jumped backwards, rotating. Mar 25 01:10:08.595140 systemd-timesyncd[1416]: Initial clock synchronization to Tue 2025-03-25 01:10:08.595009 UTC. Mar 25 01:10:08.595615 systemd-resolved[1320]: Clock change detected. Flushing caches. Mar 25 01:10:08.599125 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:10:08.600902 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 25 01:10:08.602937 systemd[1]: Reached target time-set.target - System Time Set. Mar 25 01:10:08.604988 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 25 01:10:08.629304 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 25 01:10:08.639354 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:10:08.653847 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 25 01:10:08.656241 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 25 01:10:08.689525 lvm[1439]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:10:08.731851 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:10:08.747472 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 25 01:10:08.748859 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
Mar 25 01:10:08.749742 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:10:08.750792 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 25 01:10:08.751823 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 25 01:10:08.753052 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 25 01:10:08.754016 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 25 01:10:08.754999 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 25 01:10:08.755941 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 25 01:10:08.755982 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:10:08.756827 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:10:08.758469 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 25 01:10:08.761123 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 25 01:10:08.764715 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 25 01:10:08.765832 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 25 01:10:08.766816 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 25 01:10:08.771280 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 25 01:10:08.772725 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 25 01:10:08.774679 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 25 01:10:08.775982 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 25 01:10:08.776896 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:10:08.777638 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:10:08.778344 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:10:08.778376 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:10:08.779241 systemd[1]: Starting containerd.service - containerd container runtime... Mar 25 01:10:08.780947 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 25 01:10:08.783573 lvm[1448]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:10:08.784553 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 25 01:10:08.788419 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 25 01:10:08.789367 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 25 01:10:08.792939 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 25 01:10:08.795099 jq[1451]: false Mar 25 01:10:08.795122 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 25 01:10:08.796952 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 25 01:10:08.799286 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Mar 25 01:10:08.802139 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 25 01:10:08.806243 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 25 01:10:08.806695 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 25 01:10:08.810408 systemd[1]: Starting update-engine.service - Update Engine... Mar 25 01:10:08.812650 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 25 01:10:08.814416 extend-filesystems[1452]: Found loop3 Mar 25 01:10:08.814416 extend-filesystems[1452]: Found loop4 Mar 25 01:10:08.814416 extend-filesystems[1452]: Found loop5 Mar 25 01:10:08.814416 extend-filesystems[1452]: Found vda Mar 25 01:10:08.814416 extend-filesystems[1452]: Found vda1 Mar 25 01:10:08.814416 extend-filesystems[1452]: Found vda2 Mar 25 01:10:08.814416 extend-filesystems[1452]: Found vda3 Mar 25 01:10:08.814416 extend-filesystems[1452]: Found usr Mar 25 01:10:08.814416 extend-filesystems[1452]: Found vda4 Mar 25 01:10:08.824795 extend-filesystems[1452]: Found vda6 Mar 25 01:10:08.824795 extend-filesystems[1452]: Found vda7 Mar 25 01:10:08.824795 extend-filesystems[1452]: Found vda9 Mar 25 01:10:08.824795 extend-filesystems[1452]: Checking size of /dev/vda9 Mar 25 01:10:08.819093 dbus-daemon[1450]: [system] SELinux support is enabled Mar 25 01:10:08.815507 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 25 01:10:08.828634 jq[1462]: true Mar 25 01:10:08.820027 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 25 01:10:08.823810 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 25 01:10:08.824002 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 25 01:10:08.825158 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 25 01:10:08.825339 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 25 01:10:08.835404 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 25 01:10:08.835967 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 25 01:10:08.837520 extend-filesystems[1452]: Resized partition /dev/vda9 Mar 25 01:10:08.837902 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:10:08.837925 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Mar 25 01:10:08.843410 (ntainerd)[1474]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:10:08.846402 tar[1472]: linux-arm64/LICENSE Mar 25 01:10:08.846402 tar[1472]: linux-arm64/helm Mar 25 01:10:08.857456 extend-filesystems[1484]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:10:08.859034 jq[1473]: true Mar 25 01:10:08.861683 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 25 01:10:08.861724 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1368) Mar 25 01:10:08.863107 systemd[1]: motdgen.service: Deactivated successfully. Mar 25 01:10:08.863323 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 25 01:10:08.890454 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 25 01:10:08.898653 update_engine[1461]: I20250325 01:10:08.898404 1461 main.cc:92] Flatcar Update Engine starting Mar 25 01:10:08.902390 systemd[1]: Started update-engine.service - Update Engine. Mar 25 01:10:08.905882 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:10:08.911650 update_engine[1461]: I20250325 01:10:08.905045 1461 update_check_scheduler.cc:74] Next update check in 2m9s Mar 25 01:10:08.919147 extend-filesystems[1484]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 25 01:10:08.919147 extend-filesystems[1484]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 25 01:10:08.919147 extend-filesystems[1484]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 25 01:10:08.933485 extend-filesystems[1452]: Resized filesystem in /dev/vda9 Mar 25 01:10:08.920821 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:10:08.921046 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 25 01:10:08.925236 systemd-logind[1458]: Watching system buttons on /dev/input/event0 (Power Button) Mar 25 01:10:08.928954 systemd-logind[1458]: New seat seat0. Mar 25 01:10:08.937739 systemd[1]: Started systemd-logind.service - User Login Management. Mar 25 01:10:08.954496 bash[1504]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:10:08.955980 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:10:08.960742 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Mar 25 01:10:09.020643 locksmithd[1501]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:10:09.090411 containerd[1474]: time="2025-03-25T01:10:09Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:10:09.091081 containerd[1474]: time="2025-03-25T01:10:09.091043026Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:10:09.104262 containerd[1474]: time="2025-03-25T01:10:09.104187106Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.16µs" Mar 25 01:10:09.104262 containerd[1474]: time="2025-03-25T01:10:09.104228786Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:10:09.104262 containerd[1474]: time="2025-03-25T01:10:09.104248226Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:10:09.104547 containerd[1474]: time="2025-03-25T01:10:09.104513346Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:10:09.104574 containerd[1474]: time="2025-03-25T01:10:09.104559026Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:10:09.104656 containerd[1474]: time="2025-03-25T01:10:09.104635106Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:10:09.104726 containerd[1474]: time="2025-03-25T01:10:09.104706106Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:10:09.104757 containerd[1474]: time="2025-03-25T01:10:09.104724186Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:10:09.105182 containerd[1474]: time="2025-03-25T01:10:09.105154226Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:10:09.105182 containerd[1474]: time="2025-03-25T01:10:09.105179786Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:10:09.105231 containerd[1474]: time="2025-03-25T01:10:09.105191346Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:10:09.105231 containerd[1474]: time="2025-03-25T01:10:09.105199226Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:10:09.107432 containerd[1474]: time="2025-03-25T01:10:09.105345186Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:10:09.107432 containerd[1474]: time="2025-03-25T01:10:09.105656826Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:10:09.107432 containerd[1474]: time="2025-03-25T01:10:09.105745826Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:10:09.107432 containerd[1474]: time="2025-03-25T01:10:09.105761586Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:10:09.107432 containerd[1474]: time="2025-03-25T01:10:09.107036106Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:10:09.108286 containerd[1474]: time="2025-03-25T01:10:09.108230466Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:10:09.108387 containerd[1474]: time="2025-03-25T01:10:09.108365546Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:10:09.113037 containerd[1474]: time="2025-03-25T01:10:09.112999866Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:10:09.113090 containerd[1474]: time="2025-03-25T01:10:09.113060106Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:10:09.113090 containerd[1474]: time="2025-03-25T01:10:09.113077226Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:10:09.113090 containerd[1474]: time="2025-03-25T01:10:09.113090786Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113102066Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113112386Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113124066Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113135626Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113146226Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113163506Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113173306Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:10:09.113186 containerd[1474]: time="2025-03-25T01:10:09.113185586Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:10:09.113326 containerd[1474]: time="2025-03-25T01:10:09.113309266Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:10:09.113326 containerd[1474]: time="2025-03-25T01:10:09.113331026Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:10:09.113371 containerd[1474]: time="2025-03-25T01:10:09.113343306Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 
01:10:09.113371 containerd[1474]: time="2025-03-25T01:10:09.113353946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 25 01:10:09.113371 containerd[1474]: time="2025-03-25T01:10:09.113364586Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:10:09.113495 containerd[1474]: time="2025-03-25T01:10:09.113374226Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:10:09.113495 containerd[1474]: time="2025-03-25T01:10:09.113386106Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:10:09.113495 containerd[1474]: time="2025-03-25T01:10:09.113397906Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:10:09.113495 containerd[1474]: time="2025-03-25T01:10:09.113408946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 25 01:10:09.113495 containerd[1474]: time="2025-03-25T01:10:09.113421506Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:10:09.113495 containerd[1474]: time="2025-03-25T01:10:09.113456346Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:10:09.113878 containerd[1474]: time="2025-03-25T01:10:09.113724426Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:10:09.113878 containerd[1474]: time="2025-03-25T01:10:09.113845506Z" level=info msg="Start snapshots syncer" Mar 25 01:10:09.113878 containerd[1474]: time="2025-03-25T01:10:09.113874066Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:10:09.114281 containerd[1474]: time="2025-03-25T01:10:09.114095306Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:10:09.114281 containerd[1474]: time="2025-03-25T01:10:09.114150226Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:10:09.114390 containerd[1474]: time="2025-03-25T01:10:09.114220186Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:10:09.114390 containerd[1474]: time="2025-03-25T01:10:09.114337986Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:10:09.114390 containerd[1474]: time="2025-03-25T01:10:09.114362666Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:10:09.114390 containerd[1474]: time="2025-03-25T01:10:09.114374186Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:10:09.114390 containerd[1474]: time="2025-03-25T01:10:09.114385306Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114398866Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114410506Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114425106Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114474906Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:10:09.114723 containerd[1474]: 
time="2025-03-25T01:10:09.114488306Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114503866Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114544666Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114558506Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114567426Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114577346Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114586666Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114596426Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114607066Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:10:09.114723 containerd[1474]: time="2025-03-25T01:10:09.114682066Z" level=info msg="runtime interface created" Mar 25 01:10:09.114931 containerd[1474]: time="2025-03-25T01:10:09.114687066Z" level=info msg="created NRI interface" Mar 25 01:10:09.114931 containerd[1474]: time="2025-03-25T01:10:09.114695906Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:10:09.114931 containerd[1474]: time="2025-03-25T01:10:09.114706866Z" level=info msg="Connect containerd service" Mar 25 01:10:09.114931 containerd[1474]: time="2025-03-25T01:10:09.114742346Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:10:09.115717 containerd[1474]: time="2025-03-25T01:10:09.115472906Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:10:09.233187 containerd[1474]: time="2025-03-25T01:10:09.233093866Z" level=info msg="Start subscribing containerd event" Mar 25 01:10:09.233187 containerd[1474]: time="2025-03-25T01:10:09.233161506Z" level=info msg="Start recovering state" Mar 25 01:10:09.233363 containerd[1474]: time="2025-03-25T01:10:09.233255826Z" level=info msg="Start event monitor" Mar 25 01:10:09.233363 containerd[1474]: time="2025-03-25T01:10:09.233278466Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:10:09.233363 containerd[1474]: time="2025-03-25T01:10:09.233288626Z" level=info msg="Start streaming server" Mar 25 01:10:09.233363 containerd[1474]: time="2025-03-25T01:10:09.233297986Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 01:10:09.233363 containerd[1474]: 
time="2025-03-25T01:10:09.233305546Z" level=info msg="runtime interface starting up..." Mar 25 01:10:09.233363 containerd[1474]: time="2025-03-25T01:10:09.233311026Z" level=info msg="starting plugins..." Mar 25 01:10:09.233363 containerd[1474]: time="2025-03-25T01:10:09.233325626Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:10:09.233807 containerd[1474]: time="2025-03-25T01:10:09.233737386Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:10:09.233807 containerd[1474]: time="2025-03-25T01:10:09.233785546Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:10:09.233958 systemd[1]: Started containerd.service - containerd container runtime. Mar 25 01:10:09.235822 containerd[1474]: time="2025-03-25T01:10:09.235782146Z" level=info msg="containerd successfully booted in 0.146497s" Mar 25 01:10:09.279787 tar[1472]: linux-arm64/README.md Mar 25 01:10:09.294901 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:10:10.186611 systemd-networkd[1395]: eth0: Gained IPv6LL Mar 25 01:10:10.189484 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:10:10.190883 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:10:10.193149 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 25 01:10:10.195300 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:10:10.197245 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 25 01:10:10.225263 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 25 01:10:10.226015 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Mar 25 01:10:10.230109 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 25 01:10:10.232485 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 25 01:10:10.443384 sshd_keygen[1471]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:10:10.462503 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:10:10.464975 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 25 01:10:10.483286 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:10:10.483565 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:10:10.486424 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 25 01:10:10.509284 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:10:10.512544 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 01:10:10.514995 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 25 01:10:10.516421 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 01:10:10.741833 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:10.743406 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 25 01:10:10.744968 (kubelet)[1576]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:10:10.748477 systemd[1]: Startup finished in 521ms (kernel) + 5.749s (initrd) + 3.793s (userspace) = 10.065s. 
Mar 25 01:10:11.176843 kubelet[1576]: E0325 01:10:11.176743 1576 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:10:11.179296 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:10:11.179466 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:10:11.179815 systemd[1]: kubelet.service: Consumed 793ms CPU time, 250.7M memory peak. Mar 25 01:10:14.235764 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:10:14.236852 systemd[1]: Started sshd@0-10.0.0.25:22-10.0.0.1:41248.service - OpenSSH per-connection server daemon (10.0.0.1:41248). Mar 25 01:10:14.304187 sshd[1590]: Accepted publickey for core from 10.0.0.1 port 41248 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:10:14.306084 sshd-session[1590]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:10:14.311996 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:10:14.312920 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:10:14.318827 systemd-logind[1458]: New session 1 of user core. Mar 25 01:10:14.332875 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:10:14.335206 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:10:14.350142 (systemd)[1594]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:10:14.351923 systemd-logind[1458]: New session c1 of user core. Mar 25 01:10:14.456104 systemd[1594]: Queued start job for default target default.target. Mar 25 01:10:14.465317 systemd[1594]: Created slice app.slice - User Application Slice. Mar 25 01:10:14.465346 systemd[1594]: Reached target paths.target - Paths. Mar 25 01:10:14.465380 systemd[1594]: Reached target timers.target - Timers. Mar 25 01:10:14.466513 systemd[1594]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:10:14.474750 systemd[1594]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:10:14.474809 systemd[1594]: Reached target sockets.target - Sockets. Mar 25 01:10:14.474843 systemd[1594]: Reached target basic.target - Basic System. Mar 25 01:10:14.474870 systemd[1594]: Reached target default.target - Main User Target. Mar 25 01:10:14.474893 systemd[1594]: Startup finished in 118ms. Mar 25 01:10:14.475017 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:10:14.476359 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 25 01:10:14.542982 systemd[1]: Started sshd@1-10.0.0.25:22-10.0.0.1:41256.service - OpenSSH per-connection server daemon (10.0.0.1:41256). Mar 25 01:10:14.601378 sshd[1605]: Accepted publickey for core from 10.0.0.1 port 41256 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:10:14.602534 sshd-session[1605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:10:14.606396 systemd-logind[1458]: New session 2 of user core. Mar 25 01:10:14.614573 systemd[1]: Started session-2.scope - Session 2 of User core. 
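Editor's note: the kubelet exits with status 1 here because /var/lib/kubelet/config.yaml does not exist yet; on a node like this the file is normally written later by provisioning (for example a kubeadm init/join step), and systemd keeps scheduling restarts in the meantime, as the restart counters further down show. Below is a minimal, hypothetical pre-flight sketch of the same check — the helper is not shipped with kubelet or Flatcar, only the path is taken from the error above.

// kubelet-preflight.go - hypothetical helper, not part of kubelet or Flatcar.
// It reports whether the config file the kubelet failed to load is present yet.
package main

import (
	"fmt"
	"os"
)

func main() {
	const cfg = "/var/lib/kubelet/config.yaml" // path from the kubelet error above
	if _, err := os.Stat(cfg); err != nil {
		if os.IsNotExist(err) {
			fmt.Printf("%s not written yet; kubelet will keep exiting until provisioning creates it\n", cfg)
			os.Exit(1)
		}
		fmt.Fprintln(os.Stderr, "stat:", err)
		os.Exit(2)
	}
	fmt.Println(cfg, "exists; kubelet should be able to load its configuration")
}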
Mar 25 01:10:14.664600 sshd[1607]: Connection closed by 10.0.0.1 port 41256 Mar 25 01:10:14.665321 sshd-session[1605]: pam_unix(sshd:session): session closed for user core Mar 25 01:10:14.678293 systemd[1]: sshd@1-10.0.0.25:22-10.0.0.1:41256.service: Deactivated successfully. Mar 25 01:10:14.679685 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:10:14.680494 systemd-logind[1458]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:10:14.682168 systemd[1]: Started sshd@2-10.0.0.25:22-10.0.0.1:41272.service - OpenSSH per-connection server daemon (10.0.0.1:41272). Mar 25 01:10:14.683002 systemd-logind[1458]: Removed session 2. Mar 25 01:10:14.733092 sshd[1612]: Accepted publickey for core from 10.0.0.1 port 41272 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:10:14.734163 sshd-session[1612]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:10:14.738116 systemd-logind[1458]: New session 3 of user core. Mar 25 01:10:14.747634 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:10:14.795022 sshd[1615]: Connection closed by 10.0.0.1 port 41272 Mar 25 01:10:14.795259 sshd-session[1612]: pam_unix(sshd:session): session closed for user core Mar 25 01:10:14.804280 systemd[1]: sshd@2-10.0.0.25:22-10.0.0.1:41272.service: Deactivated successfully. Mar 25 01:10:14.806210 systemd[1]: session-3.scope: Deactivated successfully. Mar 25 01:10:14.807703 systemd-logind[1458]: Session 3 logged out. Waiting for processes to exit. Mar 25 01:10:14.809170 systemd[1]: Started sshd@3-10.0.0.25:22-10.0.0.1:41284.service - OpenSSH per-connection server daemon (10.0.0.1:41284). Mar 25 01:10:14.810159 systemd-logind[1458]: Removed session 3. Mar 25 01:10:14.862448 sshd[1620]: Accepted publickey for core from 10.0.0.1 port 41284 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:10:14.863475 sshd-session[1620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:10:14.867626 systemd-logind[1458]: New session 4 of user core. Mar 25 01:10:14.881633 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:10:14.930575 sshd[1623]: Connection closed by 10.0.0.1 port 41284 Mar 25 01:10:14.930815 sshd-session[1620]: pam_unix(sshd:session): session closed for user core Mar 25 01:10:14.940010 systemd[1]: sshd@3-10.0.0.25:22-10.0.0.1:41284.service: Deactivated successfully. Mar 25 01:10:14.942483 systemd[1]: session-4.scope: Deactivated successfully. Mar 25 01:10:14.943706 systemd-logind[1458]: Session 4 logged out. Waiting for processes to exit. Mar 25 01:10:14.944703 systemd[1]: Started sshd@4-10.0.0.25:22-10.0.0.1:41286.service - OpenSSH per-connection server daemon (10.0.0.1:41286). Mar 25 01:10:14.945802 systemd-logind[1458]: Removed session 4. Mar 25 01:10:14.988451 sshd[1628]: Accepted publickey for core from 10.0.0.1 port 41286 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:10:14.989405 sshd-session[1628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:10:14.993626 systemd-logind[1458]: New session 5 of user core. Mar 25 01:10:14.999563 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 25 01:10:15.059111 sudo[1632]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:10:15.059376 sudo[1632]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:10:15.071155 sudo[1632]: pam_unix(sudo:session): session closed for user root Mar 25 01:10:15.074263 sshd[1631]: Connection closed by 10.0.0.1 port 41286 Mar 25 01:10:15.074590 sshd-session[1628]: pam_unix(sshd:session): session closed for user core Mar 25 01:10:15.084341 systemd[1]: sshd@4-10.0.0.25:22-10.0.0.1:41286.service: Deactivated successfully. Mar 25 01:10:15.085585 systemd[1]: session-5.scope: Deactivated successfully. Mar 25 01:10:15.086271 systemd-logind[1458]: Session 5 logged out. Waiting for processes to exit. Mar 25 01:10:15.087940 systemd[1]: Started sshd@5-10.0.0.25:22-10.0.0.1:41296.service - OpenSSH per-connection server daemon (10.0.0.1:41296). Mar 25 01:10:15.088646 systemd-logind[1458]: Removed session 5. Mar 25 01:10:15.146730 sshd[1637]: Accepted publickey for core from 10.0.0.1 port 41296 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:10:15.147798 sshd-session[1637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:10:15.151662 systemd-logind[1458]: New session 6 of user core. Mar 25 01:10:15.159639 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 25 01:10:15.209484 sudo[1642]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:10:15.209963 sudo[1642]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:10:15.212715 sudo[1642]: pam_unix(sudo:session): session closed for user root Mar 25 01:10:15.216792 sudo[1641]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:10:15.217036 sudo[1641]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:10:15.224840 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:10:15.256726 augenrules[1664]: No rules Mar 25 01:10:15.257821 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:10:15.258760 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:10:15.259510 sudo[1641]: pam_unix(sudo:session): session closed for user root Mar 25 01:10:15.260556 sshd[1640]: Connection closed by 10.0.0.1 port 41296 Mar 25 01:10:15.260876 sshd-session[1637]: pam_unix(sshd:session): session closed for user core Mar 25 01:10:15.275363 systemd[1]: sshd@5-10.0.0.25:22-10.0.0.1:41296.service: Deactivated successfully. Mar 25 01:10:15.276730 systemd[1]: session-6.scope: Deactivated successfully. Mar 25 01:10:15.279586 systemd-logind[1458]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:10:15.280667 systemd[1]: Started sshd@6-10.0.0.25:22-10.0.0.1:41298.service - OpenSSH per-connection server daemon (10.0.0.1:41298). Mar 25 01:10:15.281349 systemd-logind[1458]: Removed session 6. Mar 25 01:10:15.333527 sshd[1672]: Accepted publickey for core from 10.0.0.1 port 41298 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:10:15.334841 sshd-session[1672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:10:15.338485 systemd-logind[1458]: New session 7 of user core. Mar 25 01:10:15.350636 systemd[1]: Started session-7.scope - Session 7 of User core. 
Mar 25 01:10:15.399199 sudo[1676]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:10:15.399481 sudo[1676]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:10:15.735541 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:10:15.752815 (dockerd)[1696]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:10:16.001109 dockerd[1696]: time="2025-03-25T01:10:16.000996026Z" level=info msg="Starting up" Mar 25 01:10:16.002788 dockerd[1696]: time="2025-03-25T01:10:16.002760266Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:10:16.107028 dockerd[1696]: time="2025-03-25T01:10:16.106985786Z" level=info msg="Loading containers: start." Mar 25 01:10:16.238579 kernel: Initializing XFRM netlink socket Mar 25 01:10:16.293798 systemd-networkd[1395]: docker0: Link UP Mar 25 01:10:16.357569 dockerd[1696]: time="2025-03-25T01:10:16.357530226Z" level=info msg="Loading containers: done." Mar 25 01:10:16.373780 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1999796296-merged.mount: Deactivated successfully. Mar 25 01:10:16.375592 dockerd[1696]: time="2025-03-25T01:10:16.375526786Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:10:16.375664 dockerd[1696]: time="2025-03-25T01:10:16.375630906Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:10:16.375822 dockerd[1696]: time="2025-03-25T01:10:16.375792586Z" level=info msg="Daemon has completed initialization" Mar 25 01:10:16.402853 dockerd[1696]: time="2025-03-25T01:10:16.402800346Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:10:16.402930 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:10:16.998311 containerd[1474]: time="2025-03-25T01:10:16.997793706Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\"" Mar 25 01:10:17.703945 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount85593470.mount: Deactivated successfully. 
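Editor's note: once dockerd reports "API listen on /run/docker.sock", the daemon can be reached over that Unix socket. The sketch below uses the Docker Go SDK to ping the daemon and print the negotiated API version; the module path github.com/docker/docker/client and the default socket location (unix:///var/run/docker.sock, a symlink to /run/docker.sock on this image) are assumptions based on the log, and the program is illustrative rather than part of the boot sequence.

// dockerping.go - hedged sketch using the Docker Go SDK; socket location and
// module path are assumptions, not taken from the system shown in this log.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/docker/docker/client"
)

func main() {
	// FromEnv falls back to the default unix socket when DOCKER_HOST is unset.
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		log.Fatal(err)
	}
	defer cli.Close()

	ctx := context.Background()
	ping, err := cli.Ping(ctx)
	if err != nil {
		log.Fatal(err)
	}
	ver, err := cli.ServerVersion(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("daemon reachable: engine %s, API %s\n", ver.Version, ping.APIVersion)
}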
Mar 25 01:10:19.635650 containerd[1474]: time="2025-03-25T01:10:19.635603106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:19.636902 containerd[1474]: time="2025-03-25T01:10:19.636856986Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.3: active requests=0, bytes read=26231952" Mar 25 01:10:19.638576 containerd[1474]: time="2025-03-25T01:10:19.638030826Z" level=info msg="ImageCreate event name:\"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:19.640150 containerd[1474]: time="2025-03-25T01:10:19.640092546Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:19.641120 containerd[1474]: time="2025-03-25T01:10:19.641095706Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.3\" with image id \"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:279e45cf07e4f56925c3c5237179eb63616788426a96e94df5fedf728b18926e\", size \"26228750\" in 2.64326256s" Mar 25 01:10:19.641309 containerd[1474]: time="2025-03-25T01:10:19.641186426Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.3\" returns image reference \"sha256:25dd33975ea35cef2fa9b105778dbe3369de267e9ddf81427b7b82e98ff374e5\"" Mar 25 01:10:19.641799 containerd[1474]: time="2025-03-25T01:10:19.641779146Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\"" Mar 25 01:10:21.341580 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 25 01:10:21.343913 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:10:21.472589 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:21.476028 (kubelet)[1970]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:10:21.512381 kubelet[1970]: E0325 01:10:21.512338 1970 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:10:21.515680 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:10:21.515822 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:10:21.517504 systemd[1]: kubelet.service: Consumed 139ms CPU time, 104.1M memory peak. 
Mar 25 01:10:21.573606 containerd[1474]: time="2025-03-25T01:10:21.572779706Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:21.573606 containerd[1474]: time="2025-03-25T01:10:21.573504386Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.3: active requests=0, bytes read=22530034" Mar 25 01:10:21.574191 containerd[1474]: time="2025-03-25T01:10:21.574129386Z" level=info msg="ImageCreate event name:\"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:21.576768 containerd[1474]: time="2025-03-25T01:10:21.576722466Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:21.578530 containerd[1474]: time="2025-03-25T01:10:21.578492146Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.3\" with image id \"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:54456a96a1bbdc35dcc2e70fcc1355bf655af67694e40b650ac12e83521f6411\", size \"23970828\" in 1.93659732s" Mar 25 01:10:21.578530 containerd[1474]: time="2025-03-25T01:10:21.578530706Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.3\" returns image reference \"sha256:9e29b4db8c5cdf9970961ed3a47137ea71ad067643b8e5cccb58085f22a9b315\"" Mar 25 01:10:21.578976 containerd[1474]: time="2025-03-25T01:10:21.578943786Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\"" Mar 25 01:10:23.017815 containerd[1474]: time="2025-03-25T01:10:23.017756626Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:23.019510 containerd[1474]: time="2025-03-25T01:10:23.019418546Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.3: active requests=0, bytes read=17482563" Mar 25 01:10:23.020583 containerd[1474]: time="2025-03-25T01:10:23.020512946Z" level=info msg="ImageCreate event name:\"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:23.022659 containerd[1474]: time="2025-03-25T01:10:23.022623506Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:23.024499 containerd[1474]: time="2025-03-25T01:10:23.024216266Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.3\" with image id \"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:aafae2e3a8d65bc6dc3a0c6095c24bc72b1ff608e1417f0f5e860ce4a61c27df\", size \"18923375\" in 1.4452392s" Mar 25 01:10:23.024499 containerd[1474]: time="2025-03-25T01:10:23.024261426Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.3\" returns image reference \"sha256:6b8dfebcc65dc9d4765a91d2923c304e13beca7111c57dfc99f1c3267a6e9f30\"" Mar 25 01:10:23.025002 
containerd[1474]: time="2025-03-25T01:10:23.024967626Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\"" Mar 25 01:10:23.991118 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount858260604.mount: Deactivated successfully. Mar 25 01:10:24.644921 containerd[1474]: time="2025-03-25T01:10:24.644866586Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:24.645357 containerd[1474]: time="2025-03-25T01:10:24.645304306Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.3: active requests=0, bytes read=27370097" Mar 25 01:10:24.647535 containerd[1474]: time="2025-03-25T01:10:24.646405226Z" level=info msg="ImageCreate event name:\"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:24.649015 containerd[1474]: time="2025-03-25T01:10:24.648335586Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:24.649263 containerd[1474]: time="2025-03-25T01:10:24.649228706Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.3\" with image id \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\", repo tag \"registry.k8s.io/kube-proxy:v1.32.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:5015269547a0b7dd2c062758e9a64467b58978ff2502cad4c3f5cdf4aa554ad3\", size \"27369114\" in 1.6242054s" Mar 25 01:10:24.649474 containerd[1474]: time="2025-03-25T01:10:24.649266106Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.3\" returns image reference \"sha256:2a637602f3e88e76046aa1a75bccdb37b25b2fcba99a380412e2c27ccd55c547\"" Mar 25 01:10:24.650116 containerd[1474]: time="2025-03-25T01:10:24.650081466Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Mar 25 01:10:25.165734 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1057900936.mount: Deactivated successfully. 
Mar 25 01:10:25.832055 containerd[1474]: time="2025-03-25T01:10:25.832005106Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:25.833454 containerd[1474]: time="2025-03-25T01:10:25.832442586Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951624" Mar 25 01:10:25.833547 containerd[1474]: time="2025-03-25T01:10:25.833475546Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:25.837489 containerd[1474]: time="2025-03-25T01:10:25.837419866Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:25.838207 containerd[1474]: time="2025-03-25T01:10:25.838163266Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.18803228s" Mar 25 01:10:25.838207 containerd[1474]: time="2025-03-25T01:10:25.838203826Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Mar 25 01:10:25.838680 containerd[1474]: time="2025-03-25T01:10:25.838650106Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 25 01:10:26.264744 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4085899288.mount: Deactivated successfully. 
Mar 25 01:10:26.270717 containerd[1474]: time="2025-03-25T01:10:26.270645626Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Mar 25 01:10:26.270812 containerd[1474]: time="2025-03-25T01:10:26.270799186Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:10:26.272944 containerd[1474]: time="2025-03-25T01:10:26.272564026Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:10:26.274907 containerd[1474]: time="2025-03-25T01:10:26.274871386Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:10:26.275874 containerd[1474]: time="2025-03-25T01:10:26.275837946Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 437.15192ms" Mar 25 01:10:26.275874 containerd[1474]: time="2025-03-25T01:10:26.275873266Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Mar 25 01:10:26.276284 containerd[1474]: time="2025-03-25T01:10:26.276263146Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Mar 25 01:10:26.745168 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3155325671.mount: Deactivated successfully. 
Mar 25 01:10:28.085749 containerd[1474]: time="2025-03-25T01:10:28.085692826Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:28.086706 containerd[1474]: time="2025-03-25T01:10:28.086247026Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812431" Mar 25 01:10:28.088487 containerd[1474]: time="2025-03-25T01:10:28.087495026Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:28.090563 containerd[1474]: time="2025-03-25T01:10:28.090532906Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:28.092746 containerd[1474]: time="2025-03-25T01:10:28.092707186Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 1.81641112s" Mar 25 01:10:28.092746 containerd[1474]: time="2025-03-25T01:10:28.092745586Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Mar 25 01:10:31.591556 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 25 01:10:31.593556 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:10:31.721126 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:31.731823 (kubelet)[2132]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:10:31.770799 kubelet[2132]: E0325 01:10:31.770735 2132 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:10:31.773447 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:10:31.773731 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:10:31.774265 systemd[1]: kubelet.service: Consumed 146ms CPU time, 106.6M memory peak. Mar 25 01:10:33.546305 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:33.546678 systemd[1]: kubelet.service: Consumed 146ms CPU time, 106.6M memory peak. Mar 25 01:10:33.549318 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:10:33.574503 systemd[1]: Reload requested from client PID 2147 ('systemctl') (unit session-7.scope)... Mar 25 01:10:33.574524 systemd[1]: Reloading... Mar 25 01:10:33.639573 zram_generator::config[2190]: No configuration found. Mar 25 01:10:33.769788 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:10:33.842028 systemd[1]: Reloading finished in 267 ms. 
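Editor's note: the pulls above (kube-apiserver, kube-controller-manager, kube-scheduler, kube-proxy, coredns, pause, etcd) all land in containerd's "k8s.io" namespace, which the log showed being registered with NRI earlier. A hedged sketch with the containerd Go client that connects over the socket served above and lists the images now present — the module path github.com/containerd/containerd is an assumption; the socket path and namespace are taken from the log.

// ctrimages.go - sketch assuming the containerd Go client; socket path and the
// "k8s.io" namespace come from the log, everything else is illustrative.
package main

import (
	"context"
	"fmt"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images live in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	images, err := client.ListImages(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, img := range images {
		fmt.Printf("%s\t%s\n", img.Name(), img.Target().Digest)
	}
}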
Mar 25 01:10:33.897001 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 25 01:10:33.897075 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 25 01:10:33.897321 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:33.897368 systemd[1]: kubelet.service: Consumed 93ms CPU time, 90.2M memory peak. Mar 25 01:10:33.900244 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:10:34.010710 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:34.014758 (kubelet)[2236]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:10:34.049250 kubelet[2236]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:10:34.049250 kubelet[2236]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 25 01:10:34.049250 kubelet[2236]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:10:34.049584 kubelet[2236]: I0325 01:10:34.049307 2236 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:10:35.822317 kubelet[2236]: I0325 01:10:35.822265 2236 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 25 01:10:35.822317 kubelet[2236]: I0325 01:10:35.822303 2236 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:10:35.822707 kubelet[2236]: I0325 01:10:35.822598 2236 server.go:954] "Client rotation is on, will bootstrap in background" Mar 25 01:10:35.870676 kubelet[2236]: E0325 01:10:35.870630 2236 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.25:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:35.871869 kubelet[2236]: I0325 01:10:35.871845 2236 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:10:35.884894 kubelet[2236]: I0325 01:10:35.884817 2236 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:10:35.888487 kubelet[2236]: I0325 01:10:35.888045 2236 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:10:35.888994 kubelet[2236]: I0325 01:10:35.888697 2236 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:10:35.889071 kubelet[2236]: I0325 01:10:35.888747 2236 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:10:35.889176 kubelet[2236]: I0325 01:10:35.889157 2236 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:10:35.889176 kubelet[2236]: I0325 01:10:35.889174 2236 container_manager_linux.go:304] "Creating device plugin manager" Mar 25 01:10:35.889400 kubelet[2236]: I0325 01:10:35.889378 2236 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:10:35.892011 kubelet[2236]: I0325 01:10:35.891984 2236 kubelet.go:446] "Attempting to sync node with API server" Mar 25 01:10:35.892011 kubelet[2236]: I0325 01:10:35.892011 2236 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:10:35.893509 kubelet[2236]: I0325 01:10:35.892039 2236 kubelet.go:352] "Adding apiserver pod source" Mar 25 01:10:35.893509 kubelet[2236]: I0325 01:10:35.892049 2236 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:10:35.897926 kubelet[2236]: I0325 01:10:35.894788 2236 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:10:35.897926 kubelet[2236]: W0325 01:10:35.895226 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.25:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:35.897926 kubelet[2236]: E0325 01:10:35.895278 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://10.0.0.25:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:35.898556 kubelet[2236]: W0325 01:10:35.898494 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.25:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:35.898556 kubelet[2236]: E0325 01:10:35.898554 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.25:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:35.898887 kubelet[2236]: I0325 01:10:35.898859 2236 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:10:35.899007 kubelet[2236]: W0325 01:10:35.898984 2236 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 01:10:35.900082 kubelet[2236]: I0325 01:10:35.899881 2236 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 25 01:10:35.900082 kubelet[2236]: I0325 01:10:35.899926 2236 server.go:1287] "Started kubelet" Mar 25 01:10:35.901153 kubelet[2236]: I0325 01:10:35.900519 2236 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:10:35.901153 kubelet[2236]: I0325 01:10:35.900992 2236 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:10:35.901153 kubelet[2236]: I0325 01:10:35.901075 2236 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:10:35.901564 kubelet[2236]: I0325 01:10:35.901541 2236 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:10:35.905146 kubelet[2236]: I0325 01:10:35.902012 2236 server.go:490] "Adding debug handlers to kubelet server" Mar 25 01:10:35.905146 kubelet[2236]: I0325 01:10:35.903810 2236 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:10:35.905529 kubelet[2236]: E0325 01:10:35.905508 2236 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:10:35.905645 kubelet[2236]: I0325 01:10:35.905630 2236 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 25 01:10:35.905991 kubelet[2236]: I0325 01:10:35.905846 2236 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:10:35.906168 kubelet[2236]: I0325 01:10:35.906149 2236 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:10:35.906389 kubelet[2236]: W0325 01:10:35.906352 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.25:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:35.906508 kubelet[2236]: E0325 01:10:35.906487 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://10.0.0.25:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:35.906641 kubelet[2236]: E0325 01:10:35.906538 2236 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.25:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.25:6443: connect: connection refused" interval="200ms" Mar 25 01:10:35.907102 kubelet[2236]: E0325 01:10:35.907081 2236 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:10:35.907285 kubelet[2236]: I0325 01:10:35.907086 2236 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:10:35.907492 kubelet[2236]: E0325 01:10:35.906991 2236 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.25:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.25:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182fe6841c430922 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-25 01:10:35.899898146 +0000 UTC m=+1.882132281,LastTimestamp:2025-03-25 01:10:35.899898146 +0000 UTC m=+1.882132281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 25 01:10:35.908696 kubelet[2236]: I0325 01:10:35.908656 2236 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:10:35.908696 kubelet[2236]: I0325 01:10:35.908685 2236 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:10:35.919565 kubelet[2236]: I0325 01:10:35.919532 2236 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 25 01:10:35.919565 kubelet[2236]: I0325 01:10:35.919551 2236 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 25 01:10:35.919565 kubelet[2236]: I0325 01:10:35.919570 2236 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:10:35.920874 kubelet[2236]: I0325 01:10:35.920815 2236 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:10:35.921875 kubelet[2236]: I0325 01:10:35.921843 2236 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:10:35.921875 kubelet[2236]: I0325 01:10:35.921866 2236 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 25 01:10:35.921977 kubelet[2236]: I0325 01:10:35.921885 2236 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 25 01:10:35.921977 kubelet[2236]: I0325 01:10:35.921893 2236 kubelet.go:2388] "Starting kubelet main sync loop" Mar 25 01:10:35.921977 kubelet[2236]: E0325 01:10:35.921936 2236 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:10:35.922788 kubelet[2236]: W0325 01:10:35.922738 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.25:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:35.922944 kubelet[2236]: E0325 01:10:35.922796 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.25:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:35.988570 kubelet[2236]: I0325 01:10:35.988528 2236 policy_none.go:49] "None policy: Start" Mar 25 01:10:35.988570 kubelet[2236]: I0325 01:10:35.988562 2236 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 25 01:10:35.988570 kubelet[2236]: I0325 01:10:35.988576 2236 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:10:35.993509 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 25 01:10:36.005923 kubelet[2236]: E0325 01:10:36.005861 2236 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:10:36.010108 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 25 01:10:36.013762 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 01:10:36.022539 kubelet[2236]: E0325 01:10:36.022503 2236 kubelet.go:2412] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 25 01:10:36.027307 kubelet[2236]: I0325 01:10:36.027278 2236 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:10:36.027886 kubelet[2236]: I0325 01:10:36.027501 2236 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:10:36.027886 kubelet[2236]: I0325 01:10:36.027532 2236 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:10:36.027886 kubelet[2236]: I0325 01:10:36.027792 2236 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:10:36.031087 kubelet[2236]: E0325 01:10:36.031061 2236 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 25 01:10:36.031538 kubelet[2236]: E0325 01:10:36.031504 2236 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 25 01:10:36.107171 kubelet[2236]: E0325 01:10:36.107135 2236 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.25:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.25:6443: connect: connection refused" interval="400ms" Mar 25 01:10:36.129318 kubelet[2236]: I0325 01:10:36.129290 2236 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 25 01:10:36.129754 kubelet[2236]: E0325 01:10:36.129727 2236 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.25:6443/api/v1/nodes\": dial tcp 10.0.0.25:6443: connect: connection refused" node="localhost" Mar 25 01:10:36.231837 systemd[1]: Created slice kubepods-burstable-pod5cc26063b48899768ec65ee5e08dca1b.slice - libcontainer container kubepods-burstable-pod5cc26063b48899768ec65ee5e08dca1b.slice. Mar 25 01:10:36.242399 kubelet[2236]: E0325 01:10:36.242361 2236 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 25 01:10:36.245536 systemd[1]: Created slice kubepods-burstable-podcbbb394ff48414687df77e1bc213eeb5.slice - libcontainer container kubepods-burstable-podcbbb394ff48414687df77e1bc213eeb5.slice. Mar 25 01:10:36.249484 kubelet[2236]: E0325 01:10:36.249355 2236 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 25 01:10:36.251265 systemd[1]: Created slice kubepods-burstable-pod3700e556aa2777679a324159272023f1.slice - libcontainer container kubepods-burstable-pod3700e556aa2777679a324159272023f1.slice. 
Mar 25 01:10:36.252971 kubelet[2236]: E0325 01:10:36.252783 2236 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 25 01:10:36.307444 kubelet[2236]: I0325 01:10:36.307388 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5cc26063b48899768ec65ee5e08dca1b-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"5cc26063b48899768ec65ee5e08dca1b\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:36.307444 kubelet[2236]: I0325 01:10:36.307441 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5cc26063b48899768ec65ee5e08dca1b-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"5cc26063b48899768ec65ee5e08dca1b\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:36.307553 kubelet[2236]: I0325 01:10:36.307469 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:36.307553 kubelet[2236]: I0325 01:10:36.307486 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5cc26063b48899768ec65ee5e08dca1b-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"5cc26063b48899768ec65ee5e08dca1b\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:36.307553 kubelet[2236]: I0325 01:10:36.307505 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:36.307553 kubelet[2236]: I0325 01:10:36.307522 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:36.307553 kubelet[2236]: I0325 01:10:36.307540 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:36.307691 kubelet[2236]: I0325 01:10:36.307555 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3700e556aa2777679a324159272023f1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"3700e556aa2777679a324159272023f1\") " pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:36.307691 kubelet[2236]: I0325 01:10:36.307571 2236 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:36.331579 kubelet[2236]: I0325 01:10:36.331514 2236 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 25 01:10:36.331955 kubelet[2236]: E0325 01:10:36.331926 2236 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.25:6443/api/v1/nodes\": dial tcp 10.0.0.25:6443: connect: connection refused" node="localhost" Mar 25 01:10:36.508625 kubelet[2236]: E0325 01:10:36.508496 2236 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.25:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.25:6443: connect: connection refused" interval="800ms" Mar 25 01:10:36.544718 containerd[1474]: time="2025-03-25T01:10:36.544410146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:5cc26063b48899768ec65ee5e08dca1b,Namespace:kube-system,Attempt:0,}" Mar 25 01:10:36.551150 containerd[1474]: time="2025-03-25T01:10:36.551100306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:cbbb394ff48414687df77e1bc213eeb5,Namespace:kube-system,Attempt:0,}" Mar 25 01:10:36.553995 containerd[1474]: time="2025-03-25T01:10:36.553876066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:3700e556aa2777679a324159272023f1,Namespace:kube-system,Attempt:0,}" Mar 25 01:10:36.733781 kubelet[2236]: I0325 01:10:36.733739 2236 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 25 01:10:36.734155 kubelet[2236]: E0325 01:10:36.734107 2236 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://10.0.0.25:6443/api/v1/nodes\": dial tcp 10.0.0.25:6443: connect: connection refused" node="localhost" Mar 25 01:10:36.798422 containerd[1474]: time="2025-03-25T01:10:36.797093186Z" level=info msg="connecting to shim 55ab430dc8cb96c9731819291cabc9a0223366b593542a738a7fed0c1854722c" address="unix:///run/containerd/s/c5ef8db4d0067c386fd8db9ec480ccb0b29d777f05cb42883fc3bff405c5461a" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:10:36.799896 kubelet[2236]: W0325 01:10:36.799731 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.25:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:36.799896 kubelet[2236]: E0325 01:10:36.799800 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.25:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:36.805634 containerd[1474]: time="2025-03-25T01:10:36.805575906Z" level=info msg="connecting to shim 806e8413dc89d858f3f2454986735370ab4559c764ebaa0b9694c8e1eefa00b6" address="unix:///run/containerd/s/17975a374072690a92a8eb5d9b505909828b843f241ae59db2cd66ba25a545cc" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:10:36.806196 containerd[1474]: time="2025-03-25T01:10:36.806162346Z" level=info msg="connecting to shim 
6f9121f20a9b8748398f5f1ce770c85528ad74a866fcbf0257a644166dfc637b" address="unix:///run/containerd/s/af3ae1bbac52369e7b479006b9fdcef8aae8f2c811479b9c222440221e1045c6" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:10:36.822630 systemd[1]: Started cri-containerd-55ab430dc8cb96c9731819291cabc9a0223366b593542a738a7fed0c1854722c.scope - libcontainer container 55ab430dc8cb96c9731819291cabc9a0223366b593542a738a7fed0c1854722c. Mar 25 01:10:36.825852 systemd[1]: Started cri-containerd-6f9121f20a9b8748398f5f1ce770c85528ad74a866fcbf0257a644166dfc637b.scope - libcontainer container 6f9121f20a9b8748398f5f1ce770c85528ad74a866fcbf0257a644166dfc637b. Mar 25 01:10:36.830230 systemd[1]: Started cri-containerd-806e8413dc89d858f3f2454986735370ab4559c764ebaa0b9694c8e1eefa00b6.scope - libcontainer container 806e8413dc89d858f3f2454986735370ab4559c764ebaa0b9694c8e1eefa00b6. Mar 25 01:10:36.860856 containerd[1474]: time="2025-03-25T01:10:36.860781866Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:5cc26063b48899768ec65ee5e08dca1b,Namespace:kube-system,Attempt:0,} returns sandbox id \"55ab430dc8cb96c9731819291cabc9a0223366b593542a738a7fed0c1854722c\"" Mar 25 01:10:36.866321 containerd[1474]: time="2025-03-25T01:10:36.866284786Z" level=info msg="CreateContainer within sandbox \"55ab430dc8cb96c9731819291cabc9a0223366b593542a738a7fed0c1854722c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:10:36.867627 containerd[1474]: time="2025-03-25T01:10:36.867532706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:cbbb394ff48414687df77e1bc213eeb5,Namespace:kube-system,Attempt:0,} returns sandbox id \"6f9121f20a9b8748398f5f1ce770c85528ad74a866fcbf0257a644166dfc637b\"" Mar 25 01:10:36.869655 containerd[1474]: time="2025-03-25T01:10:36.869625106Z" level=info msg="CreateContainer within sandbox \"6f9121f20a9b8748398f5f1ce770c85528ad74a866fcbf0257a644166dfc637b\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:10:36.876151 containerd[1474]: time="2025-03-25T01:10:36.876110266Z" level=info msg="Container e076b236045416929c6bcf505d424138578723b21d76d5843c1b53c54fc8a1c0: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:10:36.882148 containerd[1474]: time="2025-03-25T01:10:36.882100906Z" level=info msg="Container b202d56bf0a48c471938bbf5f73ee59e7aa18b2f077aa90c3a5d6381494eaa90: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:10:36.888947 containerd[1474]: time="2025-03-25T01:10:36.888842386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:3700e556aa2777679a324159272023f1,Namespace:kube-system,Attempt:0,} returns sandbox id \"806e8413dc89d858f3f2454986735370ab4559c764ebaa0b9694c8e1eefa00b6\"" Mar 25 01:10:36.891084 containerd[1474]: time="2025-03-25T01:10:36.891048066Z" level=info msg="CreateContainer within sandbox \"806e8413dc89d858f3f2454986735370ab4559c764ebaa0b9694c8e1eefa00b6\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:10:36.893447 containerd[1474]: time="2025-03-25T01:10:36.893399426Z" level=info msg="CreateContainer within sandbox \"55ab430dc8cb96c9731819291cabc9a0223366b593542a738a7fed0c1854722c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e076b236045416929c6bcf505d424138578723b21d76d5843c1b53c54fc8a1c0\"" Mar 25 01:10:36.895007 containerd[1474]: time="2025-03-25T01:10:36.894702906Z" level=info msg="StartContainer for 
\"e076b236045416929c6bcf505d424138578723b21d76d5843c1b53c54fc8a1c0\"" Mar 25 01:10:36.895007 containerd[1474]: time="2025-03-25T01:10:36.894807546Z" level=info msg="CreateContainer within sandbox \"6f9121f20a9b8748398f5f1ce770c85528ad74a866fcbf0257a644166dfc637b\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b202d56bf0a48c471938bbf5f73ee59e7aa18b2f077aa90c3a5d6381494eaa90\"" Mar 25 01:10:36.895479 containerd[1474]: time="2025-03-25T01:10:36.895443986Z" level=info msg="StartContainer for \"b202d56bf0a48c471938bbf5f73ee59e7aa18b2f077aa90c3a5d6381494eaa90\"" Mar 25 01:10:36.897208 containerd[1474]: time="2025-03-25T01:10:36.897159026Z" level=info msg="connecting to shim e076b236045416929c6bcf505d424138578723b21d76d5843c1b53c54fc8a1c0" address="unix:///run/containerd/s/c5ef8db4d0067c386fd8db9ec480ccb0b29d777f05cb42883fc3bff405c5461a" protocol=ttrpc version=3 Mar 25 01:10:36.897838 containerd[1474]: time="2025-03-25T01:10:36.897808746Z" level=info msg="connecting to shim b202d56bf0a48c471938bbf5f73ee59e7aa18b2f077aa90c3a5d6381494eaa90" address="unix:///run/containerd/s/af3ae1bbac52369e7b479006b9fdcef8aae8f2c811479b9c222440221e1045c6" protocol=ttrpc version=3 Mar 25 01:10:36.900637 containerd[1474]: time="2025-03-25T01:10:36.900606066Z" level=info msg="Container eecb9f800c6ab632f69581e87eee1dd528b1e154b4ac6df5ac9213879f2826ce: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:10:36.907021 containerd[1474]: time="2025-03-25T01:10:36.906980186Z" level=info msg="CreateContainer within sandbox \"806e8413dc89d858f3f2454986735370ab4559c764ebaa0b9694c8e1eefa00b6\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"eecb9f800c6ab632f69581e87eee1dd528b1e154b4ac6df5ac9213879f2826ce\"" Mar 25 01:10:36.907418 containerd[1474]: time="2025-03-25T01:10:36.907391706Z" level=info msg="StartContainer for \"eecb9f800c6ab632f69581e87eee1dd528b1e154b4ac6df5ac9213879f2826ce\"" Mar 25 01:10:36.908870 containerd[1474]: time="2025-03-25T01:10:36.908836586Z" level=info msg="connecting to shim eecb9f800c6ab632f69581e87eee1dd528b1e154b4ac6df5ac9213879f2826ce" address="unix:///run/containerd/s/17975a374072690a92a8eb5d9b505909828b843f241ae59db2cd66ba25a545cc" protocol=ttrpc version=3 Mar 25 01:10:36.917766 systemd[1]: Started cri-containerd-e076b236045416929c6bcf505d424138578723b21d76d5843c1b53c54fc8a1c0.scope - libcontainer container e076b236045416929c6bcf505d424138578723b21d76d5843c1b53c54fc8a1c0. Mar 25 01:10:36.920529 systemd[1]: Started cri-containerd-b202d56bf0a48c471938bbf5f73ee59e7aa18b2f077aa90c3a5d6381494eaa90.scope - libcontainer container b202d56bf0a48c471938bbf5f73ee59e7aa18b2f077aa90c3a5d6381494eaa90. Mar 25 01:10:36.933585 systemd[1]: Started cri-containerd-eecb9f800c6ab632f69581e87eee1dd528b1e154b4ac6df5ac9213879f2826ce.scope - libcontainer container eecb9f800c6ab632f69581e87eee1dd528b1e154b4ac6df5ac9213879f2826ce. 
Mar 25 01:10:36.994524 containerd[1474]: time="2025-03-25T01:10:36.991068466Z" level=info msg="StartContainer for \"b202d56bf0a48c471938bbf5f73ee59e7aa18b2f077aa90c3a5d6381494eaa90\" returns successfully" Mar 25 01:10:36.994524 containerd[1474]: time="2025-03-25T01:10:36.991239466Z" level=info msg="StartContainer for \"e076b236045416929c6bcf505d424138578723b21d76d5843c1b53c54fc8a1c0\" returns successfully" Mar 25 01:10:37.027815 containerd[1474]: time="2025-03-25T01:10:37.024009186Z" level=info msg="StartContainer for \"eecb9f800c6ab632f69581e87eee1dd528b1e154b4ac6df5ac9213879f2826ce\" returns successfully" Mar 25 01:10:37.088948 kubelet[2236]: W0325 01:10:37.087066 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.25:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:37.088948 kubelet[2236]: E0325 01:10:37.087129 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.25:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:37.102909 kubelet[2236]: W0325 01:10:37.102794 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.25:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:37.102909 kubelet[2236]: E0325 01:10:37.102844 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.25:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:37.176987 kubelet[2236]: W0325 01:10:37.176930 2236 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.25:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.25:6443: connect: connection refused Mar 25 01:10:37.177096 kubelet[2236]: E0325 01:10:37.176998 2236 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.25:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.25:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:10:37.538754 kubelet[2236]: I0325 01:10:37.538699 2236 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 25 01:10:37.942979 kubelet[2236]: E0325 01:10:37.942806 2236 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 25 01:10:37.946353 kubelet[2236]: E0325 01:10:37.946130 2236 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 25 01:10:37.950875 kubelet[2236]: E0325 01:10:37.950718 2236 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Mar 25 01:10:38.598274 kubelet[2236]: E0325 01:10:38.598226 
2236 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 25 01:10:38.677374 kubelet[2236]: I0325 01:10:38.677332 2236 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Mar 25 01:10:38.707011 kubelet[2236]: I0325 01:10:38.706778 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:38.721660 kubelet[2236]: E0325 01:10:38.721479 2236 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:38.721660 kubelet[2236]: I0325 01:10:38.721507 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:38.728922 kubelet[2236]: E0325 01:10:38.728681 2236 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:38.728922 kubelet[2236]: I0325 01:10:38.728704 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:38.730893 kubelet[2236]: E0325 01:10:38.730864 2236 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:38.895155 kubelet[2236]: I0325 01:10:38.895057 2236 apiserver.go:52] "Watching apiserver" Mar 25 01:10:38.906776 kubelet[2236]: I0325 01:10:38.906742 2236 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:10:38.951358 kubelet[2236]: I0325 01:10:38.951317 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:38.951485 kubelet[2236]: I0325 01:10:38.951386 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:38.953088 kubelet[2236]: I0325 01:10:38.951652 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:38.953440 kubelet[2236]: E0325 01:10:38.953196 2236 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:38.953440 kubelet[2236]: E0325 01:10:38.953376 2236 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:38.953440 kubelet[2236]: E0325 01:10:38.953382 2236 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:39.953806 kubelet[2236]: I0325 01:10:39.953775 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:39.954143 kubelet[2236]: I0325 01:10:39.954126 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:39.955395 
kubelet[2236]: I0325 01:10:39.954232 2236 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:40.485417 systemd[1]: Reload requested from client PID 2505 ('systemctl') (unit session-7.scope)... Mar 25 01:10:40.485445 systemd[1]: Reloading... Mar 25 01:10:40.562520 zram_generator::config[2552]: No configuration found. Mar 25 01:10:40.637861 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:10:40.719375 systemd[1]: Reloading finished in 233 ms. Mar 25 01:10:40.745646 kubelet[2236]: I0325 01:10:40.745515 2236 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:10:40.745659 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:10:40.758827 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:10:40.759021 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:40.759065 systemd[1]: kubelet.service: Consumed 2.295s CPU time, 123.8M memory peak. Mar 25 01:10:40.761241 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:10:40.888463 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:10:40.891823 (kubelet)[2591]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:10:40.930685 kubelet[2591]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:10:40.930685 kubelet[2591]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Mar 25 01:10:40.930685 kubelet[2591]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:10:40.930999 kubelet[2591]: I0325 01:10:40.930743 2591 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:10:40.936226 kubelet[2591]: I0325 01:10:40.936179 2591 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Mar 25 01:10:40.936226 kubelet[2591]: I0325 01:10:40.936210 2591 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:10:40.936456 kubelet[2591]: I0325 01:10:40.936423 2591 server.go:954] "Client rotation is on, will bootstrap in background" Mar 25 01:10:40.937627 kubelet[2591]: I0325 01:10:40.937598 2591 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Mar 25 01:10:40.942223 kubelet[2591]: I0325 01:10:40.942192 2591 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:10:40.946167 kubelet[2591]: I0325 01:10:40.946147 2591 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:10:40.948632 kubelet[2591]: I0325 01:10:40.948598 2591 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 25 01:10:40.948864 kubelet[2591]: I0325 01:10:40.948832 2591 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:10:40.949048 kubelet[2591]: I0325 01:10:40.948859 2591 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:10:40.949122 kubelet[2591]: I0325 01:10:40.949053 2591 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:10:40.949122 kubelet[2591]: I0325 01:10:40.949061 2591 container_manager_linux.go:304] "Creating device plugin manager" Mar 25 01:10:40.949122 kubelet[2591]: I0325 01:10:40.949103 2591 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:10:40.949263 kubelet[2591]: I0325 01:10:40.949243 2591 kubelet.go:446] "Attempting to sync node with API server" Mar 25 01:10:40.949263 kubelet[2591]: I0325 01:10:40.949261 2591 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:10:40.952492 kubelet[2591]: I0325 01:10:40.952463 2591 kubelet.go:352] "Adding apiserver pod source" Mar 25 01:10:40.952660 kubelet[2591]: I0325 01:10:40.952500 2591 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:10:40.954189 kubelet[2591]: I0325 01:10:40.954172 2591 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:10:40.955066 kubelet[2591]: I0325 01:10:40.954792 2591 kubelet.go:890] "Not starting ClusterTrustBundle informer 
because we are in static kubelet mode" Mar 25 01:10:40.956213 kubelet[2591]: I0325 01:10:40.955744 2591 watchdog_linux.go:99] "Systemd watchdog is not enabled" Mar 25 01:10:40.956213 kubelet[2591]: I0325 01:10:40.955783 2591 server.go:1287] "Started kubelet" Mar 25 01:10:40.957024 kubelet[2591]: I0325 01:10:40.956804 2591 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:10:40.957225 kubelet[2591]: I0325 01:10:40.957067 2591 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:10:40.957225 kubelet[2591]: I0325 01:10:40.957132 2591 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:10:40.957727 kubelet[2591]: I0325 01:10:40.957518 2591 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:10:40.958183 kubelet[2591]: I0325 01:10:40.958163 2591 server.go:490] "Adding debug handlers to kubelet server" Mar 25 01:10:40.961448 kubelet[2591]: I0325 01:10:40.959641 2591 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:10:40.961448 kubelet[2591]: I0325 01:10:40.960064 2591 volume_manager.go:297] "Starting Kubelet Volume Manager" Mar 25 01:10:40.961448 kubelet[2591]: E0325 01:10:40.960164 2591 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:10:40.961448 kubelet[2591]: I0325 01:10:40.960408 2591 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:10:40.961448 kubelet[2591]: I0325 01:10:40.961096 2591 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:10:40.968362 kubelet[2591]: I0325 01:10:40.967642 2591 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:10:40.968362 kubelet[2591]: I0325 01:10:40.967741 2591 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:10:40.973543 kubelet[2591]: I0325 01:10:40.970927 2591 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:10:40.973543 kubelet[2591]: I0325 01:10:40.972149 2591 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:10:40.973543 kubelet[2591]: I0325 01:10:40.972167 2591 status_manager.go:227] "Starting to sync pod status with apiserver" Mar 25 01:10:40.973543 kubelet[2591]: I0325 01:10:40.972186 2591 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Mar 25 01:10:40.973543 kubelet[2591]: I0325 01:10:40.972194 2591 kubelet.go:2388] "Starting kubelet main sync loop" Mar 25 01:10:40.973543 kubelet[2591]: E0325 01:10:40.972228 2591 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:10:40.979780 kubelet[2591]: I0325 01:10:40.979757 2591 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:10:41.010756 kubelet[2591]: I0325 01:10:41.010678 2591 cpu_manager.go:221] "Starting CPU manager" policy="none" Mar 25 01:10:41.010756 kubelet[2591]: I0325 01:10:41.010700 2591 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Mar 25 01:10:41.010756 kubelet[2591]: I0325 01:10:41.010719 2591 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:10:41.010872 kubelet[2591]: I0325 01:10:41.010853 2591 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:10:41.010892 kubelet[2591]: I0325 01:10:41.010864 2591 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:10:41.010892 kubelet[2591]: I0325 01:10:41.010880 2591 policy_none.go:49] "None policy: Start" Mar 25 01:10:41.010892 kubelet[2591]: I0325 01:10:41.010889 2591 memory_manager.go:186] "Starting memorymanager" policy="None" Mar 25 01:10:41.010944 kubelet[2591]: I0325 01:10:41.010898 2591 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:10:41.011000 kubelet[2591]: I0325 01:10:41.010983 2591 state_mem.go:75] "Updated machine memory state" Mar 25 01:10:41.015748 kubelet[2591]: I0325 01:10:41.015717 2591 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:10:41.015893 kubelet[2591]: I0325 01:10:41.015867 2591 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:10:41.015926 kubelet[2591]: I0325 01:10:41.015886 2591 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:10:41.016407 kubelet[2591]: I0325 01:10:41.016382 2591 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:10:41.017232 kubelet[2591]: E0325 01:10:41.017209 2591 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Mar 25 01:10:41.073397 kubelet[2591]: I0325 01:10:41.073180 2591 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.073397 kubelet[2591]: I0325 01:10:41.073225 2591 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:41.073397 kubelet[2591]: I0325 01:10:41.073349 2591 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:41.078033 kubelet[2591]: E0325 01:10:41.077994 2591 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:41.078185 kubelet[2591]: E0325 01:10:41.078012 2591 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:41.078218 kubelet[2591]: E0325 01:10:41.078038 2591 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.118329 kubelet[2591]: I0325 01:10:41.118305 2591 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Mar 25 01:10:41.125494 kubelet[2591]: I0325 01:10:41.124993 2591 kubelet_node_status.go:125] "Node was previously registered" node="localhost" Mar 25 01:10:41.125494 kubelet[2591]: I0325 01:10:41.125063 2591 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Mar 25 01:10:41.161664 kubelet[2591]: I0325 01:10:41.161636 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.161664 kubelet[2591]: I0325 01:10:41.161669 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.161786 kubelet[2591]: I0325 01:10:41.161686 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.161786 kubelet[2591]: I0325 01:10:41.161704 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5cc26063b48899768ec65ee5e08dca1b-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"5cc26063b48899768ec65ee5e08dca1b\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:41.161786 kubelet[2591]: I0325 01:10:41.161719 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: 
\"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.161786 kubelet[2591]: I0325 01:10:41.161738 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cbbb394ff48414687df77e1bc213eeb5-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"cbbb394ff48414687df77e1bc213eeb5\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.161786 kubelet[2591]: I0325 01:10:41.161753 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3700e556aa2777679a324159272023f1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"3700e556aa2777679a324159272023f1\") " pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:41.161893 kubelet[2591]: I0325 01:10:41.161776 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5cc26063b48899768ec65ee5e08dca1b-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"5cc26063b48899768ec65ee5e08dca1b\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:41.161893 kubelet[2591]: I0325 01:10:41.161791 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5cc26063b48899768ec65ee5e08dca1b-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"5cc26063b48899768ec65ee5e08dca1b\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:41.953517 kubelet[2591]: I0325 01:10:41.953478 2591 apiserver.go:52] "Watching apiserver" Mar 25 01:10:41.961591 kubelet[2591]: I0325 01:10:41.961537 2591 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:10:41.998020 kubelet[2591]: I0325 01:10:41.997589 2591 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:41.998020 kubelet[2591]: I0325 01:10:41.997900 2591 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:41.998248 kubelet[2591]: I0325 01:10:41.998234 2591 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:42.003907 kubelet[2591]: E0325 01:10:42.003864 2591 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:10:42.011700 kubelet[2591]: E0325 01:10:42.011668 2591 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Mar 25 01:10:42.012054 kubelet[2591]: E0325 01:10:42.011829 2591 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 25 01:10:42.026747 kubelet[2591]: I0325 01:10:42.026693 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.026661487 podStartE2EDuration="3.026661487s" podCreationTimestamp="2025-03-25 01:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:10:42.026392007 +0000 UTC 
m=+1.131369342" watchObservedRunningTime="2025-03-25 01:10:42.026661487 +0000 UTC m=+1.131638822" Mar 25 01:10:42.091394 kubelet[2591]: I0325 01:10:42.091326 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=3.09130787 podStartE2EDuration="3.09130787s" podCreationTimestamp="2025-03-25 01:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:10:42.052061576 +0000 UTC m=+1.157038911" watchObservedRunningTime="2025-03-25 01:10:42.09130787 +0000 UTC m=+1.196285205" Mar 25 01:10:42.091690 kubelet[2591]: I0325 01:10:42.091652 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=3.09164187 podStartE2EDuration="3.09164187s" podCreationTimestamp="2025-03-25 01:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:10:42.09122379 +0000 UTC m=+1.196201125" watchObservedRunningTime="2025-03-25 01:10:42.09164187 +0000 UTC m=+1.196619205" Mar 25 01:10:45.514473 sudo[1676]: pam_unix(sudo:session): session closed for user root Mar 25 01:10:45.516117 sshd[1675]: Connection closed by 10.0.0.1 port 41298 Mar 25 01:10:45.516703 sshd-session[1672]: pam_unix(sshd:session): session closed for user core Mar 25 01:10:45.519560 systemd[1]: sshd@6-10.0.0.25:22-10.0.0.1:41298.service: Deactivated successfully. Mar 25 01:10:45.521722 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:10:45.521943 systemd[1]: session-7.scope: Consumed 7.353s CPU time, 225.1M memory peak. Mar 25 01:10:45.523867 systemd-logind[1458]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:10:45.524961 systemd-logind[1458]: Removed session 7. Mar 25 01:10:46.967573 kubelet[2591]: I0325 01:10:46.967528 2591 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:10:46.968645 containerd[1474]: time="2025-03-25T01:10:46.967926370Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 25 01:10:46.968999 kubelet[2591]: I0325 01:10:46.968138 2591 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:10:47.795659 systemd[1]: Created slice kubepods-besteffort-pod0d0d9fce_3764_4a86_906e_de71428a39cc.slice - libcontainer container kubepods-besteffort-pod0d0d9fce_3764_4a86_906e_de71428a39cc.slice. 
Mar 25 01:10:47.810283 kubelet[2591]: I0325 01:10:47.810255 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0d0d9fce-3764-4a86-906e-de71428a39cc-xtables-lock\") pod \"kube-proxy-z5zpr\" (UID: \"0d0d9fce-3764-4a86-906e-de71428a39cc\") " pod="kube-system/kube-proxy-z5zpr" Mar 25 01:10:47.810378 kubelet[2591]: I0325 01:10:47.810290 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d0d9fce-3764-4a86-906e-de71428a39cc-lib-modules\") pod \"kube-proxy-z5zpr\" (UID: \"0d0d9fce-3764-4a86-906e-de71428a39cc\") " pod="kube-system/kube-proxy-z5zpr" Mar 25 01:10:47.810378 kubelet[2591]: I0325 01:10:47.810308 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/0d0d9fce-3764-4a86-906e-de71428a39cc-kube-proxy\") pod \"kube-proxy-z5zpr\" (UID: \"0d0d9fce-3764-4a86-906e-de71428a39cc\") " pod="kube-system/kube-proxy-z5zpr" Mar 25 01:10:47.810378 kubelet[2591]: I0325 01:10:47.810324 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4g7c\" (UniqueName: \"kubernetes.io/projected/0d0d9fce-3764-4a86-906e-de71428a39cc-kube-api-access-g4g7c\") pod \"kube-proxy-z5zpr\" (UID: \"0d0d9fce-3764-4a86-906e-de71428a39cc\") " pod="kube-system/kube-proxy-z5zpr" Mar 25 01:10:48.010158 systemd[1]: Created slice kubepods-besteffort-pod9c6e3daf_fe15_4138_a48a_240f36b2d53f.slice - libcontainer container kubepods-besteffort-pod9c6e3daf_fe15_4138_a48a_240f36b2d53f.slice. Mar 25 01:10:48.011639 kubelet[2591]: I0325 01:10:48.011600 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8bm9\" (UniqueName: \"kubernetes.io/projected/9c6e3daf-fe15-4138-a48a-240f36b2d53f-kube-api-access-w8bm9\") pod \"tigera-operator-ccfc44587-xf5sk\" (UID: \"9c6e3daf-fe15-4138-a48a-240f36b2d53f\") " pod="tigera-operator/tigera-operator-ccfc44587-xf5sk" Mar 25 01:10:48.011979 kubelet[2591]: I0325 01:10:48.011648 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9c6e3daf-fe15-4138-a48a-240f36b2d53f-var-lib-calico\") pod \"tigera-operator-ccfc44587-xf5sk\" (UID: \"9c6e3daf-fe15-4138-a48a-240f36b2d53f\") " pod="tigera-operator/tigera-operator-ccfc44587-xf5sk" Mar 25 01:10:48.107038 containerd[1474]: time="2025-03-25T01:10:48.106971497Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z5zpr,Uid:0d0d9fce-3764-4a86-906e-de71428a39cc,Namespace:kube-system,Attempt:0,}" Mar 25 01:10:48.125833 containerd[1474]: time="2025-03-25T01:10:48.125336342Z" level=info msg="connecting to shim 529839498baf97c5b758588aec58f889e17e8fdb579955bbdb5b0ffce491abfc" address="unix:///run/containerd/s/27d1a228db6ded3e837c4a791b5308de94545b5ecd8b2ce942c9ab4a8e5fcfd9" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:10:48.147630 systemd[1]: Started cri-containerd-529839498baf97c5b758588aec58f889e17e8fdb579955bbdb5b0ffce491abfc.scope - libcontainer container 529839498baf97c5b758588aec58f889e17e8fdb579955bbdb5b0ffce491abfc. 
Mar 25 01:10:48.169679 containerd[1474]: time="2025-03-25T01:10:48.169639512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z5zpr,Uid:0d0d9fce-3764-4a86-906e-de71428a39cc,Namespace:kube-system,Attempt:0,} returns sandbox id \"529839498baf97c5b758588aec58f889e17e8fdb579955bbdb5b0ffce491abfc\"" Mar 25 01:10:48.172023 containerd[1474]: time="2025-03-25T01:10:48.171991273Z" level=info msg="CreateContainer within sandbox \"529839498baf97c5b758588aec58f889e17e8fdb579955bbdb5b0ffce491abfc\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:10:48.187453 containerd[1474]: time="2025-03-25T01:10:48.187123836Z" level=info msg="Container 22f08e28109b54e57ce1742c6ee130125417548331b6b1bc413a05e51cc3fba2: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:10:48.197830 containerd[1474]: time="2025-03-25T01:10:48.197783879Z" level=info msg="CreateContainer within sandbox \"529839498baf97c5b758588aec58f889e17e8fdb579955bbdb5b0ffce491abfc\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"22f08e28109b54e57ce1742c6ee130125417548331b6b1bc413a05e51cc3fba2\"" Mar 25 01:10:48.199474 containerd[1474]: time="2025-03-25T01:10:48.198530039Z" level=info msg="StartContainer for \"22f08e28109b54e57ce1742c6ee130125417548331b6b1bc413a05e51cc3fba2\"" Mar 25 01:10:48.200049 containerd[1474]: time="2025-03-25T01:10:48.200023119Z" level=info msg="connecting to shim 22f08e28109b54e57ce1742c6ee130125417548331b6b1bc413a05e51cc3fba2" address="unix:///run/containerd/s/27d1a228db6ded3e837c4a791b5308de94545b5ecd8b2ce942c9ab4a8e5fcfd9" protocol=ttrpc version=3 Mar 25 01:10:48.219572 systemd[1]: Started cri-containerd-22f08e28109b54e57ce1742c6ee130125417548331b6b1bc413a05e51cc3fba2.scope - libcontainer container 22f08e28109b54e57ce1742c6ee130125417548331b6b1bc413a05e51cc3fba2. Mar 25 01:10:48.260282 containerd[1474]: time="2025-03-25T01:10:48.260243014Z" level=info msg="StartContainer for \"22f08e28109b54e57ce1742c6ee130125417548331b6b1bc413a05e51cc3fba2\" returns successfully" Mar 25 01:10:48.313553 containerd[1474]: time="2025-03-25T01:10:48.313519386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-xf5sk,Uid:9c6e3daf-fe15-4138-a48a-240f36b2d53f,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:10:48.330636 containerd[1474]: time="2025-03-25T01:10:48.330117950Z" level=info msg="connecting to shim 319f42b9ebccf27f083176fa5558d9c1fe016341507796b13df9a1115154007a" address="unix:///run/containerd/s/6ad08e2f1ecbd83c2ec44071e2df12f14c35a71f2cbcdfebd8ab9a4fd550a5dc" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:10:48.358604 systemd[1]: Started cri-containerd-319f42b9ebccf27f083176fa5558d9c1fe016341507796b13df9a1115154007a.scope - libcontainer container 319f42b9ebccf27f083176fa5558d9c1fe016341507796b13df9a1115154007a. Mar 25 01:10:48.391916 containerd[1474]: time="2025-03-25T01:10:48.391869725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-ccfc44587-xf5sk,Uid:9c6e3daf-fe15-4138-a48a-240f36b2d53f,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"319f42b9ebccf27f083176fa5558d9c1fe016341507796b13df9a1115154007a\"" Mar 25 01:10:48.393566 containerd[1474]: time="2025-03-25T01:10:48.393536325Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:10:50.028194 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3744564147.mount: Deactivated successfully. 
Mar 25 01:10:50.263382 containerd[1474]: time="2025-03-25T01:10:50.263328026Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:50.264000 containerd[1474]: time="2025-03-25T01:10:50.263949906Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 25 01:10:50.265078 containerd[1474]: time="2025-03-25T01:10:50.265036946Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:50.267169 containerd[1474]: time="2025-03-25T01:10:50.267137867Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:50.268101 containerd[1474]: time="2025-03-25T01:10:50.267846827Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 1.874277702s" Mar 25 01:10:50.268101 containerd[1474]: time="2025-03-25T01:10:50.267875107Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 25 01:10:50.271781 containerd[1474]: time="2025-03-25T01:10:50.271750668Z" level=info msg="CreateContainer within sandbox \"319f42b9ebccf27f083176fa5558d9c1fe016341507796b13df9a1115154007a\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:10:50.276960 containerd[1474]: time="2025-03-25T01:10:50.276919989Z" level=info msg="Container fef82ad425a5da03ee5dbe47aac1c95c469302c2ce31e379edd18cdb3e751c7f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:10:50.280363 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount212643651.mount: Deactivated successfully. Mar 25 01:10:50.282974 containerd[1474]: time="2025-03-25T01:10:50.282929230Z" level=info msg="CreateContainer within sandbox \"319f42b9ebccf27f083176fa5558d9c1fe016341507796b13df9a1115154007a\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"fef82ad425a5da03ee5dbe47aac1c95c469302c2ce31e379edd18cdb3e751c7f\"" Mar 25 01:10:50.283512 containerd[1474]: time="2025-03-25T01:10:50.283424470Z" level=info msg="StartContainer for \"fef82ad425a5da03ee5dbe47aac1c95c469302c2ce31e379edd18cdb3e751c7f\"" Mar 25 01:10:50.284217 containerd[1474]: time="2025-03-25T01:10:50.284188350Z" level=info msg="connecting to shim fef82ad425a5da03ee5dbe47aac1c95c469302c2ce31e379edd18cdb3e751c7f" address="unix:///run/containerd/s/6ad08e2f1ecbd83c2ec44071e2df12f14c35a71f2cbcdfebd8ab9a4fd550a5dc" protocol=ttrpc version=3 Mar 25 01:10:50.325700 systemd[1]: Started cri-containerd-fef82ad425a5da03ee5dbe47aac1c95c469302c2ce31e379edd18cdb3e751c7f.scope - libcontainer container fef82ad425a5da03ee5dbe47aac1c95c469302c2ce31e379edd18cdb3e751c7f. 
Mar 25 01:10:50.351252 containerd[1474]: time="2025-03-25T01:10:50.351196924Z" level=info msg="StartContainer for \"fef82ad425a5da03ee5dbe47aac1c95c469302c2ce31e379edd18cdb3e751c7f\" returns successfully" Mar 25 01:10:51.042999 kubelet[2591]: I0325 01:10:51.042644 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-ccfc44587-xf5sk" podStartSLOduration=2.165727446 podStartE2EDuration="4.042616308s" podCreationTimestamp="2025-03-25 01:10:47 +0000 UTC" firstStartedPulling="2025-03-25 01:10:48.392955085 +0000 UTC m=+7.497932420" lastFinishedPulling="2025-03-25 01:10:50.269843947 +0000 UTC m=+9.374821282" observedRunningTime="2025-03-25 01:10:51.042304028 +0000 UTC m=+10.147281323" watchObservedRunningTime="2025-03-25 01:10:51.042616308 +0000 UTC m=+10.147593643" Mar 25 01:10:51.042999 kubelet[2591]: I0325 01:10:51.042890 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-z5zpr" podStartSLOduration=4.042883388 podStartE2EDuration="4.042883388s" podCreationTimestamp="2025-03-25 01:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:10:49.021831754 +0000 UTC m=+8.126809129" watchObservedRunningTime="2025-03-25 01:10:51.042883388 +0000 UTC m=+10.147860723" Mar 25 01:10:53.683685 update_engine[1461]: I20250325 01:10:53.683594 1461 update_attempter.cc:509] Updating boot flags... Mar 25 01:10:53.722492 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2986) Mar 25 01:10:53.773068 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2987) Mar 25 01:10:54.171937 systemd[1]: Created slice kubepods-besteffort-podf5097cb3_966b_4517_958c_51ba571954fe.slice - libcontainer container kubepods-besteffort-podf5097cb3_966b_4517_958c_51ba571954fe.slice. Mar 25 01:10:54.209135 systemd[1]: Created slice kubepods-besteffort-pod97d5d189_06a9_453c_900b_ee2963c54aec.slice - libcontainer container kubepods-besteffort-pod97d5d189_06a9_453c_900b_ee2963c54aec.slice. 
Mar 25 01:10:54.251477 kubelet[2591]: I0325 01:10:54.251150 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/97d5d189-06a9-453c-900b-ee2963c54aec-tigera-ca-bundle\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.251477 kubelet[2591]: I0325 01:10:54.251205 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-lib-modules\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.251477 kubelet[2591]: I0325 01:10:54.251224 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-xtables-lock\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.251477 kubelet[2591]: I0325 01:10:54.251241 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9trcq\" (UniqueName: \"kubernetes.io/projected/97d5d189-06a9-453c-900b-ee2963c54aec-kube-api-access-9trcq\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.251477 kubelet[2591]: I0325 01:10:54.251259 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f5097cb3-966b-4517-958c-51ba571954fe-typha-certs\") pod \"calico-typha-fd6b95b6d-22v7x\" (UID: \"f5097cb3-966b-4517-958c-51ba571954fe\") " pod="calico-system/calico-typha-fd6b95b6d-22v7x" Mar 25 01:10:54.251898 kubelet[2591]: I0325 01:10:54.251274 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/97d5d189-06a9-453c-900b-ee2963c54aec-node-certs\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.251898 kubelet[2591]: I0325 01:10:54.251289 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llplr\" (UniqueName: \"kubernetes.io/projected/f5097cb3-966b-4517-958c-51ba571954fe-kube-api-access-llplr\") pod \"calico-typha-fd6b95b6d-22v7x\" (UID: \"f5097cb3-966b-4517-958c-51ba571954fe\") " pod="calico-system/calico-typha-fd6b95b6d-22v7x" Mar 25 01:10:54.251898 kubelet[2591]: I0325 01:10:54.251304 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-var-lib-calico\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.251898 kubelet[2591]: I0325 01:10:54.251319 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-cni-bin-dir\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.251898 
kubelet[2591]: I0325 01:10:54.251333 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-cni-net-dir\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.252038 kubelet[2591]: I0325 01:10:54.251367 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-cni-log-dir\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.252038 kubelet[2591]: I0325 01:10:54.251405 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-policysync\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.252038 kubelet[2591]: I0325 01:10:54.251422 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5097cb3-966b-4517-958c-51ba571954fe-tigera-ca-bundle\") pod \"calico-typha-fd6b95b6d-22v7x\" (UID: \"f5097cb3-966b-4517-958c-51ba571954fe\") " pod="calico-system/calico-typha-fd6b95b6d-22v7x" Mar 25 01:10:54.252038 kubelet[2591]: I0325 01:10:54.251463 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-var-run-calico\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.252038 kubelet[2591]: I0325 01:10:54.251493 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/97d5d189-06a9-453c-900b-ee2963c54aec-flexvol-driver-host\") pod \"calico-node-gnbvr\" (UID: \"97d5d189-06a9-453c-900b-ee2963c54aec\") " pod="calico-system/calico-node-gnbvr" Mar 25 01:10:54.318497 kubelet[2591]: E0325 01:10:54.317861 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rm4dd" podUID="134e9f30-05b3-4c2c-8ea5-b587606a6022" Mar 25 01:10:54.352442 kubelet[2591]: I0325 01:10:54.352395 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/134e9f30-05b3-4c2c-8ea5-b587606a6022-varrun\") pod \"csi-node-driver-rm4dd\" (UID: \"134e9f30-05b3-4c2c-8ea5-b587606a6022\") " pod="calico-system/csi-node-driver-rm4dd" Mar 25 01:10:54.352620 kubelet[2591]: I0325 01:10:54.352559 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/134e9f30-05b3-4c2c-8ea5-b587606a6022-kubelet-dir\") pod \"csi-node-driver-rm4dd\" (UID: \"134e9f30-05b3-4c2c-8ea5-b587606a6022\") " pod="calico-system/csi-node-driver-rm4dd" Mar 25 01:10:54.352709 kubelet[2591]: I0325 01:10:54.352670 2591 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/134e9f30-05b3-4c2c-8ea5-b587606a6022-registration-dir\") pod \"csi-node-driver-rm4dd\" (UID: \"134e9f30-05b3-4c2c-8ea5-b587606a6022\") " pod="calico-system/csi-node-driver-rm4dd" Mar 25 01:10:54.352743 kubelet[2591]: I0325 01:10:54.352734 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s29x5\" (UniqueName: \"kubernetes.io/projected/134e9f30-05b3-4c2c-8ea5-b587606a6022-kube-api-access-s29x5\") pod \"csi-node-driver-rm4dd\" (UID: \"134e9f30-05b3-4c2c-8ea5-b587606a6022\") " pod="calico-system/csi-node-driver-rm4dd" Mar 25 01:10:54.352788 kubelet[2591]: I0325 01:10:54.352765 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/134e9f30-05b3-4c2c-8ea5-b587606a6022-socket-dir\") pod \"csi-node-driver-rm4dd\" (UID: \"134e9f30-05b3-4c2c-8ea5-b587606a6022\") " pod="calico-system/csi-node-driver-rm4dd" Mar 25 01:10:54.366977 kubelet[2591]: E0325 01:10:54.366846 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.366977 kubelet[2591]: W0325 01:10:54.366869 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.366977 kubelet[2591]: E0325 01:10:54.366897 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.368387 kubelet[2591]: E0325 01:10:54.368372 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.368466 kubelet[2591]: W0325 01:10:54.368453 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.368620 kubelet[2591]: E0325 01:10:54.368552 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.368736 kubelet[2591]: E0325 01:10:54.368722 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.368794 kubelet[2591]: W0325 01:10:54.368782 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.368920 kubelet[2591]: E0325 01:10:54.368856 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.369156 kubelet[2591]: E0325 01:10:54.369045 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.369156 kubelet[2591]: W0325 01:10:54.369058 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.369235 kubelet[2591]: E0325 01:10:54.369180 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.369378 kubelet[2591]: E0325 01:10:54.369320 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.369459 kubelet[2591]: W0325 01:10:54.369424 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.369659 kubelet[2591]: E0325 01:10:54.369531 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.369884 kubelet[2591]: E0325 01:10:54.369868 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.369958 kubelet[2591]: W0325 01:10:54.369945 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.370044 kubelet[2591]: E0325 01:10:54.370024 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.370293 kubelet[2591]: E0325 01:10:54.370212 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.370293 kubelet[2591]: W0325 01:10:54.370223 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.370293 kubelet[2591]: E0325 01:10:54.370253 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.370450 kubelet[2591]: E0325 01:10:54.370422 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.370501 kubelet[2591]: W0325 01:10:54.370490 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.370600 kubelet[2591]: E0325 01:10:54.370574 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.370886 kubelet[2591]: E0325 01:10:54.370871 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.371089 kubelet[2591]: W0325 01:10:54.370975 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.371089 kubelet[2591]: E0325 01:10:54.371013 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.371673 kubelet[2591]: E0325 01:10:54.371656 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.371737 kubelet[2591]: W0325 01:10:54.371725 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.371891 kubelet[2591]: E0325 01:10:54.371858 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.372114 kubelet[2591]: E0325 01:10:54.371974 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.372114 kubelet[2591]: W0325 01:10:54.371984 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.372114 kubelet[2591]: E0325 01:10:54.372036 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.372280 kubelet[2591]: E0325 01:10:54.372267 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.372334 kubelet[2591]: W0325 01:10:54.372323 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.372446 kubelet[2591]: E0325 01:10:54.372405 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.372778 kubelet[2591]: E0325 01:10:54.372762 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.372875 kubelet[2591]: W0325 01:10:54.372862 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.372986 kubelet[2591]: E0325 01:10:54.372935 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.373286 kubelet[2591]: E0325 01:10:54.373268 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.373477 kubelet[2591]: W0325 01:10:54.373342 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.373477 kubelet[2591]: E0325 01:10:54.373377 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.373610 kubelet[2591]: E0325 01:10:54.373597 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.373670 kubelet[2591]: W0325 01:10:54.373659 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.373738 kubelet[2591]: E0325 01:10:54.373725 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.373945 kubelet[2591]: E0325 01:10:54.373926 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.373945 kubelet[2591]: W0325 01:10:54.373940 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.374006 kubelet[2591]: E0325 01:10:54.373956 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.374172 kubelet[2591]: E0325 01:10:54.374159 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.374172 kubelet[2591]: W0325 01:10:54.374169 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.374226 kubelet[2591]: E0325 01:10:54.374181 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.375063 kubelet[2591]: E0325 01:10:54.374502 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.375063 kubelet[2591]: W0325 01:10:54.374520 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.375063 kubelet[2591]: E0325 01:10:54.374804 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.375063 kubelet[2591]: W0325 01:10:54.374819 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.375063 kubelet[2591]: E0325 01:10:54.374837 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.375063 kubelet[2591]: E0325 01:10:54.374982 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.375063 kubelet[2591]: W0325 01:10:54.374991 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.375063 kubelet[2591]: E0325 01:10:54.375040 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.375259 kubelet[2591]: E0325 01:10:54.375123 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.375259 kubelet[2591]: W0325 01:10:54.375129 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.375298 kubelet[2591]: E0325 01:10:54.375275 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.375298 kubelet[2591]: W0325 01:10:54.375283 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.375408 kubelet[2591]: E0325 01:10:54.375355 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.375408 kubelet[2591]: E0325 01:10:54.375381 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.375408 kubelet[2591]: E0325 01:10:54.375390 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.375572 kubelet[2591]: E0325 01:10:54.375498 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.375572 kubelet[2591]: W0325 01:10:54.375508 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.375572 kubelet[2591]: E0325 01:10:54.375539 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.375753 kubelet[2591]: E0325 01:10:54.375727 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.375972 kubelet[2591]: W0325 01:10:54.375783 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.375972 kubelet[2591]: E0325 01:10:54.375803 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.376346 kubelet[2591]: E0325 01:10:54.376192 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.376346 kubelet[2591]: W0325 01:10:54.376208 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.376346 kubelet[2591]: E0325 01:10:54.376228 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.376450 kubelet[2591]: E0325 01:10:54.376416 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.376488 kubelet[2591]: W0325 01:10:54.376464 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.376488 kubelet[2591]: E0325 01:10:54.376478 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.376733 kubelet[2591]: E0325 01:10:54.376704 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.376796 kubelet[2591]: W0325 01:10:54.376774 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.376978 kubelet[2591]: E0325 01:10:54.376862 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.377242 kubelet[2591]: E0325 01:10:54.377217 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.377242 kubelet[2591]: W0325 01:10:54.377232 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.377314 kubelet[2591]: E0325 01:10:54.377304 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.377528 kubelet[2591]: E0325 01:10:54.377409 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.377528 kubelet[2591]: W0325 01:10:54.377419 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.377528 kubelet[2591]: E0325 01:10:54.377459 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.377800 kubelet[2591]: E0325 01:10:54.377717 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.377800 kubelet[2591]: W0325 01:10:54.377730 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.377800 kubelet[2591]: E0325 01:10:54.377739 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.390551 kubelet[2591]: E0325 01:10:54.390510 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.390551 kubelet[2591]: W0325 01:10:54.390531 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.390645 kubelet[2591]: E0325 01:10:54.390567 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.391536 kubelet[2591]: E0325 01:10:54.391518 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.391536 kubelet[2591]: W0325 01:10:54.391531 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.391592 kubelet[2591]: E0325 01:10:54.391543 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.454297 kubelet[2591]: E0325 01:10:54.454046 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.454297 kubelet[2591]: W0325 01:10:54.454066 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.454297 kubelet[2591]: E0325 01:10:54.454083 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.455467 kubelet[2591]: E0325 01:10:54.455449 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.455546 kubelet[2591]: W0325 01:10:54.455532 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.455631 kubelet[2591]: E0325 01:10:54.455618 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.456424 kubelet[2591]: E0325 01:10:54.456404 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.456479 kubelet[2591]: W0325 01:10:54.456424 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.456479 kubelet[2591]: E0325 01:10:54.456460 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.457468 kubelet[2591]: E0325 01:10:54.457423 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.457468 kubelet[2591]: W0325 01:10:54.457449 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.457468 kubelet[2591]: E0325 01:10:54.457468 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.458518 kubelet[2591]: E0325 01:10:54.458494 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.458518 kubelet[2591]: W0325 01:10:54.458509 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.458587 kubelet[2591]: E0325 01:10:54.458574 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.458788 kubelet[2591]: E0325 01:10:54.458760 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.458788 kubelet[2591]: W0325 01:10:54.458775 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.458863 kubelet[2591]: E0325 01:10:54.458841 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.459194 kubelet[2591]: E0325 01:10:54.459173 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.459194 kubelet[2591]: W0325 01:10:54.459188 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.459300 kubelet[2591]: E0325 01:10:54.459272 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.459415 kubelet[2591]: E0325 01:10:54.459402 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.459415 kubelet[2591]: W0325 01:10:54.459414 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.459533 kubelet[2591]: E0325 01:10:54.459449 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.459627 kubelet[2591]: E0325 01:10:54.459613 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.459627 kubelet[2591]: W0325 01:10:54.459624 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.459697 kubelet[2591]: E0325 01:10:54.459645 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.459845 kubelet[2591]: E0325 01:10:54.459834 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.459845 kubelet[2591]: W0325 01:10:54.459845 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.459899 kubelet[2591]: E0325 01:10:54.459859 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.460055 kubelet[2591]: E0325 01:10:54.460040 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.460055 kubelet[2591]: W0325 01:10:54.460048 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.460055 kubelet[2591]: E0325 01:10:54.460087 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.460196 kubelet[2591]: E0325 01:10:54.460171 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.460196 kubelet[2591]: W0325 01:10:54.460178 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.460284 kubelet[2591]: E0325 01:10:54.460256 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.460326 kubelet[2591]: E0325 01:10:54.460318 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.460326 kubelet[2591]: W0325 01:10:54.460325 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.460423 kubelet[2591]: E0325 01:10:54.460340 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.460509 kubelet[2591]: E0325 01:10:54.460498 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.460509 kubelet[2591]: W0325 01:10:54.460508 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.460558 kubelet[2591]: E0325 01:10:54.460522 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.460704 kubelet[2591]: E0325 01:10:54.460694 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.460704 kubelet[2591]: W0325 01:10:54.460704 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.460779 kubelet[2591]: E0325 01:10:54.460716 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.460874 kubelet[2591]: E0325 01:10:54.460864 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.460874 kubelet[2591]: W0325 01:10:54.460874 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.460943 kubelet[2591]: E0325 01:10:54.460887 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.461020 kubelet[2591]: E0325 01:10:54.461010 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.461020 kubelet[2591]: W0325 01:10:54.461020 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.461088 kubelet[2591]: E0325 01:10:54.461032 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.461157 kubelet[2591]: E0325 01:10:54.461148 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.461157 kubelet[2591]: W0325 01:10:54.461157 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.461313 kubelet[2591]: E0325 01:10:54.461203 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.461313 kubelet[2591]: E0325 01:10:54.461265 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.461313 kubelet[2591]: W0325 01:10:54.461272 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.461313 kubelet[2591]: E0325 01:10:54.461300 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.461495 kubelet[2591]: E0325 01:10:54.461423 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.461495 kubelet[2591]: W0325 01:10:54.461437 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.461495 kubelet[2591]: E0325 01:10:54.461458 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.461661 kubelet[2591]: E0325 01:10:54.461578 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.461661 kubelet[2591]: W0325 01:10:54.461587 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.461661 kubelet[2591]: E0325 01:10:54.461598 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.461757 kubelet[2591]: E0325 01:10:54.461742 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.461757 kubelet[2591]: W0325 01:10:54.461752 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.461816 kubelet[2591]: E0325 01:10:54.461767 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.462261 kubelet[2591]: E0325 01:10:54.462145 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.462261 kubelet[2591]: W0325 01:10:54.462161 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.462261 kubelet[2591]: E0325 01:10:54.462182 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.462609 kubelet[2591]: E0325 01:10:54.462596 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.462776 kubelet[2591]: W0325 01:10:54.462700 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.462776 kubelet[2591]: E0325 01:10:54.462730 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.463097 kubelet[2591]: E0325 01:10:54.463066 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.463208 kubelet[2591]: W0325 01:10:54.463079 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.463208 kubelet[2591]: E0325 01:10:54.463158 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 25 01:10:54.471968 kubelet[2591]: E0325 01:10:54.471944 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:10:54.472117 kubelet[2591]: W0325 01:10:54.472063 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:10:54.472117 kubelet[2591]: E0325 01:10:54.472087 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:10:54.481566 containerd[1474]: time="2025-03-25T01:10:54.481528407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-fd6b95b6d-22v7x,Uid:f5097cb3-966b-4517-958c-51ba571954fe,Namespace:calico-system,Attempt:0,}"
Mar 25 01:10:54.512486 containerd[1474]: time="2025-03-25T01:10:54.512418132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gnbvr,Uid:97d5d189-06a9-453c-900b-ee2963c54aec,Namespace:calico-system,Attempt:0,}"
Mar 25 01:10:54.515847 containerd[1474]: time="2025-03-25T01:10:54.515810973Z" level=info msg="connecting to shim 254501fb941265985dff10084acba0b54ff87c59a66e59570a8934d8e4bc2e4d" address="unix:///run/containerd/s/b04eba15796ad6628eddb0c61a2cbe2571a81438c1bb53d2650cf1bccbe41745" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:10:54.539978 containerd[1474]: time="2025-03-25T01:10:54.539926457Z" level=info msg="connecting to shim 94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552" address="unix:///run/containerd/s/8cde3f311ab2c2eb29743bbe78365616eeab0ac96a3c3842adce2c666e0f8fce" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:10:54.546507 kubelet[2591]: E0325 01:10:54.546463 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:10:54.546507 kubelet[2591]: W0325 01:10:54.546500 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:10:54.546641 kubelet[2591]: E0325 01:10:54.546529 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:10:54.546769 kubelet[2591]: E0325 01:10:54.546665 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:10:54.550007 kubelet[2591]: W0325 01:10:54.546674 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:10:54.550007 kubelet[2591]: E0325 01:10:54.550007 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:10:54.550018 systemd[1]: Started cri-containerd-254501fb941265985dff10084acba0b54ff87c59a66e59570a8934d8e4bc2e4d.scope - libcontainer container 254501fb941265985dff10084acba0b54ff87c59a66e59570a8934d8e4bc2e4d.
Mar 25 01:10:54.550964 kubelet[2591]: E0325 01:10:54.550895 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.550964 kubelet[2591]: W0325 01:10:54.550911 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.550964 kubelet[2591]: E0325 01:10:54.550924 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.551218 kubelet[2591]: E0325 01:10:54.551195 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.551218 kubelet[2591]: W0325 01:10:54.551215 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.551299 kubelet[2591]: E0325 01:10:54.551226 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.551753 kubelet[2591]: E0325 01:10:54.551734 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.551753 kubelet[2591]: W0325 01:10:54.551750 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.551855 kubelet[2591]: E0325 01:10:54.551762 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.552391 kubelet[2591]: E0325 01:10:54.552074 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.552391 kubelet[2591]: W0325 01:10:54.552087 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.552391 kubelet[2591]: E0325 01:10:54.552109 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.552840 kubelet[2591]: E0325 01:10:54.552364 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.552840 kubelet[2591]: W0325 01:10:54.552456 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.552840 kubelet[2591]: E0325 01:10:54.552468 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.552840 kubelet[2591]: E0325 01:10:54.552803 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.552840 kubelet[2591]: W0325 01:10:54.552812 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.552840 kubelet[2591]: E0325 01:10:54.552823 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.553374 kubelet[2591]: E0325 01:10:54.553300 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.553374 kubelet[2591]: W0325 01:10:54.553315 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.553374 kubelet[2591]: E0325 01:10:54.553326 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.554266 kubelet[2591]: E0325 01:10:54.554231 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.554355 kubelet[2591]: W0325 01:10:54.554247 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.554355 kubelet[2591]: E0325 01:10:54.554296 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.554589 kubelet[2591]: E0325 01:10:54.554531 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.554589 kubelet[2591]: W0325 01:10:54.554546 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.554589 kubelet[2591]: E0325 01:10:54.554556 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.555481 kubelet[2591]: E0325 01:10:54.555223 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.555481 kubelet[2591]: W0325 01:10:54.555346 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.555481 kubelet[2591]: E0325 01:10:54.555367 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:54.555717 kubelet[2591]: E0325 01:10:54.555694 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.555717 kubelet[2591]: W0325 01:10:54.555709 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.555717 kubelet[2591]: E0325 01:10:54.555722 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.557368 kubelet[2591]: E0325 01:10:54.557347 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.557368 kubelet[2591]: W0325 01:10:54.557361 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.557606 kubelet[2591]: E0325 01:10:54.557376 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.557606 kubelet[2591]: E0325 01:10:54.557551 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:54.557606 kubelet[2591]: W0325 01:10:54.557560 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:54.557606 kubelet[2591]: E0325 01:10:54.557569 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:54.575615 systemd[1]: Started cri-containerd-94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552.scope - libcontainer container 94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552. 
Mar 25 01:10:54.616476 containerd[1474]: time="2025-03-25T01:10:54.616416909Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-fd6b95b6d-22v7x,Uid:f5097cb3-966b-4517-958c-51ba571954fe,Namespace:calico-system,Attempt:0,} returns sandbox id \"254501fb941265985dff10084acba0b54ff87c59a66e59570a8934d8e4bc2e4d\""
Mar 25 01:10:54.635834 containerd[1474]: time="2025-03-25T01:10:54.635596872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\""
Mar 25 01:10:54.635834 containerd[1474]: time="2025-03-25T01:10:54.635620432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gnbvr,Uid:97d5d189-06a9-453c-900b-ee2963c54aec,Namespace:calico-system,Attempt:0,} returns sandbox id \"94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552\""
Mar 25 01:10:55.982500 kubelet[2591]: E0325 01:10:55.982133 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rm4dd" podUID="134e9f30-05b3-4c2c-8ea5-b587606a6022"
Mar 25 01:10:57.972510 kubelet[2591]: E0325 01:10:57.972422 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rm4dd" podUID="134e9f30-05b3-4c2c-8ea5-b587606a6022"
Mar 25 01:10:57.991499 containerd[1474]: time="2025-03-25T01:10:57.991456395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:10:57.992123 containerd[1474]: time="2025-03-25T01:10:57.992077075Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957"
Mar 25 01:10:57.992722 containerd[1474]: time="2025-03-25T01:10:57.992693235Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:10:57.994592 containerd[1474]: time="2025-03-25T01:10:57.994558115Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:10:57.995198 containerd[1474]: time="2025-03-25T01:10:57.995168035Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 3.359516523s"
Mar 25 01:10:57.995198 containerd[1474]: time="2025-03-25T01:10:57.995198155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\""
Mar 25 01:10:58.000701 containerd[1474]: time="2025-03-25T01:10:58.000672516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\""
Mar 25 01:10:58.010894 containerd[1474]: time="2025-03-25T01:10:58.010864677Z" level=info msg="CreateContainer within sandbox \"254501fb941265985dff10084acba0b54ff87c59a66e59570a8934d8e4bc2e4d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 25 01:10:58.017646 containerd[1474]: time="2025-03-25T01:10:58.017603598Z" level=info msg="Container 7565f3db032019ef980ce28cdcb1584b071d65285e7c9e392922c402d051147e: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:10:58.021207 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3126818000.mount: Deactivated successfully.
Mar 25 01:10:58.025218 containerd[1474]: time="2025-03-25T01:10:58.025175199Z" level=info msg="CreateContainer within sandbox \"254501fb941265985dff10084acba0b54ff87c59a66e59570a8934d8e4bc2e4d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7565f3db032019ef980ce28cdcb1584b071d65285e7c9e392922c402d051147e\""
Mar 25 01:10:58.028373 containerd[1474]: time="2025-03-25T01:10:58.026884559Z" level=info msg="StartContainer for \"7565f3db032019ef980ce28cdcb1584b071d65285e7c9e392922c402d051147e\""
Mar 25 01:10:58.028373 containerd[1474]: time="2025-03-25T01:10:58.027906319Z" level=info msg="connecting to shim 7565f3db032019ef980ce28cdcb1584b071d65285e7c9e392922c402d051147e" address="unix:///run/containerd/s/b04eba15796ad6628eddb0c61a2cbe2571a81438c1bb53d2650cf1bccbe41745" protocol=ttrpc version=3
Mar 25 01:10:58.048572 systemd[1]: Started cri-containerd-7565f3db032019ef980ce28cdcb1584b071d65285e7c9e392922c402d051147e.scope - libcontainer container 7565f3db032019ef980ce28cdcb1584b071d65285e7c9e392922c402d051147e.
Mar 25 01:10:58.092512 containerd[1474]: time="2025-03-25T01:10:58.092397167Z" level=info msg="StartContainer for \"7565f3db032019ef980ce28cdcb1584b071d65285e7c9e392922c402d051147e\" returns successfully"
Mar 25 01:10:59.074072 kubelet[2591]: I0325 01:10:59.073990 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-fd6b95b6d-22v7x" podStartSLOduration=1.706735604 podStartE2EDuration="5.073970849s" podCreationTimestamp="2025-03-25 01:10:54 +0000 UTC" firstStartedPulling="2025-03-25 01:10:54.629135471 +0000 UTC m=+13.734112806" lastFinishedPulling="2025-03-25 01:10:57.996370756 +0000 UTC m=+17.101348051" observedRunningTime="2025-03-25 01:10:59.071808089 +0000 UTC m=+18.176785424" watchObservedRunningTime="2025-03-25 01:10:59.073970849 +0000 UTC m=+18.178948184"
Mar 25 01:10:59.091514 kubelet[2591]: E0325 01:10:59.091477 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:10:59.091514 kubelet[2591]: W0325 01:10:59.091503 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:10:59.091514 kubelet[2591]: E0325 01:10:59.091525 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 25 01:10:59.091730 kubelet[2591]: E0325 01:10:59.091711 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.091765 kubelet[2591]: W0325 01:10:59.091723 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.091765 kubelet[2591]: E0325 01:10:59.091763 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.091915 kubelet[2591]: E0325 01:10:59.091899 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.091915 kubelet[2591]: W0325 01:10:59.091909 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.091973 kubelet[2591]: E0325 01:10:59.091917 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.092060 kubelet[2591]: E0325 01:10:59.092044 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.092060 kubelet[2591]: W0325 01:10:59.092053 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.092109 kubelet[2591]: E0325 01:10:59.092061 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.092552 kubelet[2591]: E0325 01:10:59.092529 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.092552 kubelet[2591]: W0325 01:10:59.092544 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.092552 kubelet[2591]: E0325 01:10:59.092556 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.092742 kubelet[2591]: E0325 01:10:59.092725 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.092742 kubelet[2591]: W0325 01:10:59.092736 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.092799 kubelet[2591]: E0325 01:10:59.092745 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:59.092911 kubelet[2591]: E0325 01:10:59.092894 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.092911 kubelet[2591]: W0325 01:10:59.092904 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.093132 kubelet[2591]: E0325 01:10:59.093106 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.093316 kubelet[2591]: E0325 01:10:59.093297 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.093316 kubelet[2591]: W0325 01:10:59.093310 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.093380 kubelet[2591]: E0325 01:10:59.093320 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.093510 kubelet[2591]: E0325 01:10:59.093494 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.093510 kubelet[2591]: W0325 01:10:59.093504 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.093569 kubelet[2591]: E0325 01:10:59.093513 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.093949 kubelet[2591]: E0325 01:10:59.093924 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.093949 kubelet[2591]: W0325 01:10:59.093938 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.093949 kubelet[2591]: E0325 01:10:59.093949 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.094154 kubelet[2591]: E0325 01:10:59.094133 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.094154 kubelet[2591]: W0325 01:10:59.094145 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.094154 kubelet[2591]: E0325 01:10:59.094153 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:59.094317 kubelet[2591]: E0325 01:10:59.094296 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.094578 kubelet[2591]: W0325 01:10:59.094468 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.094686 kubelet[2591]: E0325 01:10:59.094579 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.095455 kubelet[2591]: E0325 01:10:59.094804 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.095455 kubelet[2591]: W0325 01:10:59.094817 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.095455 kubelet[2591]: E0325 01:10:59.094827 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.095455 kubelet[2591]: E0325 01:10:59.094996 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.095455 kubelet[2591]: W0325 01:10:59.095004 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.095455 kubelet[2591]: E0325 01:10:59.095012 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.095455 kubelet[2591]: E0325 01:10:59.095352 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.095455 kubelet[2591]: W0325 01:10:59.095363 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.095455 kubelet[2591]: E0325 01:10:59.095387 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.191945 kubelet[2591]: E0325 01:10:59.191918 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.191945 kubelet[2591]: W0325 01:10:59.191938 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.192102 kubelet[2591]: E0325 01:10:59.191957 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:59.192150 kubelet[2591]: E0325 01:10:59.192135 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.192150 kubelet[2591]: W0325 01:10:59.192147 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.192198 kubelet[2591]: E0325 01:10:59.192161 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.192351 kubelet[2591]: E0325 01:10:59.192339 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.192351 kubelet[2591]: W0325 01:10:59.192350 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.192416 kubelet[2591]: E0325 01:10:59.192363 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.192660 kubelet[2591]: E0325 01:10:59.192634 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.192660 kubelet[2591]: W0325 01:10:59.192652 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.192723 kubelet[2591]: E0325 01:10:59.192667 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.192832 kubelet[2591]: E0325 01:10:59.192820 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.192832 kubelet[2591]: W0325 01:10:59.192830 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.192882 kubelet[2591]: E0325 01:10:59.192842 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.192976 kubelet[2591]: E0325 01:10:59.192966 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.192976 kubelet[2591]: W0325 01:10:59.192975 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.193023 kubelet[2591]: E0325 01:10:59.192986 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:59.196713 kubelet[2591]: E0325 01:10:59.196695 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.196713 kubelet[2591]: W0325 01:10:59.196711 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.196784 kubelet[2591]: E0325 01:10:59.196748 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.196874 kubelet[2591]: E0325 01:10:59.196864 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.196874 kubelet[2591]: W0325 01:10:59.196873 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.196989 kubelet[2591]: E0325 01:10:59.196947 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.197024 kubelet[2591]: E0325 01:10:59.197010 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.197024 kubelet[2591]: W0325 01:10:59.197017 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.197076 kubelet[2591]: E0325 01:10:59.197030 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.197471 kubelet[2591]: E0325 01:10:59.197457 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.197471 kubelet[2591]: W0325 01:10:59.197469 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.197543 kubelet[2591]: E0325 01:10:59.197482 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.197694 kubelet[2591]: E0325 01:10:59.197676 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.197727 kubelet[2591]: W0325 01:10:59.197695 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.197727 kubelet[2591]: E0325 01:10:59.197713 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:59.197919 kubelet[2591]: E0325 01:10:59.197908 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.197919 kubelet[2591]: W0325 01:10:59.197918 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.198037 kubelet[2591]: E0325 01:10:59.197930 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.198362 kubelet[2591]: E0325 01:10:59.198339 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.198402 kubelet[2591]: W0325 01:10:59.198362 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.198402 kubelet[2591]: E0325 01:10:59.198393 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.198701 kubelet[2591]: E0325 01:10:59.198689 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.198701 kubelet[2591]: W0325 01:10:59.198700 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.198756 kubelet[2591]: E0325 01:10:59.198715 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.198968 kubelet[2591]: E0325 01:10:59.198955 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.198968 kubelet[2591]: W0325 01:10:59.198967 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.199024 kubelet[2591]: E0325 01:10:59.198977 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.199174 kubelet[2591]: E0325 01:10:59.199163 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.199201 kubelet[2591]: W0325 01:10:59.199175 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.199201 kubelet[2591]: E0325 01:10:59.199189 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:10:59.199583 kubelet[2591]: E0325 01:10:59.199466 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.199630 kubelet[2591]: W0325 01:10:59.199587 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.199630 kubelet[2591]: E0325 01:10:59.199611 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.199849 kubelet[2591]: E0325 01:10:59.199834 2591 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:10:59.199849 kubelet[2591]: W0325 01:10:59.199847 2591 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:10:59.200000 kubelet[2591]: E0325 01:10:59.199858 2591 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:10:59.395013 containerd[1474]: time="2025-03-25T01:10:59.394969286Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:59.396203 containerd[1474]: time="2025-03-25T01:10:59.396013007Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 25 01:10:59.405118 containerd[1474]: time="2025-03-25T01:10:59.404611728Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:59.407174 containerd[1474]: time="2025-03-25T01:10:59.407055248Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:10:59.407984 containerd[1474]: time="2025-03-25T01:10:59.407857608Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.407146732s" Mar 25 01:10:59.407984 containerd[1474]: time="2025-03-25T01:10:59.407893728Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 25 01:10:59.410727 containerd[1474]: time="2025-03-25T01:10:59.410695248Z" level=info msg="CreateContainer within sandbox \"94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:10:59.429846 containerd[1474]: time="2025-03-25T01:10:59.429797090Z" level=info msg="Container 01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e: 
CDI devices from CRI Config.CDIDevices: []" Mar 25 01:10:59.438635 containerd[1474]: time="2025-03-25T01:10:59.438584971Z" level=info msg="CreateContainer within sandbox \"94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e\"" Mar 25 01:10:59.439313 containerd[1474]: time="2025-03-25T01:10:59.439279892Z" level=info msg="StartContainer for \"01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e\"" Mar 25 01:10:59.440761 containerd[1474]: time="2025-03-25T01:10:59.440683332Z" level=info msg="connecting to shim 01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e" address="unix:///run/containerd/s/8cde3f311ab2c2eb29743bbe78365616eeab0ac96a3c3842adce2c666e0f8fce" protocol=ttrpc version=3 Mar 25 01:10:59.458586 systemd[1]: Started cri-containerd-01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e.scope - libcontainer container 01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e. Mar 25 01:10:59.527653 containerd[1474]: time="2025-03-25T01:10:59.527599462Z" level=info msg="StartContainer for \"01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e\" returns successfully" Mar 25 01:10:59.545857 systemd[1]: cri-containerd-01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e.scope: Deactivated successfully. Mar 25 01:10:59.565263 containerd[1474]: time="2025-03-25T01:10:59.565223906Z" level=info msg="TaskExit event in podsandbox handler container_id:\"01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e\" id:\"01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e\" pid:3268 exited_at:{seconds:1742865059 nanos:553915625}" Mar 25 01:10:59.568851 containerd[1474]: time="2025-03-25T01:10:59.568791667Z" level=info msg="received exit event container_id:\"01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e\" id:\"01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e\" pid:3268 exited_at:{seconds:1742865059 nanos:553915625}" Mar 25 01:10:59.607087 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-01b291cc7f8c7f56003165cd7501343d297fdfb1c0396402cd802a7d75334c1e-rootfs.mount: Deactivated successfully. 
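
The TaskExit event above reports the flexvol-driver container's exit time as a raw Unix timestamp (exited_at:{seconds:1742865059 nanos:553915625}) rather than the RFC 3339 strings used in the other containerd messages. A small sketch converting it back shows it agrees with the surrounding journal timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // exited_at from the TaskExit event for the flexvol-driver container.
        exitedAt := time.Unix(1742865059, 553915625).UTC()
        fmt.Println(exitedAt.Format(time.RFC3339Nano))
        // Prints 2025-03-25T01:10:59.553915625Z, matching the 01:10:59 journal lines.
    }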
Mar 25 01:10:59.972585 kubelet[2591]: E0325 01:10:59.972537 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rm4dd" podUID="134e9f30-05b3-4c2c-8ea5-b587606a6022" Mar 25 01:11:00.059068 kubelet[2591]: I0325 01:11:00.059027 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:00.060268 containerd[1474]: time="2025-03-25T01:11:00.060032883Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:11:01.973104 kubelet[2591]: E0325 01:11:01.973059 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-rm4dd" podUID="134e9f30-05b3-4c2c-8ea5-b587606a6022" Mar 25 01:11:02.867622 containerd[1474]: time="2025-03-25T01:11:02.867579932Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:02.868123 containerd[1474]: time="2025-03-25T01:11:02.868071812Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 25 01:11:02.868910 containerd[1474]: time="2025-03-25T01:11:02.868878412Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:02.870693 containerd[1474]: time="2025-03-25T01:11:02.870665852Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:02.871400 containerd[1474]: time="2025-03-25T01:11:02.871371532Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 2.811302249s" Mar 25 01:11:02.871444 containerd[1474]: time="2025-03-25T01:11:02.871403172Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 25 01:11:02.873983 containerd[1474]: time="2025-03-25T01:11:02.873952333Z" level=info msg="CreateContainer within sandbox \"94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:11:02.881875 containerd[1474]: time="2025-03-25T01:11:02.881830973Z" level=info msg="Container 7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:02.888761 containerd[1474]: time="2025-03-25T01:11:02.888726214Z" level=info msg="CreateContainer within sandbox \"94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c\"" Mar 25 01:11:02.890472 containerd[1474]: 
time="2025-03-25T01:11:02.890400574Z" level=info msg="StartContainer for \"7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c\"" Mar 25 01:11:02.892029 containerd[1474]: time="2025-03-25T01:11:02.892003014Z" level=info msg="connecting to shim 7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c" address="unix:///run/containerd/s/8cde3f311ab2c2eb29743bbe78365616eeab0ac96a3c3842adce2c666e0f8fce" protocol=ttrpc version=3 Mar 25 01:11:02.913618 systemd[1]: Started cri-containerd-7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c.scope - libcontainer container 7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c. Mar 25 01:11:02.986455 containerd[1474]: time="2025-03-25T01:11:02.984704063Z" level=info msg="StartContainer for \"7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c\" returns successfully" Mar 25 01:11:03.469484 systemd[1]: cri-containerd-7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c.scope: Deactivated successfully. Mar 25 01:11:03.469824 systemd[1]: cri-containerd-7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c.scope: Consumed 420ms CPU time, 160.3M memory peak, 4K read from disk, 150.3M written to disk. Mar 25 01:11:03.480364 containerd[1474]: time="2025-03-25T01:11:03.480306788Z" level=info msg="received exit event container_id:\"7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c\" id:\"7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c\" pid:3328 exited_at:{seconds:1742865063 nanos:480067868}" Mar 25 01:11:03.480755 containerd[1474]: time="2025-03-25T01:11:03.480604548Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c\" id:\"7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c\" pid:3328 exited_at:{seconds:1742865063 nanos:480067868}" Mar 25 01:11:03.494753 kubelet[2591]: I0325 01:11:03.494726 2591 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Mar 25 01:11:03.503665 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7443f2621f6859c7f649f88bd66a38d1d415103c0afef9d84ccdcb72b507ca0c-rootfs.mount: Deactivated successfully. Mar 25 01:11:03.552348 systemd[1]: Created slice kubepods-burstable-pod2ede0a9d_37ce_445e_ae94_9562960117af.slice - libcontainer container kubepods-burstable-pod2ede0a9d_37ce_445e_ae94_9562960117af.slice. Mar 25 01:11:03.578990 systemd[1]: Created slice kubepods-burstable-pod65f62a1e_a36a_47c3_8c4b_1dbed6b93395.slice - libcontainer container kubepods-burstable-pod65f62a1e_a36a_47c3_8c4b_1dbed6b93395.slice. Mar 25 01:11:03.583683 systemd[1]: Created slice kubepods-besteffort-podc3cbab45_79c8_427a_87cc_52cefa199441.slice - libcontainer container kubepods-besteffort-podc3cbab45_79c8_427a_87cc_52cefa199441.slice. Mar 25 01:11:03.589416 systemd[1]: Created slice kubepods-besteffort-pod3a461ca2_f983_4fd2_a905_67ee0b4f5a3b.slice - libcontainer container kubepods-besteffort-pod3a461ca2_f983_4fd2_a905_67ee0b4f5a3b.slice. Mar 25 01:11:03.594867 systemd[1]: Created slice kubepods-besteffort-podd3ecef61_e163_47cc_b4ee_25518b9134d1.slice - libcontainer container kubepods-besteffort-podd3ecef61_e163_47cc_b4ee_25518b9134d1.slice. 
Mar 25 01:11:03.620796 kubelet[2591]: I0325 01:11:03.620747 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr2xx\" (UniqueName: \"kubernetes.io/projected/2ede0a9d-37ce-445e-ae94-9562960117af-kube-api-access-tr2xx\") pod \"coredns-668d6bf9bc-hg98s\" (UID: \"2ede0a9d-37ce-445e-ae94-9562960117af\") " pod="kube-system/coredns-668d6bf9bc-hg98s" Mar 25 01:11:03.620796 kubelet[2591]: I0325 01:11:03.620793 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ede0a9d-37ce-445e-ae94-9562960117af-config-volume\") pod \"coredns-668d6bf9bc-hg98s\" (UID: \"2ede0a9d-37ce-445e-ae94-9562960117af\") " pod="kube-system/coredns-668d6bf9bc-hg98s" Mar 25 01:11:03.721760 kubelet[2591]: I0325 01:11:03.721140 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3ecef61-e163-47cc-b4ee-25518b9134d1-tigera-ca-bundle\") pod \"calico-kube-controllers-78fbfcd757-szsdv\" (UID: \"d3ecef61-e163-47cc-b4ee-25518b9134d1\") " pod="calico-system/calico-kube-controllers-78fbfcd757-szsdv" Mar 25 01:11:03.721760 kubelet[2591]: I0325 01:11:03.721176 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvsjm\" (UniqueName: \"kubernetes.io/projected/d3ecef61-e163-47cc-b4ee-25518b9134d1-kube-api-access-fvsjm\") pod \"calico-kube-controllers-78fbfcd757-szsdv\" (UID: \"d3ecef61-e163-47cc-b4ee-25518b9134d1\") " pod="calico-system/calico-kube-controllers-78fbfcd757-szsdv" Mar 25 01:11:03.721760 kubelet[2591]: I0325 01:11:03.721195 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3a461ca2-f983-4fd2-a905-67ee0b4f5a3b-calico-apiserver-certs\") pod \"calico-apiserver-5b89686bfb-8kksr\" (UID: \"3a461ca2-f983-4fd2-a905-67ee0b4f5a3b\") " pod="calico-apiserver/calico-apiserver-5b89686bfb-8kksr" Mar 25 01:11:03.721760 kubelet[2591]: I0325 01:11:03.721252 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt9hg\" (UniqueName: \"kubernetes.io/projected/3a461ca2-f983-4fd2-a905-67ee0b4f5a3b-kube-api-access-rt9hg\") pod \"calico-apiserver-5b89686bfb-8kksr\" (UID: \"3a461ca2-f983-4fd2-a905-67ee0b4f5a3b\") " pod="calico-apiserver/calico-apiserver-5b89686bfb-8kksr" Mar 25 01:11:03.721760 kubelet[2591]: I0325 01:11:03.721304 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c3cbab45-79c8-427a-87cc-52cefa199441-calico-apiserver-certs\") pod \"calico-apiserver-5b89686bfb-smsq9\" (UID: \"c3cbab45-79c8-427a-87cc-52cefa199441\") " pod="calico-apiserver/calico-apiserver-5b89686bfb-smsq9" Mar 25 01:11:03.721938 kubelet[2591]: I0325 01:11:03.721332 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6npz\" (UniqueName: \"kubernetes.io/projected/65f62a1e-a36a-47c3-8c4b-1dbed6b93395-kube-api-access-h6npz\") pod \"coredns-668d6bf9bc-g7msf\" (UID: \"65f62a1e-a36a-47c3-8c4b-1dbed6b93395\") " pod="kube-system/coredns-668d6bf9bc-g7msf" Mar 25 01:11:03.721938 kubelet[2591]: I0325 01:11:03.721364 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/65f62a1e-a36a-47c3-8c4b-1dbed6b93395-config-volume\") pod \"coredns-668d6bf9bc-g7msf\" (UID: \"65f62a1e-a36a-47c3-8c4b-1dbed6b93395\") " pod="kube-system/coredns-668d6bf9bc-g7msf" Mar 25 01:11:03.721938 kubelet[2591]: I0325 01:11:03.721405 2591 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79scr\" (UniqueName: \"kubernetes.io/projected/c3cbab45-79c8-427a-87cc-52cefa199441-kube-api-access-79scr\") pod \"calico-apiserver-5b89686bfb-smsq9\" (UID: \"c3cbab45-79c8-427a-87cc-52cefa199441\") " pod="calico-apiserver/calico-apiserver-5b89686bfb-smsq9" Mar 25 01:11:03.871726 containerd[1474]: time="2025-03-25T01:11:03.871677663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hg98s,Uid:2ede0a9d-37ce-445e-ae94-9562960117af,Namespace:kube-system,Attempt:0,}" Mar 25 01:11:03.884236 containerd[1474]: time="2025-03-25T01:11:03.884014584Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g7msf,Uid:65f62a1e-a36a-47c3-8c4b-1dbed6b93395,Namespace:kube-system,Attempt:0,}" Mar 25 01:11:03.886835 containerd[1474]: time="2025-03-25T01:11:03.886773385Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-smsq9,Uid:c3cbab45-79c8-427a-87cc-52cefa199441,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:11:03.894206 containerd[1474]: time="2025-03-25T01:11:03.894176465Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-8kksr,Uid:3a461ca2-f983-4fd2-a905-67ee0b4f5a3b,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:11:03.911487 containerd[1474]: time="2025-03-25T01:11:03.911452627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78fbfcd757-szsdv,Uid:d3ecef61-e163-47cc-b4ee-25518b9134d1,Namespace:calico-system,Attempt:0,}" Mar 25 01:11:04.006809 systemd[1]: Created slice kubepods-besteffort-pod134e9f30_05b3_4c2c_8ea5_b587606a6022.slice - libcontainer container kubepods-besteffort-pod134e9f30_05b3_4c2c_8ea5_b587606a6022.slice. 
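
Each VerifyControllerAttachedVolume entry above identifies its volume by a UniqueName of the form <plugin>/<pod UID>-<volume name>, as in the coredns config-volume and projected kube-api-access entries. A sketch of that pattern (the helper is illustrative, not the kubelet's own function):

    package main

    import "fmt"

    // uniqueVolumeName reproduces the UniqueName pattern in the reconciler entries.
    func uniqueVolumeName(plugin, podUID, volume string) string {
        return fmt.Sprintf("%s/%s-%s", plugin, podUID, volume)
    }

    func main() {
        fmt.Println(uniqueVolumeName("kubernetes.io/configmap",
            "2ede0a9d-37ce-445e-ae94-9562960117af", "config-volume"))
        // kubernetes.io/configmap/2ede0a9d-37ce-445e-ae94-9562960117af-config-volume
    }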
Mar 25 01:11:04.013446 containerd[1474]: time="2025-03-25T01:11:04.013391556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rm4dd,Uid:134e9f30-05b3-4c2c-8ea5-b587606a6022,Namespace:calico-system,Attempt:0,}" Mar 25 01:11:04.106402 containerd[1474]: time="2025-03-25T01:11:04.106361084Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:11:04.170006 containerd[1474]: time="2025-03-25T01:11:04.169948529Z" level=error msg="Failed to destroy network for sandbox \"8e8487c1bedbf47ef36a9d10ea745325a35dac548c14f2cdc6b5606b37077957\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.170138 containerd[1474]: time="2025-03-25T01:11:04.170011929Z" level=error msg="Failed to destroy network for sandbox \"14244060d44497108c1b6d8eabf008132509e9b73bd98fc49932180dcebb6213\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.172630 containerd[1474]: time="2025-03-25T01:11:04.172587729Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-8kksr,Uid:3a461ca2-f983-4fd2-a905-67ee0b4f5a3b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"14244060d44497108c1b6d8eabf008132509e9b73bd98fc49932180dcebb6213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.173969 containerd[1474]: time="2025-03-25T01:11:04.173783809Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-smsq9,Uid:c3cbab45-79c8-427a-87cc-52cefa199441,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e8487c1bedbf47ef36a9d10ea745325a35dac548c14f2cdc6b5606b37077957\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.179080 kubelet[2591]: E0325 01:11:04.179024 2591 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e8487c1bedbf47ef36a9d10ea745325a35dac548c14f2cdc6b5606b37077957\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.179341 kubelet[2591]: E0325 01:11:04.179021 2591 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14244060d44497108c1b6d8eabf008132509e9b73bd98fc49932180dcebb6213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.181367 kubelet[2591]: E0325 01:11:04.181332 2591 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e8487c1bedbf47ef36a9d10ea745325a35dac548c14f2cdc6b5606b37077957\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5b89686bfb-smsq9" Mar 25 01:11:04.181466 kubelet[2591]: E0325 01:11:04.181376 2591 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e8487c1bedbf47ef36a9d10ea745325a35dac548c14f2cdc6b5606b37077957\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5b89686bfb-smsq9" Mar 25 01:11:04.181466 kubelet[2591]: E0325 01:11:04.181439 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5b89686bfb-smsq9_calico-apiserver(c3cbab45-79c8-427a-87cc-52cefa199441)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5b89686bfb-smsq9_calico-apiserver(c3cbab45-79c8-427a-87cc-52cefa199441)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e8487c1bedbf47ef36a9d10ea745325a35dac548c14f2cdc6b5606b37077957\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5b89686bfb-smsq9" podUID="c3cbab45-79c8-427a-87cc-52cefa199441" Mar 25 01:11:04.181845 kubelet[2591]: E0325 01:11:04.181818 2591 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14244060d44497108c1b6d8eabf008132509e9b73bd98fc49932180dcebb6213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5b89686bfb-8kksr" Mar 25 01:11:04.181924 kubelet[2591]: E0325 01:11:04.181909 2591 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"14244060d44497108c1b6d8eabf008132509e9b73bd98fc49932180dcebb6213\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5b89686bfb-8kksr" Mar 25 01:11:04.184243 kubelet[2591]: E0325 01:11:04.181989 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5b89686bfb-8kksr_calico-apiserver(3a461ca2-f983-4fd2-a905-67ee0b4f5a3b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5b89686bfb-8kksr_calico-apiserver(3a461ca2-f983-4fd2-a905-67ee0b4f5a3b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"14244060d44497108c1b6d8eabf008132509e9b73bd98fc49932180dcebb6213\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5b89686bfb-8kksr" podUID="3a461ca2-f983-4fd2-a905-67ee0b4f5a3b" Mar 25 01:11:04.188273 containerd[1474]: time="2025-03-25T01:11:04.188234251Z" level=error msg="Failed to destroy network for sandbox 
\"88d557e884aa4bfe4999d4ee091a87a5bc1cc01a927c04f79fdbe81936cbb08b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.189276 containerd[1474]: time="2025-03-25T01:11:04.189185731Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rm4dd,Uid:134e9f30-05b3-4c2c-8ea5-b587606a6022,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"88d557e884aa4bfe4999d4ee091a87a5bc1cc01a927c04f79fdbe81936cbb08b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.189751 kubelet[2591]: E0325 01:11:04.189465 2591 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88d557e884aa4bfe4999d4ee091a87a5bc1cc01a927c04f79fdbe81936cbb08b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.189751 kubelet[2591]: E0325 01:11:04.189515 2591 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88d557e884aa4bfe4999d4ee091a87a5bc1cc01a927c04f79fdbe81936cbb08b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rm4dd" Mar 25 01:11:04.189751 kubelet[2591]: E0325 01:11:04.189534 2591 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88d557e884aa4bfe4999d4ee091a87a5bc1cc01a927c04f79fdbe81936cbb08b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-rm4dd" Mar 25 01:11:04.189883 containerd[1474]: time="2025-03-25T01:11:04.189518091Z" level=error msg="Failed to destroy network for sandbox \"3f29de5a9827934c72e8020273d86b040126a15cc6b2ab326ead9fbbaa8644e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.189911 kubelet[2591]: E0325 01:11:04.189562 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-rm4dd_calico-system(134e9f30-05b3-4c2c-8ea5-b587606a6022)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-rm4dd_calico-system(134e9f30-05b3-4c2c-8ea5-b587606a6022)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88d557e884aa4bfe4999d4ee091a87a5bc1cc01a927c04f79fdbe81936cbb08b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-rm4dd" podUID="134e9f30-05b3-4c2c-8ea5-b587606a6022" Mar 25 01:11:04.192057 containerd[1474]: time="2025-03-25T01:11:04.192022411Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-78fbfcd757-szsdv,Uid:d3ecef61-e163-47cc-b4ee-25518b9134d1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f29de5a9827934c72e8020273d86b040126a15cc6b2ab326ead9fbbaa8644e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.192526 kubelet[2591]: E0325 01:11:04.192175 2591 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f29de5a9827934c72e8020273d86b040126a15cc6b2ab326ead9fbbaa8644e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.192526 kubelet[2591]: E0325 01:11:04.192210 2591 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f29de5a9827934c72e8020273d86b040126a15cc6b2ab326ead9fbbaa8644e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-78fbfcd757-szsdv" Mar 25 01:11:04.192526 kubelet[2591]: E0325 01:11:04.192226 2591 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3f29de5a9827934c72e8020273d86b040126a15cc6b2ab326ead9fbbaa8644e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-78fbfcd757-szsdv" Mar 25 01:11:04.192627 kubelet[2591]: E0325 01:11:04.192253 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-78fbfcd757-szsdv_calico-system(d3ecef61-e163-47cc-b4ee-25518b9134d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-78fbfcd757-szsdv_calico-system(d3ecef61-e163-47cc-b4ee-25518b9134d1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3f29de5a9827934c72e8020273d86b040126a15cc6b2ab326ead9fbbaa8644e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-78fbfcd757-szsdv" podUID="d3ecef61-e163-47cc-b4ee-25518b9134d1" Mar 25 01:11:04.196599 containerd[1474]: time="2025-03-25T01:11:04.196567371Z" level=error msg="Failed to destroy network for sandbox \"496fd6e8d1218207c821f3f410824ebb28282259f64a521da5d980a051965b25\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.196772 containerd[1474]: time="2025-03-25T01:11:04.196709011Z" level=error msg="Failed to destroy network for sandbox \"359fb8a869e382781a14db6293c84cf053afb14da80c72b0d463578ec127e6dd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 25 01:11:04.197629 containerd[1474]: time="2025-03-25T01:11:04.197598291Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hg98s,Uid:2ede0a9d-37ce-445e-ae94-9562960117af,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"496fd6e8d1218207c821f3f410824ebb28282259f64a521da5d980a051965b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.198171 kubelet[2591]: E0325 01:11:04.197879 2591 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"496fd6e8d1218207c821f3f410824ebb28282259f64a521da5d980a051965b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.198171 kubelet[2591]: E0325 01:11:04.197917 2591 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"496fd6e8d1218207c821f3f410824ebb28282259f64a521da5d980a051965b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hg98s" Mar 25 01:11:04.198171 kubelet[2591]: E0325 01:11:04.197932 2591 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"496fd6e8d1218207c821f3f410824ebb28282259f64a521da5d980a051965b25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-hg98s" Mar 25 01:11:04.198297 kubelet[2591]: E0325 01:11:04.197970 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-hg98s_kube-system(2ede0a9d-37ce-445e-ae94-9562960117af)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-hg98s_kube-system(2ede0a9d-37ce-445e-ae94-9562960117af)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"496fd6e8d1218207c821f3f410824ebb28282259f64a521da5d980a051965b25\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-hg98s" podUID="2ede0a9d-37ce-445e-ae94-9562960117af" Mar 25 01:11:04.198358 containerd[1474]: time="2025-03-25T01:11:04.198284092Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g7msf,Uid:65f62a1e-a36a-47c3-8c4b-1dbed6b93395,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"359fb8a869e382781a14db6293c84cf053afb14da80c72b0d463578ec127e6dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.198548 kubelet[2591]: E0325 01:11:04.198525 2591 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown 
desc = failed to setup network for sandbox \"359fb8a869e382781a14db6293c84cf053afb14da80c72b0d463578ec127e6dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:11:04.198680 kubelet[2591]: E0325 01:11:04.198652 2591 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"359fb8a869e382781a14db6293c84cf053afb14da80c72b0d463578ec127e6dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-g7msf" Mar 25 01:11:04.198824 kubelet[2591]: E0325 01:11:04.198724 2591 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"359fb8a869e382781a14db6293c84cf053afb14da80c72b0d463578ec127e6dd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-g7msf" Mar 25 01:11:04.198824 kubelet[2591]: E0325 01:11:04.198763 2591 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-g7msf_kube-system(65f62a1e-a36a-47c3-8c4b-1dbed6b93395)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-g7msf_kube-system(65f62a1e-a36a-47c3-8c4b-1dbed6b93395)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"359fb8a869e382781a14db6293c84cf053afb14da80c72b0d463578ec127e6dd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-g7msf" podUID="65f62a1e-a36a-47c3-8c4b-1dbed6b93395" Mar 25 01:11:04.882148 systemd[1]: run-netns-cni\x2d52d66830\x2db575\x2dc318\x2d1892\x2de83c0236b9af.mount: Deactivated successfully. Mar 25 01:11:04.882243 systemd[1]: run-netns-cni\x2db5f991be\x2d5b3d\x2db646\x2d6257\x2d5eab56e31604.mount: Deactivated successfully. Mar 25 01:11:04.882290 systemd[1]: run-netns-cni\x2db3c2f534\x2d090a\x2d5326\x2d8bde\x2de590f5175610.mount: Deactivated successfully. Mar 25 01:11:04.882334 systemd[1]: run-netns-cni\x2d6179347f\x2d64d0\x2d3811\x2dd21b\x2d70506fde2ec1.mount: Deactivated successfully. Mar 25 01:11:04.882377 systemd[1]: run-netns-cni\x2df9963e75\x2dd521\x2dc79b\x2d6abc\x2df1b723de5c8e.mount: Deactivated successfully. Mar 25 01:11:07.586949 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount902779822.mount: Deactivated successfully. 
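Every sandbox failure above reduces to the same precondition spelled out in the error text: the Calico CNI plugin stats /var/lib/calico/nodename, a file that calico/node writes only once it is running with /var/lib/calico/ mounted. A minimal Go sketch of that readiness check (an illustration of the logged behaviour, not Calico's actual source):

package main

import (
	"fmt"
	"os"
)

// Illustrative only: mirrors the check implied by the CNI errors above.
func calicoReady() error {
	const nodenameFile = "/var/lib/calico/nodename" // path quoted in the log errors
	if _, err := os.Stat(nodenameFile); err != nil {
		return fmt.Errorf("plugin type=%q failed (add): %w: check that the calico/node container is running and has mounted /var/lib/calico/", "calico", err)
	}
	return nil
}

func main() {
	if err := calicoReady(); err != nil {
		fmt.Println(err) // kubelet keeps retrying CreatePodSandbox until this succeeds
		return
	}
	fmt.Println("nodename file present; CNI ADD can proceed")
}

Once calico-node-gnbvr is up a few seconds later, the retried RunPodSandbox calls for these pods succeed, as the "returns sandbox id" entries further down show.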
Mar 25 01:11:07.954342 containerd[1474]: time="2025-03-25T01:11:07.954294139Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:07.955292 containerd[1474]: time="2025-03-25T01:11:07.954920219Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 25 01:11:07.955624 containerd[1474]: time="2025-03-25T01:11:07.955578459Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:07.957277 containerd[1474]: time="2025-03-25T01:11:07.957217499Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:07.957726 containerd[1474]: time="2025-03-25T01:11:07.957692899Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 3.851292335s" Mar 25 01:11:07.957773 containerd[1474]: time="2025-03-25T01:11:07.957728459Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 25 01:11:07.966786 containerd[1474]: time="2025-03-25T01:11:07.966189380Z" level=info msg="CreateContainer within sandbox \"94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:11:07.977470 containerd[1474]: time="2025-03-25T01:11:07.977441941Z" level=info msg="Container 06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:07.987118 containerd[1474]: time="2025-03-25T01:11:07.987080421Z" level=info msg="CreateContainer within sandbox \"94fdf550acb51321ba180aa761734dc25cbb67eea5df9585d0d220951ba78552\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341\"" Mar 25 01:11:07.988356 containerd[1474]: time="2025-03-25T01:11:07.988328861Z" level=info msg="StartContainer for \"06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341\"" Mar 25 01:11:07.989736 containerd[1474]: time="2025-03-25T01:11:07.989697781Z" level=info msg="connecting to shim 06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341" address="unix:///run/containerd/s/8cde3f311ab2c2eb29743bbe78365616eeab0ac96a3c3842adce2c666e0f8fce" protocol=ttrpc version=3 Mar 25 01:11:08.013636 systemd[1]: Started cri-containerd-06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341.scope - libcontainer container 06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341. 
Mar 25 01:11:08.095151 containerd[1474]: time="2025-03-25T01:11:08.095112988Z" level=info msg="StartContainer for \"06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341\" returns successfully" Mar 25 01:11:08.138347 kubelet[2591]: I0325 01:11:08.138065 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-gnbvr" podStartSLOduration=0.816401844 podStartE2EDuration="14.138050791s" podCreationTimestamp="2025-03-25 01:10:54 +0000 UTC" firstStartedPulling="2025-03-25 01:10:54.636682192 +0000 UTC m=+13.741659527" lastFinishedPulling="2025-03-25 01:11:07.958331139 +0000 UTC m=+27.063308474" observedRunningTime="2025-03-25 01:11:08.137522871 +0000 UTC m=+27.242500206" watchObservedRunningTime="2025-03-25 01:11:08.138050791 +0000 UTC m=+27.243028086" Mar 25 01:11:08.258788 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:11:08.258893 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 25 01:11:09.127795 kubelet[2591]: I0325 01:11:09.127761 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:09.378963 containerd[1474]: time="2025-03-25T01:11:09.378867351Z" level=info msg="TaskExit event in podsandbox handler container_id:\"06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341\" id:\"68bcfbfbee4f5c2273baff15ccd54487cd3ddaaaadd011b3fd352e638db5137d\" pid:3671 exit_status:1 exited_at:{seconds:1742865069 nanos:378528071}" Mar 25 01:11:09.472601 containerd[1474]: time="2025-03-25T01:11:09.472553556Z" level=info msg="TaskExit event in podsandbox handler container_id:\"06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341\" id:\"a610e9f5ba452f4a24562f8c32e24015ed3b5886c943ce86c3e235e8ca33b8ef\" pid:3695 exit_status:1 exited_at:{seconds:1742865069 nanos:472230676}" Mar 25 01:11:09.616088 systemd[1]: Started sshd@7-10.0.0.25:22-10.0.0.1:38450.service - OpenSSH per-connection server daemon (10.0.0.1:38450). Mar 25 01:11:09.694257 sshd[3809]: Accepted publickey for core from 10.0.0.1 port 38450 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:09.695898 sshd-session[3809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:09.699552 systemd-logind[1458]: New session 8 of user core. Mar 25 01:11:09.716620 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 25 01:11:09.850678 sshd[3811]: Connection closed by 10.0.0.1 port 38450 Mar 25 01:11:09.851044 sshd-session[3809]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:09.854190 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:11:09.855526 systemd-logind[1458]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:11:09.855755 systemd[1]: sshd@7-10.0.0.25:22-10.0.0.1:38450.service: Deactivated successfully. Mar 25 01:11:09.859849 systemd-logind[1458]: Removed session 8. Mar 25 01:11:10.199933 containerd[1474]: time="2025-03-25T01:11:10.199696280Z" level=info msg="TaskExit event in podsandbox handler container_id:\"06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341\" id:\"f110854cf4332ec670ee9991d3fc716df4e18295e43eef3ed1d954480ee3caa2\" pid:3835 exit_status:1 exited_at:{seconds:1742865070 nanos:199350400}" Mar 25 01:11:14.865244 systemd[1]: Started sshd@8-10.0.0.25:22-10.0.0.1:50264.service - OpenSSH per-connection server daemon (10.0.0.1:50264). 
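The pod_startup_latency_tracker entry above for calico-node-gnbvr is internally consistent: the reported SLO duration equals the E2E duration minus the time spent pulling the calico/node image. A quick check of that arithmetic in Go (the formula is inferred from these fields, not taken from kubelet source):

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	// Timestamps and duration copied from the log entry above.
	firstPull, _ := time.Parse(layout, "2025-03-25 01:10:54.636682192 +0000 UTC")
	lastPull, _ := time.Parse(layout, "2025-03-25 01:11:07.958331139 +0000 UTC")
	e2e, _ := time.ParseDuration("14.138050791s") // podStartE2EDuration

	pull := lastPull.Sub(firstPull)
	fmt.Println(pull, e2e-pull) // 13.321648947s 816.401844ms — the logged podStartSLOduration
}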
Mar 25 01:11:14.917439 sshd[3971]: Accepted publickey for core from 10.0.0.1 port 50264 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:14.918814 sshd-session[3971]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:14.923149 systemd-logind[1458]: New session 9 of user core. Mar 25 01:11:14.933587 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:11:14.973257 containerd[1474]: time="2025-03-25T01:11:14.973225720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hg98s,Uid:2ede0a9d-37ce-445e-ae94-9562960117af,Namespace:kube-system,Attempt:0,}" Mar 25 01:11:15.068364 sshd[3973]: Connection closed by 10.0.0.1 port 50264 Mar 25 01:11:15.068727 sshd-session[3971]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:15.072648 systemd[1]: sshd@8-10.0.0.25:22-10.0.0.1:50264.service: Deactivated successfully. Mar 25 01:11:15.074910 systemd[1]: session-9.scope: Deactivated successfully. Mar 25 01:11:15.076368 systemd-logind[1458]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:11:15.077313 systemd-logind[1458]: Removed session 9. Mar 25 01:11:15.192673 systemd-networkd[1395]: cali44734bf112f: Link UP Mar 25 01:11:15.192879 systemd-networkd[1395]: cali44734bf112f: Gained carrier Mar 25 01:11:15.207036 containerd[1474]: 2025-03-25 01:11:15.006 [INFO][3975] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:11:15.207036 containerd[1474]: 2025-03-25 01:11:15.050 [INFO][3975] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--hg98s-eth0 coredns-668d6bf9bc- kube-system 2ede0a9d-37ce-445e-ae94-9562960117af 653 0 2025-03-25 01:10:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-hg98s eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali44734bf112f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-" Mar 25 01:11:15.207036 containerd[1474]: 2025-03-25 01:11:15.050 [INFO][3975] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" Mar 25 01:11:15.207036 containerd[1474]: 2025-03-25 01:11:15.145 [INFO][3999] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" HandleID="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Workload="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.156 [INFO][3999] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" HandleID="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Workload="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c03f0), 
Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-hg98s", "timestamp":"2025-03-25 01:11:15.145718248 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.156 [INFO][3999] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.157 [INFO][3999] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.157 [INFO][3999] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.158 [INFO][3999] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" host="localhost" Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.163 [INFO][3999] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.166 [INFO][3999] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.168 [INFO][3999] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.171 [INFO][3999] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:15.207270 containerd[1474]: 2025-03-25 01:11:15.171 [INFO][3999] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" host="localhost" Mar 25 01:11:15.207491 containerd[1474]: 2025-03-25 01:11:15.172 [INFO][3999] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e Mar 25 01:11:15.207491 containerd[1474]: 2025-03-25 01:11:15.176 [INFO][3999] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" host="localhost" Mar 25 01:11:15.207491 containerd[1474]: 2025-03-25 01:11:15.180 [INFO][3999] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" host="localhost" Mar 25 01:11:15.207491 containerd[1474]: 2025-03-25 01:11:15.180 [INFO][3999] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" host="localhost" Mar 25 01:11:15.207491 containerd[1474]: 2025-03-25 01:11:15.180 [INFO][3999] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:11:15.207491 containerd[1474]: 2025-03-25 01:11:15.180 [INFO][3999] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" HandleID="k8s-pod-network.de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Workload="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" Mar 25 01:11:15.207602 containerd[1474]: 2025-03-25 01:11:15.183 [INFO][3975] cni-plugin/k8s.go 386: Populated endpoint ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--hg98s-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2ede0a9d-37ce-445e-ae94-9562960117af", ResourceVersion:"653", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-hg98s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44734bf112f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:15.207655 containerd[1474]: 2025-03-25 01:11:15.183 [INFO][3975] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" Mar 25 01:11:15.207655 containerd[1474]: 2025-03-25 01:11:15.183 [INFO][3975] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali44734bf112f ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" Mar 25 01:11:15.207655 containerd[1474]: 2025-03-25 01:11:15.192 [INFO][3975] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" Mar 25 01:11:15.207723 containerd[1474]: 2025-03-25 01:11:15.193 
[INFO][3975] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--hg98s-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"2ede0a9d-37ce-445e-ae94-9562960117af", ResourceVersion:"653", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e", Pod:"coredns-668d6bf9bc-hg98s", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44734bf112f", MAC:"ce:60:37:d5:f1:9c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:15.207723 containerd[1474]: 2025-03-25 01:11:15.205 [INFO][3975] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" Namespace="kube-system" Pod="coredns-668d6bf9bc-hg98s" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--hg98s-eth0" Mar 25 01:11:15.252290 containerd[1474]: time="2025-03-25T01:11:15.252248452Z" level=info msg="connecting to shim de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e" address="unix:///run/containerd/s/d392b635fda0eb47ba6772c02fd9d20bb2919b1ca51910772dd90dff9c2c34e1" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:11:15.277590 systemd[1]: Started cri-containerd-de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e.scope - libcontainer container de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e. 
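In the WorkloadEndpoint dump above the CoreDNS ports reappear as Go hex literals; decoding them confirms they are the same dns/dns-tcp/metrics ports listed at the top of the block (a trivial readability aid, nothing more):

package main

import "fmt"

func main() {
	// Port values as printed in the endpoint struct above.
	fmt.Println("dns:", 0x35, "dns-tcp:", 0x35, "metrics:", 0x23c1) // 53, 53, 9153
}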
Mar 25 01:11:15.289863 systemd-resolved[1320]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:11:15.309823 containerd[1474]: time="2025-03-25T01:11:15.309785814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-hg98s,Uid:2ede0a9d-37ce-445e-ae94-9562960117af,Namespace:kube-system,Attempt:0,} returns sandbox id \"de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e\"" Mar 25 01:11:15.312153 containerd[1474]: time="2025-03-25T01:11:15.312127015Z" level=info msg="CreateContainer within sandbox \"de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:11:15.320672 containerd[1474]: time="2025-03-25T01:11:15.319982735Z" level=info msg="Container 69e3b9f4b14cdfcc2c15abc5537e98af0553b2afff27de1d523b48f54bbaadfb: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:15.322583 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4022718495.mount: Deactivated successfully. Mar 25 01:11:15.326046 containerd[1474]: time="2025-03-25T01:11:15.326001695Z" level=info msg="CreateContainer within sandbox \"de078038aef27d1714b7f272d109dbf51005947cd1bef3579217257fba8fe65e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"69e3b9f4b14cdfcc2c15abc5537e98af0553b2afff27de1d523b48f54bbaadfb\"" Mar 25 01:11:15.326536 containerd[1474]: time="2025-03-25T01:11:15.326364615Z" level=info msg="StartContainer for \"69e3b9f4b14cdfcc2c15abc5537e98af0553b2afff27de1d523b48f54bbaadfb\"" Mar 25 01:11:15.327114 containerd[1474]: time="2025-03-25T01:11:15.327088295Z" level=info msg="connecting to shim 69e3b9f4b14cdfcc2c15abc5537e98af0553b2afff27de1d523b48f54bbaadfb" address="unix:///run/containerd/s/d392b635fda0eb47ba6772c02fd9d20bb2919b1ca51910772dd90dff9c2c34e1" protocol=ttrpc version=3 Mar 25 01:11:15.342573 systemd[1]: Started cri-containerd-69e3b9f4b14cdfcc2c15abc5537e98af0553b2afff27de1d523b48f54bbaadfb.scope - libcontainer container 69e3b9f4b14cdfcc2c15abc5537e98af0553b2afff27de1d523b48f54bbaadfb. 
Mar 25 01:11:15.369905 containerd[1474]: time="2025-03-25T01:11:15.369844777Z" level=info msg="StartContainer for \"69e3b9f4b14cdfcc2c15abc5537e98af0553b2afff27de1d523b48f54bbaadfb\" returns successfully" Mar 25 01:11:15.973772 containerd[1474]: time="2025-03-25T01:11:15.973588202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rm4dd,Uid:134e9f30-05b3-4c2c-8ea5-b587606a6022,Namespace:calico-system,Attempt:0,}" Mar 25 01:11:15.973772 containerd[1474]: time="2025-03-25T01:11:15.973588122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-8kksr,Uid:3a461ca2-f983-4fd2-a905-67ee0b4f5a3b,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:11:15.973772 containerd[1474]: time="2025-03-25T01:11:15.973671002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78fbfcd757-szsdv,Uid:d3ecef61-e163-47cc-b4ee-25518b9134d1,Namespace:calico-system,Attempt:0,}" Mar 25 01:11:16.095915 systemd-networkd[1395]: cali6d8c023f2e8: Link UP Mar 25 01:11:16.096093 systemd-networkd[1395]: cali6d8c023f2e8: Gained carrier Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.008 [INFO][4138] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.026 [INFO][4138] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0 calico-apiserver-5b89686bfb- calico-apiserver 3a461ca2-f983-4fd2-a905-67ee0b4f5a3b 657 0 2025-03-25 01:10:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5b89686bfb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5b89686bfb-8kksr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali6d8c023f2e8 [] []}} ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.026 [INFO][4138] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.054 [INFO][4169] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" HandleID="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Workload="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.068 [INFO][4169] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" HandleID="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Workload="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003720e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5b89686bfb-8kksr", 
"timestamp":"2025-03-25 01:11:16.054225365 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.068 [INFO][4169] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.068 [INFO][4169] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.068 [INFO][4169] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.070 [INFO][4169] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.074 [INFO][4169] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.078 [INFO][4169] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.080 [INFO][4169] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.082 [INFO][4169] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.082 [INFO][4169] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.083 [INFO][4169] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173 Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.087 [INFO][4169] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.091 [INFO][4169] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.091 [INFO][4169] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" host="localhost" Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.091 [INFO][4169] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:11:16.111127 containerd[1474]: 2025-03-25 01:11:16.091 [INFO][4169] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" HandleID="k8s-pod-network.ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Workload="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" Mar 25 01:11:16.111712 containerd[1474]: 2025-03-25 01:11:16.093 [INFO][4138] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0", GenerateName:"calico-apiserver-5b89686bfb-", Namespace:"calico-apiserver", SelfLink:"", UID:"3a461ca2-f983-4fd2-a905-67ee0b4f5a3b", ResourceVersion:"657", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b89686bfb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5b89686bfb-8kksr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6d8c023f2e8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:16.111712 containerd[1474]: 2025-03-25 01:11:16.094 [INFO][4138] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" Mar 25 01:11:16.111712 containerd[1474]: 2025-03-25 01:11:16.094 [INFO][4138] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6d8c023f2e8 ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" Mar 25 01:11:16.111712 containerd[1474]: 2025-03-25 01:11:16.096 [INFO][4138] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" Mar 25 01:11:16.111712 containerd[1474]: 2025-03-25 01:11:16.096 [INFO][4138] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0", GenerateName:"calico-apiserver-5b89686bfb-", Namespace:"calico-apiserver", SelfLink:"", UID:"3a461ca2-f983-4fd2-a905-67ee0b4f5a3b", ResourceVersion:"657", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b89686bfb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173", Pod:"calico-apiserver-5b89686bfb-8kksr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali6d8c023f2e8", MAC:"52:c6:fc:44:12:52", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:16.111712 containerd[1474]: 2025-03-25 01:11:16.109 [INFO][4138] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-8kksr" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--8kksr-eth0" Mar 25 01:11:16.138380 containerd[1474]: time="2025-03-25T01:11:16.137931608Z" level=info msg="connecting to shim ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173" address="unix:///run/containerd/s/ef6019b02b27c27d4b5d0e45709a82910f00ef3423aa6149bcf6182731319057" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:11:16.157539 kubelet[2591]: I0325 01:11:16.157120 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-hg98s" podStartSLOduration=29.157051289 podStartE2EDuration="29.157051289s" podCreationTimestamp="2025-03-25 01:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:11:16.156903169 +0000 UTC m=+35.261880504" watchObservedRunningTime="2025-03-25 01:11:16.157051289 +0000 UTC m=+35.262028664" Mar 25 01:11:16.164624 systemd[1]: Started cri-containerd-ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173.scope - libcontainer container ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173. 
Mar 25 01:11:16.185276 systemd-resolved[1320]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:11:16.195634 kubelet[2591]: I0325 01:11:16.195529 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:16.222487 containerd[1474]: time="2025-03-25T01:11:16.222295852Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-8kksr,Uid:3a461ca2-f983-4fd2-a905-67ee0b4f5a3b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173\"" Mar 25 01:11:16.229801 containerd[1474]: time="2025-03-25T01:11:16.228223332Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:11:16.233553 systemd-networkd[1395]: cali7c4b88124d0: Link UP Mar 25 01:11:16.234764 systemd-networkd[1395]: cali7c4b88124d0: Gained carrier Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.005 [INFO][4141] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.022 [INFO][4141] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0 calico-kube-controllers-78fbfcd757- calico-system d3ecef61-e163-47cc-b4ee-25518b9134d1 660 0 2025-03-25 01:10:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:78fbfcd757 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-78fbfcd757-szsdv eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7c4b88124d0 [] []}} ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.022 [INFO][4141] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.063 [INFO][4176] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" HandleID="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Workload="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.073 [INFO][4176] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" HandleID="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Workload="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d8950), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-78fbfcd757-szsdv", "timestamp":"2025-03-25 01:11:16.063556206 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.073 [INFO][4176] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.091 [INFO][4176] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.092 [INFO][4176] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.172 [INFO][4176] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.183 [INFO][4176] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.189 [INFO][4176] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.194 [INFO][4176] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.197 [INFO][4176] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.197 [INFO][4176] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.199 [INFO][4176] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.205 [INFO][4176] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.222 [INFO][4176] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.222 [INFO][4176] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" host="localhost" Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.222 [INFO][4176] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:11:16.247288 containerd[1474]: 2025-03-25 01:11:16.222 [INFO][4176] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" HandleID="k8s-pod-network.7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Workload="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" Mar 25 01:11:16.247934 containerd[1474]: 2025-03-25 01:11:16.228 [INFO][4141] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0", GenerateName:"calico-kube-controllers-78fbfcd757-", Namespace:"calico-system", SelfLink:"", UID:"d3ecef61-e163-47cc-b4ee-25518b9134d1", ResourceVersion:"660", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"78fbfcd757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-78fbfcd757-szsdv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7c4b88124d0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:16.247934 containerd[1474]: 2025-03-25 01:11:16.230 [INFO][4141] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" Mar 25 01:11:16.247934 containerd[1474]: 2025-03-25 01:11:16.230 [INFO][4141] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7c4b88124d0 ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" Mar 25 01:11:16.247934 containerd[1474]: 2025-03-25 01:11:16.235 [INFO][4141] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" Mar 25 01:11:16.247934 containerd[1474]: 2025-03-25 01:11:16.236 [INFO][4141] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID 
to endpoint ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0", GenerateName:"calico-kube-controllers-78fbfcd757-", Namespace:"calico-system", SelfLink:"", UID:"d3ecef61-e163-47cc-b4ee-25518b9134d1", ResourceVersion:"660", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"78fbfcd757", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe", Pod:"calico-kube-controllers-78fbfcd757-szsdv", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7c4b88124d0", MAC:"be:12:42:36:81:ec", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:16.247934 containerd[1474]: 2025-03-25 01:11:16.245 [INFO][4141] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" Namespace="calico-system" Pod="calico-kube-controllers-78fbfcd757-szsdv" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--78fbfcd757--szsdv-eth0" Mar 25 01:11:16.265256 containerd[1474]: time="2025-03-25T01:11:16.265212733Z" level=info msg="connecting to shim 7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe" address="unix:///run/containerd/s/3fa4e9c88d2f0cf5b710ef18c6cfb1ea3671bb16ac1621c42feeb7b8ac389e52" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:11:16.283592 systemd[1]: Started cri-containerd-7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe.scope - libcontainer container 7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe. 
Mar 25 01:11:16.296347 systemd-resolved[1320]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:11:16.308474 systemd-networkd[1395]: cali4af4cbb5656: Link UP Mar 25 01:11:16.308651 systemd-networkd[1395]: cali4af4cbb5656: Gained carrier Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.009 [INFO][4128] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.028 [INFO][4128] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--rm4dd-eth0 csi-node-driver- calico-system 134e9f30-05b3-4c2c-8ea5-b587606a6022 577 0 2025-03-25 01:10:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:54877d75d5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-rm4dd eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali4af4cbb5656 [] []}} ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.031 [INFO][4128] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-eth0" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.070 [INFO][4177] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" HandleID="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Workload="localhost-k8s-csi--node--driver--rm4dd-eth0" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.081 [INFO][4177] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" HandleID="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Workload="localhost-k8s-csi--node--driver--rm4dd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400012acc0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-rm4dd", "timestamp":"2025-03-25 01:11:16.070267406 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.081 [INFO][4177] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.223 [INFO][4177] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.223 [INFO][4177] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.271 [INFO][4177] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.282 [INFO][4177] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.289 [INFO][4177] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.291 [INFO][4177] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.293 [INFO][4177] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.293 [INFO][4177] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.295 [INFO][4177] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.298 [INFO][4177] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.304 [INFO][4177] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.304 [INFO][4177] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" host="localhost" Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.304 [INFO][4177] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:11:16.324610 containerd[1474]: 2025-03-25 01:11:16.304 [INFO][4177] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" HandleID="k8s-pod-network.f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Workload="localhost-k8s-csi--node--driver--rm4dd-eth0" Mar 25 01:11:16.325310 containerd[1474]: 2025-03-25 01:11:16.306 [INFO][4128] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--rm4dd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"134e9f30-05b3-4c2c-8ea5-b587606a6022", ResourceVersion:"577", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-rm4dd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4af4cbb5656", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:16.325310 containerd[1474]: 2025-03-25 01:11:16.306 [INFO][4128] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-eth0" Mar 25 01:11:16.325310 containerd[1474]: 2025-03-25 01:11:16.306 [INFO][4128] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4af4cbb5656 ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-eth0" Mar 25 01:11:16.325310 containerd[1474]: 2025-03-25 01:11:16.310 [INFO][4128] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-eth0" Mar 25 01:11:16.325310 containerd[1474]: 2025-03-25 01:11:16.310 [INFO][4128] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--rm4dd-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"134e9f30-05b3-4c2c-8ea5-b587606a6022", ResourceVersion:"577", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"54877d75d5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe", Pod:"csi-node-driver-rm4dd", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali4af4cbb5656", MAC:"ae:38:67:fb:6d:bb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:16.325310 containerd[1474]: 2025-03-25 01:11:16.322 [INFO][4128] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" Namespace="calico-system" Pod="csi-node-driver-rm4dd" WorkloadEndpoint="localhost-k8s-csi--node--driver--rm4dd-eth0" Mar 25 01:11:16.338162 containerd[1474]: time="2025-03-25T01:11:16.337890976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78fbfcd757-szsdv,Uid:d3ecef61-e163-47cc-b4ee-25518b9134d1,Namespace:calico-system,Attempt:0,} returns sandbox id \"7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe\"" Mar 25 01:11:16.363414 containerd[1474]: time="2025-03-25T01:11:16.363327217Z" level=info msg="connecting to shim f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe" address="unix:///run/containerd/s/b9c90d76492ed997bebfc5a32b5b10d391c4cb0952f0a821dde5b77e6f082b72" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:11:16.392628 systemd[1]: Started cri-containerd-f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe.scope - libcontainer container f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe. 
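Editor's note: each "connecting to shim ... address=\"unix:///run/containerd/s/...\" ... protocol=ttrpc version=3" entry records containerd dialing a per-sandbox shim over a unix socket before systemd tracks the resulting scope. As a rough illustration of what sits behind that address, the sketch below simply dials such a socket from Go; the real client layers a ttrpc session on top of the connection, which is omitted here, and the path (copied from the log) only exists on that host.

```go
// Sketch: open the kind of unix socket a containerd shim listens on.
// The path is the one printed in the journal above; it only exists on that
// host, and the real client speaks ttrpc over this connection, not raw bytes.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	const shimSock = "/run/containerd/s/b9c90d76492ed997bebfc5a32b5b10d391c4cb0952f0a821dde5b77e6f082b72"

	conn, err := net.DialTimeout("unix", shimSock, 2*time.Second)
	if err != nil {
		fmt.Println("dial failed (expected anywhere but the logged host):", err)
		return
	}
	defer conn.Close()
	fmt.Println("connected to shim socket:", conn.RemoteAddr())
}
```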
Mar 25 01:11:16.403227 systemd-resolved[1320]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:11:16.415747 containerd[1474]: time="2025-03-25T01:11:16.415641579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-rm4dd,Uid:134e9f30-05b3-4c2c-8ea5-b587606a6022,Namespace:calico-system,Attempt:0,} returns sandbox id \"f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe\"" Mar 25 01:11:16.857746 kernel: bpftool[4394]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:11:16.874623 systemd-networkd[1395]: cali44734bf112f: Gained IPv6LL Mar 25 01:11:17.066857 systemd-networkd[1395]: vxlan.calico: Link UP Mar 25 01:11:17.067037 systemd-networkd[1395]: vxlan.calico: Gained carrier Mar 25 01:11:17.195189 systemd-networkd[1395]: cali6d8c023f2e8: Gained IPv6LL Mar 25 01:11:17.578608 systemd-networkd[1395]: cali4af4cbb5656: Gained IPv6LL Mar 25 01:11:17.642544 systemd-networkd[1395]: cali7c4b88124d0: Gained IPv6LL Mar 25 01:11:17.942040 containerd[1474]: time="2025-03-25T01:11:17.941655236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:17.942610 containerd[1474]: time="2025-03-25T01:11:17.942558716Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 25 01:11:17.943670 containerd[1474]: time="2025-03-25T01:11:17.943618756Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:17.945693 containerd[1474]: time="2025-03-25T01:11:17.945639556Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:17.946413 containerd[1474]: time="2025-03-25T01:11:17.946291557Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 1.718029745s" Mar 25 01:11:17.946413 containerd[1474]: time="2025-03-25T01:11:17.946323717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:11:17.948225 containerd[1474]: time="2025-03-25T01:11:17.948192197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:11:17.949260 containerd[1474]: time="2025-03-25T01:11:17.949223397Z" level=info msg="CreateContainer within sandbox \"ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:11:17.956299 containerd[1474]: time="2025-03-25T01:11:17.956239237Z" level=info msg="Container a6ba2804a54a9664d9e7d77b35217e4b754305376f5665aa883eca8c3d276356: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:17.960402 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2800884779.mount: Deactivated successfully. 
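Editor's note: the containerd entries above use two time formats worth handling when post-processing this journal: RFC 3339 event timestamps with nanosecond precision (e.g. 2025-03-25T01:11:17.946291557Z) and Go-style duration strings (the apiserver image pull is reported as taking 1.718029745s). A small, generic parsing sketch using only the standard library, with values copied from the log:

```go
// Sketch: parse the timestamp and duration formats that appear in the
// containerd entries above. Values are copied verbatim from the journal.
package main

import (
	"fmt"
	"time"
)

func main() {
	ts, err := time.Parse(time.RFC3339Nano, "2025-03-25T01:11:17.946291557Z")
	if err != nil {
		panic(err)
	}

	pull, err := time.ParseDuration("1.718029745s") // "Pulled image ... in 1.718029745s"
	if err != nil {
		panic(err)
	}

	fmt.Println("pull finished at:", ts.UTC())
	// Roughly 01:11:16.228, close to the firstStartedPulling time
	// (01:11:16.227828612) that kubelet logs a little later.
	fmt.Println("pull started near:", ts.Add(-pull).UTC())
}
```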
Mar 25 01:11:17.963018 containerd[1474]: time="2025-03-25T01:11:17.962932797Z" level=info msg="CreateContainer within sandbox \"ef78d507fc7584fcdd2aed395f622fa82009a078de25e0554b2f7a461f375173\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a6ba2804a54a9664d9e7d77b35217e4b754305376f5665aa883eca8c3d276356\"" Mar 25 01:11:17.963735 containerd[1474]: time="2025-03-25T01:11:17.963708037Z" level=info msg="StartContainer for \"a6ba2804a54a9664d9e7d77b35217e4b754305376f5665aa883eca8c3d276356\"" Mar 25 01:11:17.964937 containerd[1474]: time="2025-03-25T01:11:17.964911877Z" level=info msg="connecting to shim a6ba2804a54a9664d9e7d77b35217e4b754305376f5665aa883eca8c3d276356" address="unix:///run/containerd/s/ef6019b02b27c27d4b5d0e45709a82910f00ef3423aa6149bcf6182731319057" protocol=ttrpc version=3 Mar 25 01:11:17.975737 containerd[1474]: time="2025-03-25T01:11:17.975698638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g7msf,Uid:65f62a1e-a36a-47c3-8c4b-1dbed6b93395,Namespace:kube-system,Attempt:0,}" Mar 25 01:11:17.988379 systemd[1]: Started cri-containerd-a6ba2804a54a9664d9e7d77b35217e4b754305376f5665aa883eca8c3d276356.scope - libcontainer container a6ba2804a54a9664d9e7d77b35217e4b754305376f5665aa883eca8c3d276356. Mar 25 01:11:18.081816 containerd[1474]: time="2025-03-25T01:11:18.081772641Z" level=info msg="StartContainer for \"a6ba2804a54a9664d9e7d77b35217e4b754305376f5665aa883eca8c3d276356\" returns successfully" Mar 25 01:11:18.185733 kubelet[2591]: I0325 01:11:18.184704 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5b89686bfb-8kksr" podStartSLOduration=23.46536334 podStartE2EDuration="25.184676485s" podCreationTimestamp="2025-03-25 01:10:53 +0000 UTC" firstStartedPulling="2025-03-25 01:11:16.227828612 +0000 UTC m=+35.332805947" lastFinishedPulling="2025-03-25 01:11:17.947141757 +0000 UTC m=+37.052119092" observedRunningTime="2025-03-25 01:11:18.183613445 +0000 UTC m=+37.288590740" watchObservedRunningTime="2025-03-25 01:11:18.184676485 +0000 UTC m=+37.289653820" Mar 25 01:11:18.216170 systemd-networkd[1395]: caliab622c899c3: Link UP Mar 25 01:11:18.217478 systemd-networkd[1395]: caliab622c899c3: Gained carrier Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.025 [INFO][4533] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--g7msf-eth0 coredns-668d6bf9bc- kube-system 65f62a1e-a36a-47c3-8c4b-1dbed6b93395 659 0 2025-03-25 01:10:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-g7msf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliab622c899c3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.025 [INFO][4533] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" Mar 25 01:11:18.234088 containerd[1474]: 
2025-03-25 01:11:18.071 [INFO][4564] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" HandleID="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Workload="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.083 [INFO][4564] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" HandleID="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Workload="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000313480), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-g7msf", "timestamp":"2025-03-25 01:11:18.071161721 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.084 [INFO][4564] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.084 [INFO][4564] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.084 [INFO][4564] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.086 [INFO][4564] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.182 [INFO][4564] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.191 [INFO][4564] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.194 [INFO][4564] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.197 [INFO][4564] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.197 [INFO][4564] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.199 [INFO][4564] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6 Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.204 [INFO][4564] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.209 [INFO][4564] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.209 [INFO][4564] 
ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" host="localhost" Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.209 [INFO][4564] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:11:18.234088 containerd[1474]: 2025-03-25 01:11:18.209 [INFO][4564] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" HandleID="k8s-pod-network.22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Workload="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" Mar 25 01:11:18.234606 containerd[1474]: 2025-03-25 01:11:18.212 [INFO][4533] cni-plugin/k8s.go 386: Populated endpoint ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--g7msf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"65f62a1e-a36a-47c3-8c4b-1dbed6b93395", ResourceVersion:"659", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-g7msf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliab622c899c3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:18.234606 containerd[1474]: 2025-03-25 01:11:18.212 [INFO][4533] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" Mar 25 01:11:18.234606 containerd[1474]: 2025-03-25 01:11:18.212 [INFO][4533] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab622c899c3 ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" Mar 25 01:11:18.234606 containerd[1474]: 2025-03-25 01:11:18.217 [INFO][4533] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" Mar 25 01:11:18.234606 containerd[1474]: 2025-03-25 01:11:18.218 [INFO][4533] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--g7msf-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"65f62a1e-a36a-47c3-8c4b-1dbed6b93395", ResourceVersion:"659", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6", Pod:"coredns-668d6bf9bc-g7msf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliab622c899c3", MAC:"d2:8a:8f:7e:56:dc", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:18.234606 containerd[1474]: 2025-03-25 01:11:18.228 [INFO][4533] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" Namespace="kube-system" Pod="coredns-668d6bf9bc-g7msf" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--g7msf-eth0" Mar 25 01:11:18.252287 containerd[1474]: time="2025-03-25T01:11:18.252242527Z" level=info msg="connecting to shim 22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6" address="unix:///run/containerd/s/3a082a45d95efbe87c44ab7a796c8b78b8e8a68c0ebff3e5647da14f98102367" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:11:18.273617 systemd[1]: Started cri-containerd-22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6.scope - libcontainer container 22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6. 
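Editor's note: in the coredns endpoint dumped above, ports are printed as Go struct literals with hexadecimal values: Port:0x35 for the dns and dns-tcp entries and Port:0x23c1 for metrics, i.e. 53 and 9153 in decimal. A tiny sketch with a stand-in struct (not Calico's actual v3 type) makes the conversion explicit:

```go
// Sketch: the hex port values in the logged WorkloadEndpoint are ordinary
// uint16 ports. This struct is a stand-in, not Calico's v3 API type.
package main

import "fmt"

type endpointPort struct {
	Name     string
	Protocol string
	Port     uint16
}

func main() {
	ports := []endpointPort{
		{Name: "dns", Protocol: "UDP", Port: 0x35},       // 53
		{Name: "dns-tcp", Protocol: "TCP", Port: 0x35},   // 53
		{Name: "metrics", Protocol: "TCP", Port: 0x23c1}, // 9153
	}
	for _, p := range ports {
		fmt.Printf("%-8s %-4s %d\n", p.Name, p.Protocol, p.Port)
	}
}
```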
Mar 25 01:11:18.286957 systemd-resolved[1320]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:11:18.321591 containerd[1474]: time="2025-03-25T01:11:18.321538249Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-g7msf,Uid:65f62a1e-a36a-47c3-8c4b-1dbed6b93395,Namespace:kube-system,Attempt:0,} returns sandbox id \"22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6\"" Mar 25 01:11:18.324395 containerd[1474]: time="2025-03-25T01:11:18.324250450Z" level=info msg="CreateContainer within sandbox \"22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:11:18.331525 containerd[1474]: time="2025-03-25T01:11:18.331291450Z" level=info msg="Container 0bba94f391a5967d1db68894535c621e3b6f99253d3463f1dce205c5d6508c2f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:18.338226 containerd[1474]: time="2025-03-25T01:11:18.338191650Z" level=info msg="CreateContainer within sandbox \"22a1a502103f15f436afc0ccb6316509b174a7867edfc34fcf02b9275d886ab6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0bba94f391a5967d1db68894535c621e3b6f99253d3463f1dce205c5d6508c2f\"" Mar 25 01:11:18.341076 containerd[1474]: time="2025-03-25T01:11:18.341036010Z" level=info msg="StartContainer for \"0bba94f391a5967d1db68894535c621e3b6f99253d3463f1dce205c5d6508c2f\"" Mar 25 01:11:18.342444 containerd[1474]: time="2025-03-25T01:11:18.342391810Z" level=info msg="connecting to shim 0bba94f391a5967d1db68894535c621e3b6f99253d3463f1dce205c5d6508c2f" address="unix:///run/containerd/s/3a082a45d95efbe87c44ab7a796c8b78b8e8a68c0ebff3e5647da14f98102367" protocol=ttrpc version=3 Mar 25 01:11:18.369595 systemd[1]: Started cri-containerd-0bba94f391a5967d1db68894535c621e3b6f99253d3463f1dce205c5d6508c2f.scope - libcontainer container 0bba94f391a5967d1db68894535c621e3b6f99253d3463f1dce205c5d6508c2f. 
Mar 25 01:11:18.401749 containerd[1474]: time="2025-03-25T01:11:18.401712012Z" level=info msg="StartContainer for \"0bba94f391a5967d1db68894535c621e3b6f99253d3463f1dce205c5d6508c2f\" returns successfully" Mar 25 01:11:18.974571 containerd[1474]: time="2025-03-25T01:11:18.974284512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-smsq9,Uid:c3cbab45-79c8-427a-87cc-52cefa199441,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:11:18.990594 systemd-networkd[1395]: vxlan.calico: Gained IPv6LL Mar 25 01:11:19.121452 systemd-networkd[1395]: calic3bf61713f9: Link UP Mar 25 01:11:19.121768 systemd-networkd[1395]: calic3bf61713f9: Gained carrier Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.026 [INFO][4676] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0 calico-apiserver-5b89686bfb- calico-apiserver c3cbab45-79c8-427a-87cc-52cefa199441 658 0 2025-03-25 01:10:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5b89686bfb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5b89686bfb-smsq9 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic3bf61713f9 [] []}} ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.026 [INFO][4676] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.069 [INFO][4690] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" HandleID="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Workload="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.083 [INFO][4690] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" HandleID="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Workload="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004ca40), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5b89686bfb-smsq9", "timestamp":"2025-03-25 01:11:19.069474275 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.083 [INFO][4690] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.083 [INFO][4690] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.083 [INFO][4690] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.086 [INFO][4690] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.091 [INFO][4690] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.096 [INFO][4690] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.098 [INFO][4690] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.101 [INFO][4690] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.101 [INFO][4690] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.103 [INFO][4690] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.107 [INFO][4690] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.115 [INFO][4690] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.115 [INFO][4690] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" host="localhost" Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.115 [INFO][4690] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
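Editor's note: this is the third pass through the same IPAM sequence in this window; after 192.168.88.132 for csi-node-driver and .133 for coredns, the calico-apiserver pod receives .134, all drawn from the single affine block 192.168.88.128/26 that every entry references. A quick sketch checking that the block covers those addresses and how many addresses a /26 holds:

```go
// Sketch: sanity-check the addresses assigned in this journal against the
// affine block 192.168.88.128/26 referenced by every IPAM entry.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26")

	// A /26 spans 2^(32-26) = 64 addresses: 192.168.88.128 .. 192.168.88.191.
	fmt.Println("addresses in block:", 1<<(32-block.Bits()))

	for _, s := range []string{"192.168.88.132", "192.168.88.133", "192.168.88.134"} {
		fmt.Println(s, "in block:", block.Contains(netip.MustParseAddr(s)))
	}
}
```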
Mar 25 01:11:19.143842 containerd[1474]: 2025-03-25 01:11:19.115 [INFO][4690] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" HandleID="k8s-pod-network.bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Workload="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" Mar 25 01:11:19.144423 containerd[1474]: 2025-03-25 01:11:19.119 [INFO][4676] cni-plugin/k8s.go 386: Populated endpoint ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0", GenerateName:"calico-apiserver-5b89686bfb-", Namespace:"calico-apiserver", SelfLink:"", UID:"c3cbab45-79c8-427a-87cc-52cefa199441", ResourceVersion:"658", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b89686bfb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5b89686bfb-smsq9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic3bf61713f9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:19.144423 containerd[1474]: 2025-03-25 01:11:19.120 [INFO][4676] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" Mar 25 01:11:19.144423 containerd[1474]: 2025-03-25 01:11:19.120 [INFO][4676] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic3bf61713f9 ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" Mar 25 01:11:19.144423 containerd[1474]: 2025-03-25 01:11:19.122 [INFO][4676] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" Mar 25 01:11:19.144423 containerd[1474]: 2025-03-25 01:11:19.125 [INFO][4676] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0", GenerateName:"calico-apiserver-5b89686bfb-", Namespace:"calico-apiserver", SelfLink:"", UID:"c3cbab45-79c8-427a-87cc-52cefa199441", ResourceVersion:"658", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 10, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5b89686bfb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce", Pod:"calico-apiserver-5b89686bfb-smsq9", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic3bf61713f9", MAC:"32:cc:da:c5:cb:51", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:11:19.144423 containerd[1474]: 2025-03-25 01:11:19.138 [INFO][4676] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" Namespace="calico-apiserver" Pod="calico-apiserver-5b89686bfb-smsq9" WorkloadEndpoint="localhost-k8s-calico--apiserver--5b89686bfb--smsq9-eth0" Mar 25 01:11:19.176691 kubelet[2591]: I0325 01:11:19.176652 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:19.191290 kubelet[2591]: I0325 01:11:19.190580 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-g7msf" podStartSLOduration=32.190559559 podStartE2EDuration="32.190559559s" podCreationTimestamp="2025-03-25 01:10:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:11:19.190221039 +0000 UTC m=+38.295198374" watchObservedRunningTime="2025-03-25 01:11:19.190559559 +0000 UTC m=+38.295536854" Mar 25 01:11:19.208664 containerd[1474]: time="2025-03-25T01:11:19.208618359Z" level=info msg="connecting to shim bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce" address="unix:///run/containerd/s/41ac35548eb889a87bbad462ba5c8367c5b4828778135751bc82b68246a0d4d8" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:11:19.241702 systemd[1]: Started cri-containerd-bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce.scope - libcontainer container bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce. 
Mar 25 01:11:19.258273 systemd-resolved[1320]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:11:19.284269 containerd[1474]: time="2025-03-25T01:11:19.284228402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5b89686bfb-smsq9,Uid:c3cbab45-79c8-427a-87cc-52cefa199441,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce\"" Mar 25 01:11:19.291076 containerd[1474]: time="2025-03-25T01:11:19.291035362Z" level=info msg="CreateContainer within sandbox \"bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:11:19.302250 containerd[1474]: time="2025-03-25T01:11:19.299255082Z" level=info msg="Container ce00914d2dabcfa12274d7b090bb785f64ce7798327dd6bd2adfc6ccc5c53fa0: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:19.309385 containerd[1474]: time="2025-03-25T01:11:19.309320443Z" level=info msg="CreateContainer within sandbox \"bb84cbfbe5d9d49507d7974e356ab8da5c305b01cc63f0db5cb2d3e93c511dce\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ce00914d2dabcfa12274d7b090bb785f64ce7798327dd6bd2adfc6ccc5c53fa0\"" Mar 25 01:11:19.312667 containerd[1474]: time="2025-03-25T01:11:19.312626283Z" level=info msg="StartContainer for \"ce00914d2dabcfa12274d7b090bb785f64ce7798327dd6bd2adfc6ccc5c53fa0\"" Mar 25 01:11:19.313957 containerd[1474]: time="2025-03-25T01:11:19.313928723Z" level=info msg="connecting to shim ce00914d2dabcfa12274d7b090bb785f64ce7798327dd6bd2adfc6ccc5c53fa0" address="unix:///run/containerd/s/41ac35548eb889a87bbad462ba5c8367c5b4828778135751bc82b68246a0d4d8" protocol=ttrpc version=3 Mar 25 01:11:19.341017 systemd[1]: Started cri-containerd-ce00914d2dabcfa12274d7b090bb785f64ce7798327dd6bd2adfc6ccc5c53fa0.scope - libcontainer container ce00914d2dabcfa12274d7b090bb785f64ce7798327dd6bd2adfc6ccc5c53fa0. 
Mar 25 01:11:19.422994 containerd[1474]: time="2025-03-25T01:11:19.422880646Z" level=info msg="StartContainer for \"ce00914d2dabcfa12274d7b090bb785f64ce7798327dd6bd2adfc6ccc5c53fa0\" returns successfully" Mar 25 01:11:19.768344 containerd[1474]: time="2025-03-25T01:11:19.768287657Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:19.768804 containerd[1474]: time="2025-03-25T01:11:19.768719457Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 25 01:11:19.770114 containerd[1474]: time="2025-03-25T01:11:19.770074217Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:19.771890 containerd[1474]: time="2025-03-25T01:11:19.771849617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:19.772668 containerd[1474]: time="2025-03-25T01:11:19.772630577Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 1.8244023s" Mar 25 01:11:19.772668 containerd[1474]: time="2025-03-25T01:11:19.772664657Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 25 01:11:19.774300 containerd[1474]: time="2025-03-25T01:11:19.774272778Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:11:19.784345 containerd[1474]: time="2025-03-25T01:11:19.784303938Z" level=info msg="CreateContainer within sandbox \"7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:11:19.793120 containerd[1474]: time="2025-03-25T01:11:19.793065778Z" level=info msg="Container dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:19.800182 containerd[1474]: time="2025-03-25T01:11:19.800049818Z" level=info msg="CreateContainer within sandbox \"7c43ad221398b670042a2e33ed68b660db67c7062d783779b951694076bd5dbe\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54\"" Mar 25 01:11:19.800756 containerd[1474]: time="2025-03-25T01:11:19.800731058Z" level=info msg="StartContainer for \"dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54\"" Mar 25 01:11:19.801843 containerd[1474]: time="2025-03-25T01:11:19.801816578Z" level=info msg="connecting to shim dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54" address="unix:///run/containerd/s/3fa4e9c88d2f0cf5b710ef18c6cfb1ea3671bb16ac1621c42feeb7b8ac389e52" protocol=ttrpc version=3 Mar 25 01:11:19.827067 systemd[1]: Started cri-containerd-dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54.scope - libcontainer container 
dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54. Mar 25 01:11:19.879514 containerd[1474]: time="2025-03-25T01:11:19.879471061Z" level=info msg="StartContainer for \"dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54\" returns successfully" Mar 25 01:11:20.084803 systemd[1]: Started sshd@9-10.0.0.25:22-10.0.0.1:50266.service - OpenSSH per-connection server daemon (10.0.0.1:50266). Mar 25 01:11:20.158676 sshd[4834]: Accepted publickey for core from 10.0.0.1 port 50266 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:20.160323 sshd-session[4834]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:20.164872 systemd-logind[1458]: New session 10 of user core. Mar 25 01:11:20.172610 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:11:20.192458 kubelet[2591]: I0325 01:11:20.192396 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-78fbfcd757-szsdv" podStartSLOduration=22.75822631 podStartE2EDuration="26.192380031s" podCreationTimestamp="2025-03-25 01:10:54 +0000 UTC" firstStartedPulling="2025-03-25 01:11:16.339476656 +0000 UTC m=+35.444453951" lastFinishedPulling="2025-03-25 01:11:19.773630337 +0000 UTC m=+38.878607672" observedRunningTime="2025-03-25 01:11:20.192065111 +0000 UTC m=+39.297042446" watchObservedRunningTime="2025-03-25 01:11:20.192380031 +0000 UTC m=+39.297357366" Mar 25 01:11:20.203567 systemd-networkd[1395]: caliab622c899c3: Gained IPv6LL Mar 25 01:11:20.390443 sshd[4837]: Connection closed by 10.0.0.1 port 50266 Mar 25 01:11:20.390988 sshd-session[4834]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:20.394708 systemd-networkd[1395]: calic3bf61713f9: Gained IPv6LL Mar 25 01:11:20.398858 systemd[1]: sshd@9-10.0.0.25:22-10.0.0.1:50266.service: Deactivated successfully. Mar 25 01:11:20.400410 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:11:20.401099 systemd-logind[1458]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:11:20.402902 systemd[1]: Started sshd@10-10.0.0.25:22-10.0.0.1:50272.service - OpenSSH per-connection server daemon (10.0.0.1:50272). Mar 25 01:11:20.403825 systemd-logind[1458]: Removed session 10. Mar 25 01:11:20.462060 sshd[4854]: Accepted publickey for core from 10.0.0.1 port 50272 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:20.463349 sshd-session[4854]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:20.468349 systemd-logind[1458]: New session 11 of user core. Mar 25 01:11:20.477587 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 25 01:11:20.672729 sshd[4857]: Connection closed by 10.0.0.1 port 50272 Mar 25 01:11:20.672859 sshd-session[4854]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:20.686248 systemd[1]: sshd@10-10.0.0.25:22-10.0.0.1:50272.service: Deactivated successfully. Mar 25 01:11:20.692202 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:11:20.694474 systemd-logind[1458]: Session 11 logged out. Waiting for processes to exit. Mar 25 01:11:20.697717 systemd[1]: Started sshd@11-10.0.0.25:22-10.0.0.1:50284.service - OpenSSH per-connection server daemon (10.0.0.1:50284). Mar 25 01:11:20.699262 systemd-logind[1458]: Removed session 11. 
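Editor's note: for calico-kube-controllers the tracker above reports podStartSLOduration=22.75822631s and podStartE2EDuration=26.192380031s, alongside firstStartedPulling/lastFinishedPulling timestamps roughly 3.43s apart. The gap between the two durations matches that pull window, which is consistent with the SLO figure excluding image-pull time; that reading is inferred from these numbers, not a claim about kubelet internals. The arithmetic, as a sketch with values copied from the journal (monotonic-clock suffixes "m=+..." dropped):

```go
// Sketch: reproduce the duration arithmetic in the calico-kube-controllers
// pod_startup_latency_tracker entry above. Timestamps copied from the journal,
// with the monotonic-clock suffix (m=+...) removed before parsing.
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	firstPull := mustParse("2025-03-25 01:11:16.339476656 +0000 UTC")
	lastPull := mustParse("2025-03-25 01:11:19.773630337 +0000 UTC")

	e2e, _ := time.ParseDuration("26.192380031s") // podStartE2EDuration
	slo, _ := time.ParseDuration("22.75822631s")  // podStartSLOduration

	fmt.Println("image pull window:", lastPull.Sub(firstPull)) // ~3.434s
	fmt.Println("E2E minus SLO:    ", e2e-slo)                 // ~3.434s as well
}
```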
Mar 25 01:11:20.769288 sshd[4867]: Accepted publickey for core from 10.0.0.1 port 50284 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:20.771131 sshd-session[4867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:20.777425 systemd-logind[1458]: New session 12 of user core. Mar 25 01:11:20.784713 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 25 01:11:20.941384 containerd[1474]: time="2025-03-25T01:11:20.940412773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:20.941826 containerd[1474]: time="2025-03-25T01:11:20.940904653Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 25 01:11:20.942608 containerd[1474]: time="2025-03-25T01:11:20.942576653Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:20.945351 containerd[1474]: time="2025-03-25T01:11:20.944970213Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:20.945683 containerd[1474]: time="2025-03-25T01:11:20.945653933Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.171347315s" Mar 25 01:11:20.945746 containerd[1474]: time="2025-03-25T01:11:20.945704093Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 25 01:11:20.948481 containerd[1474]: time="2025-03-25T01:11:20.948447373Z" level=info msg="CreateContainer within sandbox \"f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:11:20.985165 containerd[1474]: time="2025-03-25T01:11:20.985115614Z" level=info msg="Container 263c55460c6861c061663057dd6880dcd90cf1aadace7fcc47bf41b22f702ee7: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:21.079569 sshd[4874]: Connection closed by 10.0.0.1 port 50284 Mar 25 01:11:21.080063 sshd-session[4867]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:21.083280 systemd[1]: sshd@11-10.0.0.25:22-10.0.0.1:50284.service: Deactivated successfully. Mar 25 01:11:21.085323 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:11:21.086926 systemd-logind[1458]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:11:21.087983 systemd-logind[1458]: Removed session 12. 
Mar 25 01:11:21.142853 containerd[1474]: time="2025-03-25T01:11:21.142740499Z" level=info msg="CreateContainer within sandbox \"f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"263c55460c6861c061663057dd6880dcd90cf1aadace7fcc47bf41b22f702ee7\"" Mar 25 01:11:21.143418 containerd[1474]: time="2025-03-25T01:11:21.143333939Z" level=info msg="StartContainer for \"263c55460c6861c061663057dd6880dcd90cf1aadace7fcc47bf41b22f702ee7\"" Mar 25 01:11:21.144901 containerd[1474]: time="2025-03-25T01:11:21.144869779Z" level=info msg="connecting to shim 263c55460c6861c061663057dd6880dcd90cf1aadace7fcc47bf41b22f702ee7" address="unix:///run/containerd/s/b9c90d76492ed997bebfc5a32b5b10d391c4cb0952f0a821dde5b77e6f082b72" protocol=ttrpc version=3 Mar 25 01:11:21.161576 systemd[1]: Started cri-containerd-263c55460c6861c061663057dd6880dcd90cf1aadace7fcc47bf41b22f702ee7.scope - libcontainer container 263c55460c6861c061663057dd6880dcd90cf1aadace7fcc47bf41b22f702ee7. Mar 25 01:11:21.190482 kubelet[2591]: I0325 01:11:21.189618 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:21.191051 kubelet[2591]: I0325 01:11:21.190700 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:21.195098 containerd[1474]: time="2025-03-25T01:11:21.194949820Z" level=info msg="StartContainer for \"263c55460c6861c061663057dd6880dcd90cf1aadace7fcc47bf41b22f702ee7\" returns successfully" Mar 25 01:11:21.196164 containerd[1474]: time="2025-03-25T01:11:21.195954300Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:11:21.293307 containerd[1474]: time="2025-03-25T01:11:21.293266503Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54\" id:\"2b9aadc956aa226c3159fe6adcea439c02370445cdcd6a0b0cd0aa0b9b49c43a\" pid:4933 exited_at:{seconds:1742865081 nanos:292852903}" Mar 25 01:11:21.316302 kubelet[2591]: I0325 01:11:21.314937 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5b89686bfb-smsq9" podStartSLOduration=28.314920664 podStartE2EDuration="28.314920664s" podCreationTimestamp="2025-03-25 01:10:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:11:20.203449031 +0000 UTC m=+39.308426366" watchObservedRunningTime="2025-03-25 01:11:21.314920664 +0000 UTC m=+40.419897999" Mar 25 01:11:21.363458 containerd[1474]: time="2025-03-25T01:11:21.363392825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54\" id:\"e42106fe1df8947e8e2c6fe8acbdb2cb120e5a4ede7de6e58834650c5573cce7\" pid:4957 exited_at:{seconds:1742865081 nanos:363162705}" Mar 25 01:11:22.527254 containerd[1474]: time="2025-03-25T01:11:22.526595217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:22.527254 containerd[1474]: time="2025-03-25T01:11:22.527201617Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 25 01:11:22.527883 containerd[1474]: time="2025-03-25T01:11:22.527853497Z" level=info msg="ImageCreate event 
name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:22.529592 containerd[1474]: time="2025-03-25T01:11:22.529544457Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:11:22.530247 containerd[1474]: time="2025-03-25T01:11:22.530215057Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.334227117s" Mar 25 01:11:22.530303 containerd[1474]: time="2025-03-25T01:11:22.530248977Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 25 01:11:22.532246 containerd[1474]: time="2025-03-25T01:11:22.532199057Z" level=info msg="CreateContainer within sandbox \"f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:11:22.542454 containerd[1474]: time="2025-03-25T01:11:22.539749697Z" level=info msg="Container e8a08886a2a9fc7e26758b9795473e0c703d84cc802fd00dffc511527851b865: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:11:22.543071 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount62861975.mount: Deactivated successfully. Mar 25 01:11:22.552730 containerd[1474]: time="2025-03-25T01:11:22.552697578Z" level=info msg="CreateContainer within sandbox \"f43cb48eb9549f34b4ea6708be27c34b809e1d367c543549e8d6ca2c45a93efe\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"e8a08886a2a9fc7e26758b9795473e0c703d84cc802fd00dffc511527851b865\"" Mar 25 01:11:22.553232 containerd[1474]: time="2025-03-25T01:11:22.553207578Z" level=info msg="StartContainer for \"e8a08886a2a9fc7e26758b9795473e0c703d84cc802fd00dffc511527851b865\"" Mar 25 01:11:22.554832 containerd[1474]: time="2025-03-25T01:11:22.554802738Z" level=info msg="connecting to shim e8a08886a2a9fc7e26758b9795473e0c703d84cc802fd00dffc511527851b865" address="unix:///run/containerd/s/b9c90d76492ed997bebfc5a32b5b10d391c4cb0952f0a821dde5b77e6f082b72" protocol=ttrpc version=3 Mar 25 01:11:22.577585 systemd[1]: Started cri-containerd-e8a08886a2a9fc7e26758b9795473e0c703d84cc802fd00dffc511527851b865.scope - libcontainer container e8a08886a2a9fc7e26758b9795473e0c703d84cc802fd00dffc511527851b865. 
Mar 25 01:11:22.636777 containerd[1474]: time="2025-03-25T01:11:22.636738100Z" level=info msg="StartContainer for \"e8a08886a2a9fc7e26758b9795473e0c703d84cc802fd00dffc511527851b865\" returns successfully" Mar 25 01:11:23.064027 kubelet[2591]: I0325 01:11:23.063942 2591 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 01:11:23.065649 kubelet[2591]: I0325 01:11:23.065531 2591 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 01:11:23.214084 kubelet[2591]: I0325 01:11:23.213822 2591 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-rm4dd" podStartSLOduration=23.100226477 podStartE2EDuration="29.213805915s" podCreationTimestamp="2025-03-25 01:10:54 +0000 UTC" firstStartedPulling="2025-03-25 01:11:16.417385299 +0000 UTC m=+35.522362634" lastFinishedPulling="2025-03-25 01:11:22.530964737 +0000 UTC m=+41.635942072" observedRunningTime="2025-03-25 01:11:23.210208475 +0000 UTC m=+42.315185890" watchObservedRunningTime="2025-03-25 01:11:23.213805915 +0000 UTC m=+42.318783210" Mar 25 01:11:24.288525 kubelet[2591]: I0325 01:11:24.287297 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:26.094456 systemd[1]: Started sshd@12-10.0.0.25:22-10.0.0.1:55136.service - OpenSSH per-connection server daemon (10.0.0.1:55136). Mar 25 01:11:26.170593 sshd[5023]: Accepted publickey for core from 10.0.0.1 port 55136 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:26.171950 sshd-session[5023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:26.176206 systemd-logind[1458]: New session 13 of user core. Mar 25 01:11:26.182652 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 25 01:11:26.382219 sshd[5025]: Connection closed by 10.0.0.1 port 55136 Mar 25 01:11:26.381912 sshd-session[5023]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:26.384904 systemd[1]: sshd@12-10.0.0.25:22-10.0.0.1:55136.service: Deactivated successfully. Mar 25 01:11:26.386601 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:11:26.387846 systemd-logind[1458]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:11:26.389102 systemd-logind[1458]: Removed session 13. Mar 25 01:11:27.090064 kubelet[2591]: I0325 01:11:27.089826 2591 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:11:31.394035 systemd[1]: Started sshd@13-10.0.0.25:22-10.0.0.1:55142.service - OpenSSH per-connection server daemon (10.0.0.1:55142). Mar 25 01:11:31.449811 sshd[5050]: Accepted publickey for core from 10.0.0.1 port 55142 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:31.451049 sshd-session[5050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:31.455569 systemd-logind[1458]: New session 14 of user core. Mar 25 01:11:31.460750 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:11:31.625884 sshd[5052]: Connection closed by 10.0.0.1 port 55142 Mar 25 01:11:31.626222 sshd-session[5050]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:31.629354 systemd[1]: sshd@13-10.0.0.25:22-10.0.0.1:55142.service: Deactivated successfully. 
Mar 25 01:11:31.631370 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:11:31.632096 systemd-logind[1458]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:11:31.633069 systemd-logind[1458]: Removed session 14. Mar 25 01:11:32.257165 containerd[1474]: time="2025-03-25T01:11:32.257094528Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54\" id:\"fff71cd9e3e54afb8fda0aab88d2909217d92ea73e6ea4f2c59cd7afea38a06a\" pid:5077 exited_at:{seconds:1742865092 nanos:256720368}" Mar 25 01:11:36.637771 systemd[1]: Started sshd@14-10.0.0.25:22-10.0.0.1:41810.service - OpenSSH per-connection server daemon (10.0.0.1:41810). Mar 25 01:11:36.704294 sshd[5088]: Accepted publickey for core from 10.0.0.1 port 41810 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:36.705525 sshd-session[5088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:36.709671 systemd-logind[1458]: New session 15 of user core. Mar 25 01:11:36.723628 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:11:36.860383 sshd[5090]: Connection closed by 10.0.0.1 port 41810 Mar 25 01:11:36.862067 sshd-session[5088]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:36.864733 systemd[1]: sshd@14-10.0.0.25:22-10.0.0.1:41810.service: Deactivated successfully. Mar 25 01:11:36.866482 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:11:36.868183 systemd-logind[1458]: Session 15 logged out. Waiting for processes to exit. Mar 25 01:11:36.869079 systemd-logind[1458]: Removed session 15. Mar 25 01:11:40.188803 containerd[1474]: time="2025-03-25T01:11:40.188709015Z" level=info msg="TaskExit event in podsandbox handler container_id:\"06e1ac1023f644a7847fedf3fc8f36997e3bc229b505ba0373ded5a7099d4341\" id:\"91eee64f0a6f02fd19f70d04f71f9abecdf4d187b7d86229de560aa94b5619a2\" pid:5125 exited_at:{seconds:1742865100 nanos:188412695}" Mar 25 01:11:41.875815 systemd[1]: Started sshd@15-10.0.0.25:22-10.0.0.1:41812.service - OpenSSH per-connection server daemon (10.0.0.1:41812). Mar 25 01:11:41.948546 sshd[5141]: Accepted publickey for core from 10.0.0.1 port 41812 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:11:41.949869 sshd-session[5141]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:11:41.954004 systemd-logind[1458]: New session 16 of user core. Mar 25 01:11:41.961613 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 25 01:11:42.143205 sshd[5143]: Connection closed by 10.0.0.1 port 41812 Mar 25 01:11:42.143651 sshd-session[5141]: pam_unix(sshd:session): session closed for user core Mar 25 01:11:42.151672 systemd[1]: sshd@15-10.0.0.25:22-10.0.0.1:41812.service: Deactivated successfully. Mar 25 01:11:42.153420 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:11:42.154894 systemd-logind[1458]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:11:42.157393 systemd[1]: Started sshd@16-10.0.0.25:22-10.0.0.1:41822.service - OpenSSH per-connection server daemon (10.0.0.1:41822). Mar 25 01:11:42.158854 systemd-logind[1458]: Removed session 16. 
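Editor's note: the TaskExit event above embeds its own exit time as raw epoch values (exited_at:{seconds:1742865092 nanos:256720368}); converting that back to wall-clock time lands on 2025-03-25 01:11:32 UTC, the same instant as the surrounding journal timestamp. A one-line conversion sketch:

```go
// Sketch: convert the exited_at epoch value from the TaskExit event above
// back into wall-clock time and compare with the journal timestamp.
package main

import (
	"fmt"
	"time"
)

func main() {
	exitedAt := time.Unix(1742865092, 256720368).UTC()
	// Prints 2025-03-25 01:11:32.256720368 +0000 UTC, matching the
	// "Mar 25 01:11:32.257165" journal line that carries the event.
	fmt.Println(exitedAt)
}
```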
Mar 25 01:11:42.224909 sshd[5155]: Accepted publickey for core from 10.0.0.1 port 41822 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw
Mar 25 01:11:42.226209 sshd-session[5155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:11:42.233951 systemd-logind[1458]: New session 17 of user core.
Mar 25 01:11:42.242611 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 25 01:11:42.507340 sshd[5158]: Connection closed by 10.0.0.1 port 41822
Mar 25 01:11:42.507723 sshd-session[5155]: pam_unix(sshd:session): session closed for user core
Mar 25 01:11:42.524787 systemd[1]: sshd@16-10.0.0.25:22-10.0.0.1:41822.service: Deactivated successfully.
Mar 25 01:11:42.528099 systemd[1]: session-17.scope: Deactivated successfully.
Mar 25 01:11:42.529462 systemd-logind[1458]: Session 17 logged out. Waiting for processes to exit.
Mar 25 01:11:42.531819 systemd[1]: Started sshd@17-10.0.0.25:22-10.0.0.1:45922.service - OpenSSH per-connection server daemon (10.0.0.1:45922).
Mar 25 01:11:42.532931 systemd-logind[1458]: Removed session 17.
Mar 25 01:11:42.589476 sshd[5169]: Accepted publickey for core from 10.0.0.1 port 45922 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw
Mar 25 01:11:42.590840 sshd-session[5169]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:11:42.595812 systemd-logind[1458]: New session 18 of user core.
Mar 25 01:11:42.606593 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 25 01:11:43.314409 sshd[5172]: Connection closed by 10.0.0.1 port 45922
Mar 25 01:11:43.316042 sshd-session[5169]: pam_unix(sshd:session): session closed for user core
Mar 25 01:11:43.329300 systemd[1]: Started sshd@18-10.0.0.25:22-10.0.0.1:45926.service - OpenSSH per-connection server daemon (10.0.0.1:45926).
Mar 25 01:11:43.331793 systemd[1]: sshd@17-10.0.0.25:22-10.0.0.1:45922.service: Deactivated successfully.
Mar 25 01:11:43.335553 systemd[1]: session-18.scope: Deactivated successfully.
Mar 25 01:11:43.341860 systemd-logind[1458]: Session 18 logged out. Waiting for processes to exit.
Mar 25 01:11:43.346156 systemd-logind[1458]: Removed session 18.
Mar 25 01:11:43.400864 sshd[5187]: Accepted publickey for core from 10.0.0.1 port 45926 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw
Mar 25 01:11:43.402074 sshd-session[5187]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:11:43.407567 systemd-logind[1458]: New session 19 of user core.
Mar 25 01:11:43.421606 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 25 01:11:43.748404 sshd[5192]: Connection closed by 10.0.0.1 port 45926
Mar 25 01:11:43.748611 sshd-session[5187]: pam_unix(sshd:session): session closed for user core
Mar 25 01:11:43.764890 systemd[1]: sshd@18-10.0.0.25:22-10.0.0.1:45926.service: Deactivated successfully.
Mar 25 01:11:43.766518 systemd[1]: session-19.scope: Deactivated successfully.
Mar 25 01:11:43.767577 systemd-logind[1458]: Session 19 logged out. Waiting for processes to exit.
Mar 25 01:11:43.769326 systemd[1]: Started sshd@19-10.0.0.25:22-10.0.0.1:45942.service - OpenSSH per-connection server daemon (10.0.0.1:45942).
Mar 25 01:11:43.771241 systemd-logind[1458]: Removed session 19.
Mar 25 01:11:43.827259 sshd[5203]: Accepted publickey for core from 10.0.0.1 port 45942 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw
Mar 25 01:11:43.828637 sshd-session[5203]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:11:43.834457 systemd-logind[1458]: New session 20 of user core.
Mar 25 01:11:43.840672 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 25 01:11:43.966204 sshd[5206]: Connection closed by 10.0.0.1 port 45942
Mar 25 01:11:43.966573 sshd-session[5203]: pam_unix(sshd:session): session closed for user core
Mar 25 01:11:43.969518 systemd[1]: sshd@19-10.0.0.25:22-10.0.0.1:45942.service: Deactivated successfully.
Mar 25 01:11:43.972162 systemd[1]: session-20.scope: Deactivated successfully.
Mar 25 01:11:43.973844 systemd-logind[1458]: Session 20 logged out. Waiting for processes to exit.
Mar 25 01:11:43.975011 systemd-logind[1458]: Removed session 20.
Mar 25 01:11:48.981945 systemd[1]: Started sshd@20-10.0.0.25:22-10.0.0.1:45956.service - OpenSSH per-connection server daemon (10.0.0.1:45956).
Mar 25 01:11:49.039500 sshd[5230]: Accepted publickey for core from 10.0.0.1 port 45956 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw
Mar 25 01:11:49.040701 sshd-session[5230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:11:49.045264 systemd-logind[1458]: New session 21 of user core.
Mar 25 01:11:49.055610 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 25 01:11:49.184201 sshd[5232]: Connection closed by 10.0.0.1 port 45956
Mar 25 01:11:49.184793 sshd-session[5230]: pam_unix(sshd:session): session closed for user core
Mar 25 01:11:49.188175 systemd[1]: sshd@20-10.0.0.25:22-10.0.0.1:45956.service: Deactivated successfully.
Mar 25 01:11:49.190046 systemd[1]: session-21.scope: Deactivated successfully.
Mar 25 01:11:49.190812 systemd-logind[1458]: Session 21 logged out. Waiting for processes to exit.
Mar 25 01:11:49.191795 systemd-logind[1458]: Removed session 21.
Mar 25 01:11:51.340744 containerd[1474]: time="2025-03-25T01:11:51.340690024Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dad2ed685a246b8d0c600274eb722e0a2e92617557d874457e85b1ad1e8b7a54\" id:\"cdb0720e65861c74ec169498f605a194fa1e29f3ae4cb1db0e310418ccacf890\" pid:5256 exited_at:{seconds:1742865111 nanos:340347547}"
Mar 25 01:11:54.196033 systemd[1]: Started sshd@21-10.0.0.25:22-10.0.0.1:50506.service - OpenSSH per-connection server daemon (10.0.0.1:50506).
Mar 25 01:11:54.247438 sshd[5267]: Accepted publickey for core from 10.0.0.1 port 50506 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw
Mar 25 01:11:54.248663 sshd-session[5267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:11:54.252338 systemd-logind[1458]: New session 22 of user core.
Mar 25 01:11:54.266575 systemd[1]: Started session-22.scope - Session 22 of User core.
Mar 25 01:11:54.395585 sshd[5269]: Connection closed by 10.0.0.1 port 50506
Mar 25 01:11:54.396052 sshd-session[5267]: pam_unix(sshd:session): session closed for user core
Mar 25 01:11:54.400370 systemd[1]: sshd@21-10.0.0.25:22-10.0.0.1:50506.service: Deactivated successfully.
Mar 25 01:11:54.405008 systemd[1]: session-22.scope: Deactivated successfully.
Mar 25 01:11:54.407376 systemd-logind[1458]: Session 22 logged out. Waiting for processes to exit.
Mar 25 01:11:54.408352 systemd-logind[1458]: Removed session 22.
Mar 25 01:11:59.413025 systemd[1]: Started sshd@22-10.0.0.25:22-10.0.0.1:50520.service - OpenSSH per-connection server daemon (10.0.0.1:50520).
Mar 25 01:11:59.486683 sshd[5293]: Accepted publickey for core from 10.0.0.1 port 50520 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw
Mar 25 01:11:59.488399 sshd-session[5293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:11:59.493991 systemd-logind[1458]: New session 23 of user core.
Mar 25 01:11:59.503564 systemd[1]: Started session-23.scope - Session 23 of User core.
Mar 25 01:11:59.701104 sshd[5295]: Connection closed by 10.0.0.1 port 50520
Mar 25 01:11:59.701735 sshd-session[5293]: pam_unix(sshd:session): session closed for user core
Mar 25 01:11:59.705357 systemd[1]: sshd@22-10.0.0.25:22-10.0.0.1:50520.service: Deactivated successfully.
Mar 25 01:11:59.707960 systemd[1]: session-23.scope: Deactivated successfully.
Mar 25 01:11:59.709053 systemd-logind[1458]: Session 23 logged out. Waiting for processes to exit.
Mar 25 01:11:59.709843 systemd-logind[1458]: Removed session 23.