Mar 25 01:05:43.903751 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 25 01:05:43.903775 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Mon Mar 24 23:39:14 -00 2025
Mar 25 01:05:43.903786 kernel: KASLR enabled
Mar 25 01:05:43.903792 kernel: efi: EFI v2.7 by EDK II
Mar 25 01:05:43.903798 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdbbae018 ACPI 2.0=0xd9b43018 RNG=0xd9b43a18 MEMRESERVE=0xd9b40218
Mar 25 01:05:43.903804 kernel: random: crng init done
Mar 25 01:05:43.903811 kernel: secureboot: Secure boot disabled
Mar 25 01:05:43.903817 kernel: ACPI: Early table checksum verification disabled
Mar 25 01:05:43.903823 kernel: ACPI: RSDP 0x00000000D9B43018 000024 (v02 BOCHS )
Mar 25 01:05:43.903831 kernel: ACPI: XSDT 0x00000000D9B43F18 000064 (v01 BOCHS BXPC 00000001 01000013)
Mar 25 01:05:43.903838 kernel: ACPI: FACP 0x00000000D9B43B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903844 kernel: ACPI: DSDT 0x00000000D9B41018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903850 kernel: ACPI: APIC 0x00000000D9B43C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903856 kernel: ACPI: PPTT 0x00000000D9B43098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903863 kernel: ACPI: GTDT 0x00000000D9B43818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903871 kernel: ACPI: MCFG 0x00000000D9B43A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903878 kernel: ACPI: SPCR 0x00000000D9B43918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903884 kernel: ACPI: DBG2 0x00000000D9B43998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903891 kernel: ACPI: IORT 0x00000000D9B43198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:05:43.903897 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600
Mar 25 01:05:43.903903 kernel: NUMA: Failed to initialise from firmware
Mar 25 01:05:43.903910 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff]
Mar 25 01:05:43.903916 kernel: NUMA: NODE_DATA [mem 0xdc958800-0xdc95dfff]
Mar 25 01:05:43.903922 kernel: Zone ranges:
Mar 25 01:05:43.903929 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff]
Mar 25 01:05:43.903936 kernel: DMA32 empty
Mar 25 01:05:43.903943 kernel: Normal empty
Mar 25 01:05:43.903949 kernel: Movable zone start for each node
Mar 25 01:05:43.903955 kernel: Early memory node ranges
Mar 25 01:05:43.903961 kernel: node 0: [mem 0x0000000040000000-0x00000000d967ffff]
Mar 25 01:05:43.903968 kernel: node 0: [mem 0x00000000d9680000-0x00000000d968ffff]
Mar 25 01:05:43.903974 kernel: node 0: [mem 0x00000000d9690000-0x00000000d976ffff]
Mar 25 01:05:43.903980 kernel: node 0: [mem 0x00000000d9770000-0x00000000d9b3ffff]
Mar 25 01:05:43.903987 kernel: node 0: [mem 0x00000000d9b40000-0x00000000dce1ffff]
Mar 25 01:05:43.903993 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff]
Mar 25 01:05:43.903999 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff]
Mar 25 01:05:43.904006 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff]
Mar 25 01:05:43.904014 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff]
Mar 25 01:05:43.904020 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff]
Mar 25 01:05:43.904026 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges
Mar 25 01:05:43.904036 kernel: psci: probing for conduit method from ACPI.
Mar 25 01:05:43.904043 kernel: psci: PSCIv1.1 detected in firmware.
Mar 25 01:05:43.904050 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 25 01:05:43.904058 kernel: psci: Trusted OS migration not required
Mar 25 01:05:43.904092 kernel: psci: SMC Calling Convention v1.1
Mar 25 01:05:43.904100 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 25 01:05:43.904115 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Mar 25 01:05:43.904122 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Mar 25 01:05:43.904129 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Mar 25 01:05:43.904136 kernel: Detected PIPT I-cache on CPU0
Mar 25 01:05:43.904143 kernel: CPU features: detected: GIC system register CPU interface
Mar 25 01:05:43.904150 kernel: CPU features: detected: Hardware dirty bit management
Mar 25 01:05:43.904157 kernel: CPU features: detected: Spectre-v4
Mar 25 01:05:43.904166 kernel: CPU features: detected: Spectre-BHB
Mar 25 01:05:43.904173 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 25 01:05:43.904180 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 25 01:05:43.904186 kernel: CPU features: detected: ARM erratum 1418040
Mar 25 01:05:43.904193 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 25 01:05:43.904200 kernel: alternatives: applying boot alternatives
Mar 25 01:05:43.904208 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab
Mar 25 01:05:43.904216 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 25 01:05:43.904222 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 25 01:05:43.904230 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 25 01:05:43.904237 kernel: Fallback order for Node 0: 0
Mar 25 01:05:43.904245 kernel: Built 1 zonelists, mobility grouping on. Total pages: 633024
Mar 25 01:05:43.904252 kernel: Policy zone: DMA
Mar 25 01:05:43.904259 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 25 01:05:43.904266 kernel: software IO TLB: area num 4.
Mar 25 01:05:43.904273 kernel: software IO TLB: mapped [mem 0x00000000d2e00000-0x00000000d6e00000] (64MB)
Mar 25 01:05:43.904280 kernel: Memory: 2387412K/2572288K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38464K init, 897K bss, 184876K reserved, 0K cma-reserved)
Mar 25 01:05:43.904287 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Mar 25 01:05:43.904294 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 25 01:05:43.904301 kernel: rcu: RCU event tracing is enabled.
Mar 25 01:05:43.904308 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Mar 25 01:05:43.904315 kernel: Trampoline variant of Tasks RCU enabled.
Mar 25 01:05:43.904330 kernel: Tracing variant of Tasks RCU enabled.
Mar 25 01:05:43.904340 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 25 01:05:43.904347 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Mar 25 01:05:43.904354 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 25 01:05:43.904361 kernel: GICv3: 256 SPIs implemented
Mar 25 01:05:43.904367 kernel: GICv3: 0 Extended SPIs implemented
Mar 25 01:05:43.904374 kernel: Root IRQ handler: gic_handle_irq
Mar 25 01:05:43.904381 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 25 01:05:43.904387 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 25 01:05:43.904394 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 25 01:05:43.904401 kernel: ITS@0x0000000008080000: allocated 8192 Devices @400c0000 (indirect, esz 8, psz 64K, shr 1)
Mar 25 01:05:43.904408 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @400d0000 (flat, esz 8, psz 64K, shr 1)
Mar 25 01:05:43.904416 kernel: GICv3: using LPI property table @0x00000000400f0000
Mar 25 01:05:43.904423 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040100000
Mar 25 01:05:43.904430 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 25 01:05:43.904437 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 25 01:05:43.904444 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 25 01:05:43.904451 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 25 01:05:43.904458 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 25 01:05:43.904465 kernel: arm-pv: using stolen time PV
Mar 25 01:05:43.904471 kernel: Console: colour dummy device 80x25
Mar 25 01:05:43.904479 kernel: ACPI: Core revision 20230628
Mar 25 01:05:43.904486 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 25 01:05:43.904494 kernel: pid_max: default: 32768 minimum: 301
Mar 25 01:05:43.904501 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 25 01:05:43.904508 kernel: landlock: Up and running.
Mar 25 01:05:43.904515 kernel: SELinux: Initializing.
Mar 25 01:05:43.904527 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 25 01:05:43.904534 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 25 01:05:43.904541 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 25 01:05:43.904548 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Mar 25 01:05:43.904555 kernel: rcu: Hierarchical SRCU implementation.
Mar 25 01:05:43.904564 kernel: rcu: Max phase no-delay instances is 400.
Mar 25 01:05:43.904571 kernel: Platform MSI: ITS@0x8080000 domain created
Mar 25 01:05:43.904578 kernel: PCI/MSI: ITS@0x8080000 domain created
Mar 25 01:05:43.904585 kernel: Remapping and enabling EFI services.
Mar 25 01:05:43.904591 kernel: smp: Bringing up secondary CPUs ...
Mar 25 01:05:43.904598 kernel: Detected PIPT I-cache on CPU1
Mar 25 01:05:43.904605 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 25 01:05:43.904613 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040110000
Mar 25 01:05:43.904620 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 25 01:05:43.904628 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 25 01:05:43.904636 kernel: Detected PIPT I-cache on CPU2
Mar 25 01:05:43.904648 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Mar 25 01:05:43.904658 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040120000
Mar 25 01:05:43.904665 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 25 01:05:43.904672 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Mar 25 01:05:43.904680 kernel: Detected PIPT I-cache on CPU3
Mar 25 01:05:43.904688 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Mar 25 01:05:43.904696 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040130000
Mar 25 01:05:43.904705 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 25 01:05:43.904712 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Mar 25 01:05:43.904719 kernel: smp: Brought up 1 node, 4 CPUs
Mar 25 01:05:43.904726 kernel: SMP: Total of 4 processors activated.
Mar 25 01:05:43.904734 kernel: CPU features: detected: 32-bit EL0 Support
Mar 25 01:05:43.904741 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 25 01:05:43.904749 kernel: CPU features: detected: Common not Private translations
Mar 25 01:05:43.904756 kernel: CPU features: detected: CRC32 instructions
Mar 25 01:05:43.904765 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 25 01:05:43.904772 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 25 01:05:43.904779 kernel: CPU features: detected: LSE atomic instructions
Mar 25 01:05:43.904787 kernel: CPU features: detected: Privileged Access Never
Mar 25 01:05:43.904794 kernel: CPU features: detected: RAS Extension Support
Mar 25 01:05:43.904802 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 25 01:05:43.904809 kernel: CPU: All CPU(s) started at EL1
Mar 25 01:05:43.904816 kernel: alternatives: applying system-wide alternatives
Mar 25 01:05:43.904824 kernel: devtmpfs: initialized
Mar 25 01:05:43.904831 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 25 01:05:43.904840 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Mar 25 01:05:43.904847 kernel: pinctrl core: initialized pinctrl subsystem
Mar 25 01:05:43.904854 kernel: SMBIOS 3.0.0 present.
Mar 25 01:05:43.904862 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022
Mar 25 01:05:43.904869 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 25 01:05:43.904877 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 25 01:05:43.904884 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 25 01:05:43.904892 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 25 01:05:43.904901 kernel: audit: initializing netlink subsys (disabled)
Mar 25 01:05:43.904909 kernel: audit: type=2000 audit(0.018:1): state=initialized audit_enabled=0 res=1
Mar 25 01:05:43.904916 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 25 01:05:43.904924 kernel: cpuidle: using governor menu
Mar 25 01:05:43.904931 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 25 01:05:43.904938 kernel: ASID allocator initialised with 32768 entries
Mar 25 01:05:43.904946 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 25 01:05:43.904953 kernel: Serial: AMBA PL011 UART driver
Mar 25 01:05:43.904960 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 25 01:05:43.904968 kernel: Modules: 0 pages in range for non-PLT usage
Mar 25 01:05:43.904977 kernel: Modules: 509248 pages in range for PLT usage
Mar 25 01:05:43.904984 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 25 01:05:43.904992 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 25 01:05:43.904999 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 25 01:05:43.905007 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 25 01:05:43.905014 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 25 01:05:43.905022 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 25 01:05:43.905029 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 25 01:05:43.905037 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 25 01:05:43.905045 kernel: ACPI: Added _OSI(Module Device)
Mar 25 01:05:43.905053 kernel: ACPI: Added _OSI(Processor Device)
Mar 25 01:05:43.905060 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 25 01:05:43.905067 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 25 01:05:43.905075 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 25 01:05:43.905082 kernel: ACPI: Interpreter enabled
Mar 25 01:05:43.905089 kernel: ACPI: Using GIC for interrupt routing
Mar 25 01:05:43.905097 kernel: ACPI: MCFG table detected, 1 entries
Mar 25 01:05:43.905111 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 25 01:05:43.905122 kernel: printk: console [ttyAMA0] enabled
Mar 25 01:05:43.905129 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 25 01:05:43.905781 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 25 01:05:43.905866 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 25 01:05:43.905937 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 25 01:05:43.906017 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 25 01:05:43.906085 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 25 01:05:43.906100 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 25 01:05:43.906120 kernel: PCI host bridge to bus 0000:00
Mar 25 01:05:43.906215 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 25 01:05:43.906283 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 25 01:05:43.906361 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 25 01:05:43.906426 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 25 01:05:43.906512 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Mar 25 01:05:43.906606 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00
Mar 25 01:05:43.906682 kernel: pci 0000:00:01.0: reg 0x10: [io 0x0000-0x001f]
Mar 25 01:05:43.906758 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x10000000-0x10000fff]
Mar 25 01:05:43.906828 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 25 01:05:43.906988 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 25 01:05:43.907073 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x10000000-0x10000fff]
Mar 25 01:05:43.907177 kernel: pci 0000:00:01.0: BAR 0: assigned [io 0x1000-0x101f]
Mar 25 01:05:43.907252 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Mar 25 01:05:43.907314 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Mar 25 01:05:43.907391 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Mar 25 01:05:43.907402 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Mar 25 01:05:43.907409 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Mar 25 01:05:43.907417 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Mar 25 01:05:43.907425 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Mar 25 01:05:43.907435 kernel: iommu: Default domain type: Translated
Mar 25 01:05:43.907443 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 25 01:05:43.907450 kernel: efivars: Registered efivars operations
Mar 25 01:05:43.907458 kernel: vgaarb: loaded
Mar 25 01:05:43.907465 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 25 01:05:43.907472 kernel: VFS: Disk quotas dquot_6.6.0
Mar 25 01:05:43.907480 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 25 01:05:43.907488 kernel: pnp: PnP ACPI init
Mar 25 01:05:43.907562 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Mar 25 01:05:43.907575 kernel: pnp: PnP ACPI: found 1 devices
Mar 25 01:05:43.907583 kernel: NET: Registered PF_INET protocol family
Mar 25 01:05:43.907590 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 25 01:05:43.907598 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 25 01:05:43.907605 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 25 01:05:43.907613 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 25 01:05:43.907621 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 25 01:05:43.907628 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 25 01:05:43.907636 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 25 01:05:43.907645 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 25 01:05:43.907653 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 25 01:05:43.907660 kernel: PCI: CLS 0 bytes, default 64
Mar 25 01:05:43.907668 kernel: kvm [1]: HYP mode not available
Mar 25 01:05:43.907675 kernel: Initialise system trusted keyrings
Mar 25 01:05:43.907682 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 25 01:05:43.907690 kernel: Key type asymmetric registered
Mar 25 01:05:43.907697 kernel: Asymmetric key parser 'x509' registered
Mar 25 01:05:43.907705 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 25 01:05:43.907714 kernel: io scheduler mq-deadline registered
Mar 25 01:05:43.907721 kernel: io scheduler kyber registered
Mar 25 01:05:43.907728 kernel: io scheduler bfq registered
Mar 25 01:05:43.907736 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Mar 25 01:05:43.907743 kernel: ACPI: button: Power Button [PWRB]
Mar 25 01:05:43.907752 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Mar 25 01:05:43.907823 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007)
Mar 25 01:05:43.907833 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 25 01:05:43.907840 kernel: thunder_xcv, ver 1.0
Mar 25 01:05:43.907850 kernel: thunder_bgx, ver 1.0
Mar 25 01:05:43.907858 kernel: nicpf, ver 1.0
Mar 25 01:05:43.907866 kernel: nicvf, ver 1.0
Mar 25 01:05:43.907942 kernel: rtc-efi rtc-efi.0: registered as rtc0
Mar 25 01:05:43.908007 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-25T01:05:43 UTC (1742864743)
Mar 25 01:05:43.908017 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 25 01:05:43.908025 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available
Mar 25 01:05:43.908033 kernel: watchdog: Delayed init of the lockup detector failed: -19
Mar 25 01:05:43.908042 kernel: watchdog: Hard watchdog permanently disabled
Mar 25 01:05:43.908050 kernel: NET: Registered PF_INET6 protocol family
Mar 25 01:05:43.908057 kernel: Segment Routing with IPv6
Mar 25 01:05:43.908065 kernel: In-situ OAM (IOAM) with IPv6
Mar 25 01:05:43.908072 kernel: NET: Registered PF_PACKET protocol family
Mar 25 01:05:43.908079 kernel: Key type dns_resolver registered
Mar 25 01:05:43.908099 kernel: registered taskstats version 1
Mar 25 01:05:43.908714 kernel: Loading compiled-in X.509 certificates
Mar 25 01:05:43.908727 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: ed4ababe871f0afac8b4236504477de11a6baf07'
Mar 25 01:05:43.908740 kernel: Key type .fscrypt registered
Mar 25 01:05:43.908747 kernel: Key type fscrypt-provisioning registered
Mar 25 01:05:43.908755 kernel: ima: No TPM chip found, activating TPM-bypass!
Mar 25 01:05:43.908762 kernel: ima: Allocated hash algorithm: sha1
Mar 25 01:05:43.908770 kernel: ima: No architecture policies found
Mar 25 01:05:43.908777 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Mar 25 01:05:43.908785 kernel: clk: Disabling unused clocks
Mar 25 01:05:43.908792 kernel: Freeing unused kernel memory: 38464K
Mar 25 01:05:43.908799 kernel: Run /init as init process
Mar 25 01:05:43.908808 kernel: with arguments:
Mar 25 01:05:43.908815 kernel: /init
Mar 25 01:05:43.908822 kernel: with environment:
Mar 25 01:05:43.908829 kernel: HOME=/
Mar 25 01:05:43.908836 kernel: TERM=linux
Mar 25 01:05:43.908844 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 25 01:05:43.908852 systemd[1]: Successfully made /usr/ read-only.
Mar 25 01:05:43.908863 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 25 01:05:43.908873 systemd[1]: Detected virtualization kvm.
Mar 25 01:05:43.908880 systemd[1]: Detected architecture arm64.
Mar 25 01:05:43.908888 systemd[1]: Running in initrd.
Mar 25 01:05:43.908896 systemd[1]: No hostname configured, using default hostname.
Mar 25 01:05:43.908904 systemd[1]: Hostname set to .
Mar 25 01:05:43.908912 systemd[1]: Initializing machine ID from VM UUID.
Mar 25 01:05:43.908920 systemd[1]: Queued start job for default target initrd.target.
Mar 25 01:05:43.908928 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:05:43.908937 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:05:43.908947 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 25 01:05:43.908955 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 25 01:05:43.908963 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 25 01:05:43.908972 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 25 01:05:43.908981 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 25 01:05:43.908990 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 25 01:05:43.909004 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:05:43.909012 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:05:43.909020 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:05:43.909028 systemd[1]: Reached target slices.target - Slice Units.
Mar 25 01:05:43.909036 systemd[1]: Reached target swap.target - Swaps.
Mar 25 01:05:43.909044 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:05:43.909052 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 25 01:05:43.909060 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 25 01:05:43.909070 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 25 01:05:43.909078 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 25 01:05:43.909086 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:05:43.909094 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:05:43.909102 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:05:43.909124 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:05:43.909132 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 25 01:05:43.909141 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 25 01:05:43.909151 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 25 01:05:43.909158 systemd[1]: Starting systemd-fsck-usr.service...
Mar 25 01:05:43.909167 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 25 01:05:43.909175 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 25 01:05:43.909183 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:05:43.909191 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 25 01:05:43.909199 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:05:43.909209 systemd[1]: Finished systemd-fsck-usr.service.
Mar 25 01:05:43.909218 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:05:43.909226 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:05:43.909234 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 25 01:05:43.909271 systemd-journald[236]: Collecting audit messages is disabled.
Mar 25 01:05:43.909294 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 25 01:05:43.909302 kernel: Bridge firewalling registered
Mar 25 01:05:43.909309 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:05:43.909318 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 25 01:05:43.909337 systemd-journald[236]: Journal started
Mar 25 01:05:43.909358 systemd-journald[236]: Runtime Journal (/run/log/journal/b6498a6ebbae40f9a94619869c0a21a7) is 5.9M, max 47.3M, 41.4M free.
Mar 25 01:05:43.883721 systemd-modules-load[238]: Inserted module 'overlay'
Mar 25 01:05:43.903358 systemd-modules-load[238]: Inserted module 'br_netfilter'
Mar 25 01:05:43.915588 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 25 01:05:43.915799 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 25 01:05:43.917386 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:05:43.919707 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:05:43.924746 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 25 01:05:43.926444 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 25 01:05:43.931256 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 25 01:05:43.941478 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:05:43.944344 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:05:43.946673 dracut-cmdline[272]: dracut-dracut-053
Mar 25 01:05:43.947188 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:05:43.951641 dracut-cmdline[272]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab
Mar 25 01:05:43.994920 systemd-resolved[288]: Positive Trust Anchors:
Mar 25 01:05:43.994939 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:05:43.994971 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:05:44.001533 systemd-resolved[288]: Defaulting to hostname 'linux'.
Mar 25 01:05:44.002564 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:05:44.004428 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:05:44.026148 kernel: SCSI subsystem initialized
Mar 25 01:05:44.031122 kernel: Loading iSCSI transport class v2.0-870.
Mar 25 01:05:44.039133 kernel: iscsi: registered transport (tcp)
Mar 25 01:05:44.053152 kernel: iscsi: registered transport (qla4xxx)
Mar 25 01:05:44.053206 kernel: QLogic iSCSI HBA Driver
Mar 25 01:05:44.096407 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 25 01:05:44.098852 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 25 01:05:44.129349 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 25 01:05:44.129399 kernel: device-mapper: uevent: version 1.0.3
Mar 25 01:05:44.130972 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 25 01:05:44.178148 kernel: raid6: neonx8 gen() 15602 MB/s
Mar 25 01:05:44.195134 kernel: raid6: neonx4 gen() 15619 MB/s
Mar 25 01:05:44.212141 kernel: raid6: neonx2 gen() 13064 MB/s
Mar 25 01:05:44.229134 kernel: raid6: neonx1 gen() 10366 MB/s
Mar 25 01:05:44.246130 kernel: raid6: int64x8 gen() 6738 MB/s
Mar 25 01:05:44.263131 kernel: raid6: int64x4 gen() 7268 MB/s
Mar 25 01:05:44.280132 kernel: raid6: int64x2 gen() 6043 MB/s
Mar 25 01:05:44.297310 kernel: raid6: int64x1 gen() 5015 MB/s
Mar 25 01:05:44.297328 kernel: raid6: using algorithm neonx4 gen() 15619 MB/s
Mar 25 01:05:44.315347 kernel: raid6: .... xor() 12289 MB/s, rmw enabled
Mar 25 01:05:44.315368 kernel: raid6: using neon recovery algorithm
Mar 25 01:05:44.320134 kernel: xor: measuring software checksum speed
Mar 25 01:05:44.320153 kernel: 8regs : 18722 MB/sec
Mar 25 01:05:44.321356 kernel: 32regs : 21647 MB/sec
Mar 25 01:05:44.322592 kernel: arm64_neon : 27738 MB/sec
Mar 25 01:05:44.322612 kernel: xor: using function: arm64_neon (27738 MB/sec)
Mar 25 01:05:44.374136 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 25 01:05:44.384756 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 25 01:05:44.387862 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:05:44.414142 systemd-udevd[463]: Using default interface naming scheme 'v255'.
Mar 25 01:05:44.417812 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:05:44.420409 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 25 01:05:44.458472 dracut-pre-trigger[465]: rd.md=0: removing MD RAID activation
Mar 25 01:05:44.488429 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 25 01:05:44.500099 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 25 01:05:44.554968 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:05:44.562010 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 25 01:05:44.589168 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 25 01:05:44.590739 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 25 01:05:44.592594 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:05:44.596012 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 25 01:05:44.600720 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 25 01:05:44.622856 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues
Mar 25 01:05:44.635620 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB)
Mar 25 01:05:44.635741 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 25 01:05:44.635754 kernel: GPT:9289727 != 19775487
Mar 25 01:05:44.635780 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 25 01:05:44.635791 kernel: GPT:9289727 != 19775487
Mar 25 01:05:44.635800 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 25 01:05:44.635809 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 25 01:05:44.630535 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 25 01:05:44.639678 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 25 01:05:44.639792 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:05:44.644596 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:05:44.646259 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:05:44.646430 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:05:44.662210 kernel: BTRFS: device fsid bf348154-9cb1-474d-801c-0e035a5758cf devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (526)
Mar 25 01:05:44.662238 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (509)
Mar 25 01:05:44.653459 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:05:44.662102 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:05:44.674637 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Mar 25 01:05:44.687139 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:05:44.700038 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Mar 25 01:05:44.706410 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
Mar 25 01:05:44.707880 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Mar 25 01:05:44.717463 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Mar 25 01:05:44.719478 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 25 01:05:44.721456 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:05:44.757070 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:05:44.834238 disk-uuid[554]: Primary Header is updated.
Mar 25 01:05:44.834238 disk-uuid[554]: Secondary Entries is updated.
Mar 25 01:05:44.834238 disk-uuid[554]: Secondary Header is updated.
Mar 25 01:05:44.842125 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 25 01:05:45.849864 disk-uuid[563]: The operation has completed successfully.
Mar 25 01:05:45.850975 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Mar 25 01:05:45.883565 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 25 01:05:45.883679 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 25 01:05:45.908322 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 25 01:05:45.922019 sh[574]: Success
Mar 25 01:05:45.947139 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
Mar 25 01:05:45.975168 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 25 01:05:45.978287 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 25 01:05:45.994677 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 25 01:05:46.009948 kernel: BTRFS info (device dm-0): first mount of filesystem bf348154-9cb1-474d-801c-0e035a5758cf
Mar 25 01:05:46.009970 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Mar 25 01:05:46.009981 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 25 01:05:46.009991 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 25 01:05:46.010644 kernel: BTRFS info (device dm-0): using free space tree
Mar 25 01:05:46.014661 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 25 01:05:46.015926 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 25 01:05:46.019342 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 25 01:05:46.030473 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 25 01:05:46.052134 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9
Mar 25 01:05:46.052186 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 25 01:05:46.052197 kernel: BTRFS info (device vda6): using free space tree
Mar 25 01:05:46.056160 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 25 01:05:46.061138 kernel: BTRFS info (device vda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9
Mar 25 01:05:46.063715 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 25 01:05:46.065776 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 25 01:05:46.137143 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 25 01:05:46.141164 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:05:46.188515 ignition[663]: Ignition 2.20.0
Mar 25 01:05:46.188524 ignition[663]: Stage: fetch-offline
Mar 25 01:05:46.188554 ignition[663]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:05:46.188562 ignition[663]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 25 01:05:46.188821 ignition[663]: parsed url from cmdline: ""
Mar 25 01:05:46.188824 ignition[663]: no config URL provided
Mar 25 01:05:46.188828 ignition[663]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:05:46.188835 ignition[663]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:05:46.188858 ignition[663]: op(1): [started] loading QEMU firmware config module
Mar 25 01:05:46.188862 ignition[663]: op(1): executing: "modprobe" "qemu_fw_cfg"
Mar 25 01:05:46.195969 systemd-networkd[757]: lo: Link UP
Mar 25 01:05:46.195973 systemd-networkd[757]: lo: Gained carrier
Mar 25 01:05:46.197553 systemd-networkd[757]: Enumeration completed
Mar 25 01:05:46.198309 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:05:46.198884 systemd-networkd[757]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:05:46.198888 systemd-networkd[757]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:05:46.199491 systemd-networkd[757]: eth0: Link UP
Mar 25 01:05:46.206305 ignition[663]: op(1): [finished] loading QEMU firmware config module
Mar 25 01:05:46.199493 systemd-networkd[757]: eth0: Gained carrier
Mar 25 01:05:46.199500 systemd-networkd[757]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:05:46.199787 systemd[1]: Reached target network.target - Network.
Mar 25 01:05:46.226174 systemd-networkd[757]: eth0: DHCPv4 address 10.0.0.8/16, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 25 01:05:46.251253 ignition[663]: parsing config with SHA512: 816e8d98d1fe31376bb9292b0b901e181beaa86126f6caf738b4d39550c4a4b4a1589cf70b78a55b8ae905f0cbe0418581d76ce5b668bd55c6a9b773c76eefa5
Mar 25 01:05:46.258170 unknown[663]: fetched base config from "system"
Mar 25 01:05:46.258997 unknown[663]: fetched user config from "qemu"
Mar 25 01:05:46.259574 ignition[663]: fetch-offline: fetch-offline passed
Mar 25 01:05:46.259664 ignition[663]: Ignition finished successfully
Mar 25 01:05:46.261133 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 25 01:05:46.262945 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Mar 25 01:05:46.263716 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 25 01:05:46.292360 ignition[771]: Ignition 2.20.0
Mar 25 01:05:46.292370 ignition[771]: Stage: kargs
Mar 25 01:05:46.292524 ignition[771]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:05:46.292534 ignition[771]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 25 01:05:46.293376 ignition[771]: kargs: kargs passed
Mar 25 01:05:46.297207 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 25 01:05:46.293421 ignition[771]: Ignition finished successfully
Mar 25 01:05:46.300245 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 25 01:05:46.320647 ignition[779]: Ignition 2.20.0
Mar 25 01:05:46.320658 ignition[779]: Stage: disks
Mar 25 01:05:46.320805 ignition[779]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:05:46.320814 ignition[779]: no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 25 01:05:46.321640 ignition[779]: disks: disks passed
Mar 25 01:05:46.324827 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 25 01:05:46.321681 ignition[779]: Ignition finished successfully
Mar 25 01:05:46.325992 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 25 01:05:46.327260 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 25 01:05:46.329191 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 01:05:46.330778 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:05:46.332678 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:05:46.335323 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 25 01:05:46.359959 systemd-fsck[789]: ROOT: clean, 14/553520 files, 52654/553472 blocks
Mar 25 01:05:46.363579 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 25 01:05:46.366211 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 25 01:05:46.421129 kernel: EXT4-fs (vda9): mounted filesystem a7a89271-ee7d-4bda-a834-705261d6cda9 r/w with ordered data mode. Quota mode: none.
Mar 25 01:05:46.421351 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 25 01:05:46.422620 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 25 01:05:46.424832 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:05:46.426369 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 25 01:05:46.427318 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Mar 25 01:05:46.427366 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 25 01:05:46.427403 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 25 01:05:46.439414 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 25 01:05:46.441360 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 25 01:05:46.447143 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (797)
Mar 25 01:05:46.447175 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9
Mar 25 01:05:46.447186 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 25 01:05:46.448478 kernel: BTRFS info (device vda6): using free space tree
Mar 25 01:05:46.451263 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 25 01:05:46.453017 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:05:46.488193 initrd-setup-root[821]: cut: /sysroot/etc/passwd: No such file or directory
Mar 25 01:05:46.492574 initrd-setup-root[828]: cut: /sysroot/etc/group: No such file or directory
Mar 25 01:05:46.496737 initrd-setup-root[835]: cut: /sysroot/etc/shadow: No such file or directory
Mar 25 01:05:46.500243 initrd-setup-root[842]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 25 01:05:46.574789 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 25 01:05:46.576890 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 25 01:05:46.578515 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 25 01:05:46.599130 kernel: BTRFS info (device vda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9
Mar 25 01:05:46.616314 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 25 01:05:46.626399 ignition[910]: INFO : Ignition 2.20.0
Mar 25 01:05:46.626399 ignition[910]: INFO : Stage: mount
Mar 25 01:05:46.627955 ignition[910]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:05:46.627955 ignition[910]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 25 01:05:46.627955 ignition[910]: INFO : mount: mount passed
Mar 25 01:05:46.627955 ignition[910]: INFO : Ignition finished successfully
Mar 25 01:05:46.630197 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 25 01:05:46.633212 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 25 01:05:47.005165 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 25 01:05:47.006635 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:05:47.024884 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (924)
Mar 25 01:05:47.024915 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9
Mar 25 01:05:47.024926 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Mar 25 01:05:47.026518 kernel: BTRFS info (device vda6): using free space tree
Mar 25 01:05:47.029128 kernel: BTRFS info (device vda6): auto enabling async discard
Mar 25 01:05:47.029729 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:05:47.057158 ignition[941]: INFO : Ignition 2.20.0
Mar 25 01:05:47.057158 ignition[941]: INFO : Stage: files
Mar 25 01:05:47.058696 ignition[941]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:05:47.058696 ignition[941]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
Mar 25 01:05:47.058696 ignition[941]: DEBUG : files: compiled without relabeling support, skipping
Mar 25 01:05:47.062029 ignition[941]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 25 01:05:47.062029 ignition[941]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 25 01:05:47.062029 ignition[941]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 25 01:05:47.065936 ignition[941]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 25 01:05:47.065936 ignition[941]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 25 01:05:47.065936 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 25 01:05:47.065936 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Mar 25 01:05:47.062517 unknown[941]: wrote ssh authorized keys file for user: core
Mar 25 01:05:47.110565 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 25 01:05:47.275714 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 25 01:05:47.277582 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1
Mar 25 01:05:47.337326 systemd-networkd[757]: eth0: Gained IPv6LL
Mar 25 01:05:47.704912 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 25 01:05:48.383215 ignition[941]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw"
Mar 25 01:05:48.383215 ignition[941]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 25 01:05:48.386993 ignition[941]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service"
Mar 25 01:05:48.400398 ignition[941]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service"
Mar 25 01:05:48.403614 ignition[941]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Mar 25 01:05:48.406149 ignition[941]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service"
Mar 25 01:05:48.406149 ignition[941]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service"
Mar 25 01:05:48.406149 ignition[941]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service"
Mar 25 01:05:48.406149 ignition[941]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:05:48.406149 ignition[941]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:05:48.406149 ignition[941]: INFO : files: files passed
Mar 25 01:05:48.406149 ignition[941]: INFO : Ignition finished successfully
Mar 25 01:05:48.407173 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 25 01:05:48.410242 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 25 01:05:48.413764 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 25 01:05:48.431335 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 25 01:05:48.431446 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 25 01:05:48.435187 initrd-setup-root-after-ignition[970]: grep: /sysroot/oem/oem-release: No such file or directory
Mar 25 01:05:48.436410 initrd-setup-root-after-ignition[973]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:05:48.436410 initrd-setup-root-after-ignition[973]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:05:48.439355 initrd-setup-root-after-ignition[977]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:05:48.442098 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 25 01:05:48.443619 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 25 01:05:48.445917 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 25 01:05:48.474757 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 25 01:05:48.474868 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 25 01:05:48.476953 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 25 01:05:48.478706 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 25 01:05:48.480446 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 25 01:05:48.481099 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 25 01:05:48.511145 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 25 01:05:48.513458 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:05:48.537233 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:05:48.538417 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:05:48.540336 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:05:48.541992 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:05:48.542101 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:05:48.544456 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:05:48.546337 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:05:48.547937 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:05:48.549584 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:05:48.551413 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:05:48.553162 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:05:48.554925 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:05:48.556758 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:05:48.558599 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:05:48.560160 systemd[1]: Stopped target swap.target - Swaps. Mar 25 01:05:48.561637 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:05:48.561749 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:05:48.563878 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:05:48.564990 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:05:48.566836 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:05:48.566938 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:05:48.568779 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:05:48.568887 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:05:48.571368 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:05:48.571478 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:05:48.573599 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:05:48.574983 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:05:48.575176 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:05:48.576923 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:05:48.578553 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:05:48.580407 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:05:48.580510 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:05:48.581882 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:05:48.581959 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:05:48.583675 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:05:48.583787 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. 
Mar 25 01:05:48.585987 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:05:48.586091 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:05:48.588355 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 01:05:48.590497 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:05:48.591327 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:05:48.591476 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:05:48.593163 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:05:48.593266 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:05:48.598530 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:05:48.598606 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:05:48.606732 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:05:48.612024 ignition[997]: INFO : Ignition 2.20.0 Mar 25 01:05:48.612024 ignition[997]: INFO : Stage: umount Mar 25 01:05:48.613580 ignition[997]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:05:48.613580 ignition[997]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:05:48.615645 ignition[997]: INFO : umount: umount passed Mar 25 01:05:48.615645 ignition[997]: INFO : Ignition finished successfully Mar 25 01:05:48.615924 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:05:48.617173 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:05:48.619258 systemd[1]: Stopped target network.target - Network. Mar 25 01:05:48.620674 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:05:48.620738 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:05:48.622307 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:05:48.622366 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:05:48.623945 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:05:48.623988 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:05:48.625778 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:05:48.625822 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:05:48.627631 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:05:48.629330 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:05:48.631150 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:05:48.631238 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:05:48.632785 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:05:48.632864 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:05:48.636165 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:05:48.637030 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:05:48.637088 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:05:48.638565 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:05:48.638611 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:05:48.641877 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. 
Mar 25 01:05:48.643169 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:05:48.643253 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:05:48.646613 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:05:48.646999 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:05:48.647052 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:05:48.649592 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:05:48.650747 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:05:48.650800 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:05:48.653023 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:05:48.653066 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:05:48.655617 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:05:48.655659 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:05:48.657784 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:05:48.661287 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:05:48.678802 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:05:48.678903 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:05:48.680857 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:05:48.680957 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:05:48.683220 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:05:48.683279 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:05:48.684413 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:05:48.684445 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:05:48.686291 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:05:48.686338 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:05:48.688915 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:05:48.688967 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:05:48.691569 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:05:48.691622 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:05:48.697061 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:05:48.698413 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 01:05:48.698471 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:05:48.701540 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 25 01:05:48.701583 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:05:48.703568 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:05:48.703610 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:05:48.705680 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Mar 25 01:05:48.705721 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:05:48.718756 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:05:48.718845 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:05:48.720985 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:05:48.723335 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:05:48.739418 systemd[1]: Switching root. Mar 25 01:05:48.774008 systemd-journald[236]: Journal stopped Mar 25 01:05:49.515496 systemd-journald[236]: Received SIGTERM from PID 1 (systemd). Mar 25 01:05:49.515551 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:05:49.515563 kernel: SELinux: policy capability open_perms=1 Mar 25 01:05:49.515572 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:05:49.515588 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:05:49.515597 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:05:49.515607 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:05:49.515615 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:05:49.515624 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:05:49.515633 kernel: audit: type=1403 audit(1742864748.913:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:05:49.515644 systemd[1]: Successfully loaded SELinux policy in 34.121ms. Mar 25 01:05:49.515660 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.391ms. Mar 25 01:05:49.515671 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:05:49.515684 systemd[1]: Detected virtualization kvm. Mar 25 01:05:49.515694 systemd[1]: Detected architecture arm64. Mar 25 01:05:49.515704 systemd[1]: Detected first boot. Mar 25 01:05:49.515714 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:05:49.515724 zram_generator::config[1046]: No configuration found. Mar 25 01:05:49.515735 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:05:49.515748 systemd[1]: Populated /etc with preset unit settings. Mar 25 01:05:49.515759 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:05:49.515771 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:05:49.515781 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:05:49.515791 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:05:49.515801 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:05:49.515812 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:05:49.515822 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:05:49.515832 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:05:49.515842 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:05:49.515852 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
Mar 25 01:05:49.515867 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:05:49.515877 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 01:05:49.515887 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:05:49.515898 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:05:49.515909 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:05:49.515918 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:05:49.515929 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:05:49.515939 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:05:49.515949 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 25 01:05:49.515962 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:05:49.515973 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:05:49.515983 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:05:49.515993 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:05:49.516004 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:05:49.516013 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:05:49.516024 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:05:49.516034 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:05:49.516046 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:05:49.516056 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:05:49.516066 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:05:49.516076 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:05:49.516086 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:05:49.516096 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:05:49.516114 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:05:49.516126 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:05:49.516136 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:05:49.516148 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:05:49.516159 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:05:49.516169 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:05:49.516180 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:05:49.516190 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:05:49.516200 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:05:49.516211 systemd[1]: Reached target machines.target - Containers. Mar 25 01:05:49.516221 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
Mar 25 01:05:49.516232 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:05:49.516242 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:05:49.516252 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:05:49.516262 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:05:49.516272 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:05:49.516282 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:05:49.516292 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 01:05:49.516302 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:05:49.516312 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:05:49.516325 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:05:49.516335 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:05:49.516349 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 01:05:49.516361 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:05:49.516371 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:05:49.516382 kernel: fuse: init (API version 7.39) Mar 25 01:05:49.516391 kernel: loop: module loaded Mar 25 01:05:49.516401 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:05:49.516413 kernel: ACPI: bus type drm_connector registered Mar 25 01:05:49.516422 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:05:49.516433 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:05:49.516443 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:05:49.516453 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:05:49.516463 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:05:49.516475 systemd[1]: verity-setup.service: Deactivated successfully. Mar 25 01:05:49.516485 systemd[1]: Stopped verity-setup.service. Mar 25 01:05:49.516495 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:05:49.516505 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:05:49.516515 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:05:49.516525 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 01:05:49.516535 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:05:49.516545 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:05:49.516556 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:05:49.516585 systemd-journald[1114]: Collecting audit messages is disabled. Mar 25 01:05:49.516608 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. 
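[Editor's note] The modprobe@configfs/dm_mod/drm/efi_pstore/fuse/loop units being started above are instances of systemd's modprobe@.service template. A rough sketch of what that shipped template looks like follows; it is paraphrased from memory and the exact directives may differ in this systemd version:

    # modprobe@.service (sketch, not copied from this image)
    [Unit]
    Description=Load Kernel Module %i
    DefaultDependencies=no

    [Service]
    Type=oneshot
    ExecStart=-/sbin/modprobe -abq %I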
Mar 25 01:05:49.516619 systemd-journald[1114]: Journal started Mar 25 01:05:49.516640 systemd-journald[1114]: Runtime Journal (/run/log/journal/b6498a6ebbae40f9a94619869c0a21a7) is 5.9M, max 47.3M, 41.4M free. Mar 25 01:05:49.291972 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:05:49.309918 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 25 01:05:49.310308 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:05:49.518579 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:05:49.519301 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 01:05:49.519494 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:05:49.520798 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:05:49.520958 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:05:49.522270 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:05:49.522426 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:05:49.523664 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:05:49.523844 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:05:49.525315 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:05:49.525487 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:05:49.526692 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:05:49.526841 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:05:49.528102 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:05:49.529587 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:05:49.530985 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:05:49.532645 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:05:49.544338 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:05:49.546655 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 01:05:49.548629 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 01:05:49.549740 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:05:49.549769 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:05:49.551588 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 01:05:49.557867 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:05:49.559898 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 25 01:05:49.560974 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:05:49.561905 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:05:49.563746 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:05:49.564888 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Mar 25 01:05:49.569062 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 01:05:49.570431 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:05:49.571861 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:05:49.572348 systemd-journald[1114]: Time spent on flushing to /var/log/journal/b6498a6ebbae40f9a94619869c0a21a7 is 17.495ms for 866 entries. Mar 25 01:05:49.572348 systemd-journald[1114]: System Journal (/var/log/journal/b6498a6ebbae40f9a94619869c0a21a7) is 8M, max 195.6M, 187.6M free. Mar 25 01:05:49.604756 systemd-journald[1114]: Received client request to flush runtime journal. Mar 25 01:05:49.604788 kernel: loop0: detected capacity change from 0 to 126448 Mar 25 01:05:49.578171 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:05:49.581080 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:05:49.587141 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:05:49.589224 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:05:49.590596 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:05:49.592137 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:05:49.601406 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:05:49.603701 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 01:05:49.610126 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:05:49.607141 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:05:49.608660 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:05:49.610941 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:05:49.613197 systemd-tmpfiles[1165]: ACLs are not supported, ignoring. Mar 25 01:05:49.613214 systemd-tmpfiles[1165]: ACLs are not supported, ignoring. Mar 25 01:05:49.613762 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:05:49.624919 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:05:49.628364 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 01:05:49.634281 udevadm[1173]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 25 01:05:49.644127 kernel: loop1: detected capacity change from 0 to 103832 Mar 25 01:05:49.644269 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:05:49.658134 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:05:49.660986 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:05:49.681910 systemd-tmpfiles[1187]: ACLs are not supported, ignoring. Mar 25 01:05:49.681931 systemd-tmpfiles[1187]: ACLs are not supported, ignoring. Mar 25 01:05:49.685759 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
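[Editor's note] The journal sizes reported above (runtime journal in /run capped at 47.3M, persistent system journal under /var/log/journal capped at 195.6M, followed by the client request to flush the runtime journal) are governed by journald's size limits. A hedged sketch of a drop-in that would pin them explicitly (values chosen for illustration, not read from this host):

    # /etc/systemd/journald.conf.d/10-size.conf (illustrative)
    [Journal]
    Storage=persistent
    RuntimeMaxUse=48M
    SystemMaxUse=196M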
Mar 25 01:05:49.695125 kernel: loop2: detected capacity change from 0 to 194096 Mar 25 01:05:49.730135 kernel: loop3: detected capacity change from 0 to 126448 Mar 25 01:05:49.737140 kernel: loop4: detected capacity change from 0 to 103832 Mar 25 01:05:49.743137 kernel: loop5: detected capacity change from 0 to 194096 Mar 25 01:05:49.753236 (sd-merge)[1191]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Mar 25 01:05:49.753742 (sd-merge)[1191]: Merged extensions into '/usr'. Mar 25 01:05:49.759132 systemd[1]: Reload requested from client PID 1163 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:05:49.759153 systemd[1]: Reloading... Mar 25 01:05:49.818148 zram_generator::config[1218]: No configuration found. Mar 25 01:05:49.844991 ldconfig[1158]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 01:05:49.920612 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:05:49.970543 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:05:49.970955 systemd[1]: Reloading finished in 210 ms. Mar 25 01:05:49.993826 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 01:05:49.995246 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:05:50.015366 systemd[1]: Starting ensure-sysext.service... Mar 25 01:05:50.017233 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:05:50.033755 systemd[1]: Reload requested from client PID 1253 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:05:50.033770 systemd[1]: Reloading... Mar 25 01:05:50.036488 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:05:50.036692 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:05:50.037343 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:05:50.037555 systemd-tmpfiles[1254]: ACLs are not supported, ignoring. Mar 25 01:05:50.037641 systemd-tmpfiles[1254]: ACLs are not supported, ignoring. Mar 25 01:05:50.040056 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:05:50.040070 systemd-tmpfiles[1254]: Skipping /boot Mar 25 01:05:50.048999 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:05:50.049016 systemd-tmpfiles[1254]: Skipping /boot Mar 25 01:05:50.082170 zram_generator::config[1283]: No configuration found. Mar 25 01:05:50.161792 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:05:50.211738 systemd[1]: Reloading finished in 177 ms. Mar 25 01:05:50.223707 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:05:50.236305 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:05:50.244696 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
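[Editor's note] The (sd-merge) entries above show systemd-sysext overlaying the containerd-flatcar, docker-flatcar and kubernetes extension images onto /usr, followed by a daemon reload; the kubernetes image is the one Ignition linked into /etc/extensions earlier. As a quick reference (standard systemd-sysext commands, not taken from this log), the merge state can be inspected or redone with:

    # list merged extension images per hierarchy
    systemd-sysext status
    # unmerge and re-merge after adding or removing *.raw images under /etc/extensions or /var/lib/extensions
    systemd-sysext refresh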
Mar 25 01:05:50.247132 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 25 01:05:50.257102 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 25 01:05:50.263328 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:05:50.265906 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:05:50.268662 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 25 01:05:50.272499 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:05:50.277975 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:05:50.281361 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:05:50.287663 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:05:50.288968 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:05:50.289087 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:05:50.290799 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 25 01:05:50.306198 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 25 01:05:50.308281 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:05:50.308452 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:05:50.310319 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:05:50.310472 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:05:50.312617 systemd-udevd[1329]: Using default interface naming scheme 'v255'. Mar 25 01:05:50.316880 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 25 01:05:50.318725 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:05:50.320147 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:05:50.330594 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:05:50.333851 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 25 01:05:50.340742 systemd[1]: Finished ensure-sysext.service. Mar 25 01:05:50.343530 augenrules[1363]: No rules Mar 25 01:05:50.345531 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:05:50.346080 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:05:50.349698 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:05:50.352397 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:05:50.360247 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:05:50.366721 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:05:50.371368 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Mar 25 01:05:50.372648 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:05:50.372695 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:05:50.378846 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:05:50.386834 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 25 01:05:50.388975 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 25 01:05:50.389999 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:05:50.414888 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 25 01:05:50.416619 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:05:50.417319 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:05:50.419278 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:05:50.419449 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:05:50.420939 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:05:50.421080 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:05:50.423284 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:05:50.423447 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:05:50.424838 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 25 01:05:50.431214 systemd-resolved[1323]: Positive Trust Anchors: Mar 25 01:05:50.432009 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 25 01:05:50.432509 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:05:50.432574 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:05:50.434692 systemd-resolved[1323]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:05:50.434728 systemd-resolved[1323]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:05:50.440597 systemd-resolved[1323]: Defaulting to hostname 'linux'. Mar 25 01:05:50.442041 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:05:50.443386 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
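[Editor's note] The positive trust anchor printed by systemd-resolved above is the root-zone KSK DS record it ships with, and the long negative list exempts private and special-use domains from DNSSEC validation. For reference, an additional anchor can be supplied via a dnssec-trust-anchors.d drop-in; a sketch (file name illustrative, record copied from the log):

    # /etc/dnssec-trust-anchors.d/root.positive (illustrative drop-in)
    . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d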
Mar 25 01:05:50.471137 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1360) Mar 25 01:05:50.491168 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 25 01:05:50.493206 systemd[1]: Reached target time-set.target - System Time Set. Mar 25 01:05:50.520674 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:05:50.524373 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 25 01:05:50.530950 systemd-networkd[1390]: lo: Link UP Mar 25 01:05:50.531250 systemd-networkd[1390]: lo: Gained carrier Mar 25 01:05:50.532337 systemd-networkd[1390]: Enumeration completed Mar 25 01:05:50.542557 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:05:50.545817 systemd[1]: Reached target network.target - Network. Mar 25 01:05:50.546096 systemd-networkd[1390]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:05:50.546174 systemd-networkd[1390]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:05:50.546832 systemd-networkd[1390]: eth0: Link UP Mar 25 01:05:50.547002 systemd-networkd[1390]: eth0: Gained carrier Mar 25 01:05:50.547063 systemd-networkd[1390]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:05:50.548234 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 25 01:05:50.550668 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 25 01:05:50.553685 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:05:50.560591 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 25 01:05:50.564578 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 25 01:05:50.568218 systemd-networkd[1390]: eth0: DHCPv4 address 10.0.0.8/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 25 01:05:50.569318 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 25 01:05:50.569531 systemd-timesyncd[1392]: Network configuration changed, trying to establish connection. Mar 25 01:05:50.574077 systemd-timesyncd[1392]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 25 01:05:50.574144 systemd-timesyncd[1392]: Initial clock synchronization to Tue 2025-03-25 01:05:50.378175 UTC. Mar 25 01:05:50.575508 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 25 01:05:50.588605 lvm[1417]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:05:50.602920 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:05:50.624807 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 25 01:05:50.626425 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:05:50.627553 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:05:50.628778 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
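[Editor's note] Above, systemd-networkd matches eth0 against the catch-all /usr/lib/systemd/network/zz-default.network, brings it up via DHCPv4 (10.0.0.8/16, gateway 10.0.0.1), and systemd-timesyncd then synchronizes against 10.0.0.1:123. A minimal sketch of a .network unit of that catch-all shape (contents assumed, not read from this image):

    # zz-default.network (sketch of a catch-all DHCP network unit)
    [Match]
    Name=*

    [Network]
    DHCP=yes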
Mar 25 01:05:50.630061 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 25 01:05:50.631493 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 25 01:05:50.632839 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 25 01:05:50.634119 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 25 01:05:50.635436 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 25 01:05:50.635472 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:05:50.636365 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:05:50.637886 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 25 01:05:50.640433 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 25 01:05:50.643752 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 25 01:05:50.645246 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 25 01:05:50.646530 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 25 01:05:50.653165 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 25 01:05:50.655133 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 25 01:05:50.657599 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 25 01:05:50.659277 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 25 01:05:50.660504 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:05:50.661451 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:05:50.662415 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:05:50.662445 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:05:50.663480 systemd[1]: Starting containerd.service - containerd container runtime... Mar 25 01:05:50.668185 lvm[1426]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:05:50.665652 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 25 01:05:50.667603 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 25 01:05:50.673712 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 25 01:05:50.675244 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 25 01:05:50.676424 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 25 01:05:50.683592 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 25 01:05:50.685833 jq[1429]: false Mar 25 01:05:50.688431 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
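[Editor's note] Several of the units above are socket-activated rather than started directly (dbus.socket, docker.socket, sshd.socket and the systemd-ssh-generator sockets, systemd-hostnamed.socket). The sockets currently held open, and the service each one activates, can be listed with the standard command:

    systemctl list-sockets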
Mar 25 01:05:50.691775 extend-filesystems[1430]: Found loop3 Mar 25 01:05:50.691775 extend-filesystems[1430]: Found loop4 Mar 25 01:05:50.691775 extend-filesystems[1430]: Found loop5 Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda1 Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda2 Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda3 Mar 25 01:05:50.698019 extend-filesystems[1430]: Found usr Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda4 Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda6 Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda7 Mar 25 01:05:50.698019 extend-filesystems[1430]: Found vda9 Mar 25 01:05:50.698019 extend-filesystems[1430]: Checking size of /dev/vda9 Mar 25 01:05:50.692707 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 25 01:05:50.715685 extend-filesystems[1430]: Resized partition /dev/vda9 Mar 25 01:05:50.698007 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 25 01:05:50.717586 extend-filesystems[1449]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:05:50.716923 dbus-daemon[1428]: [system] SELinux support is enabled Mar 25 01:05:50.700311 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 25 01:05:50.700879 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 25 01:05:50.702460 systemd[1]: Starting update-engine.service - Update Engine... Mar 25 01:05:50.707305 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 25 01:05:50.711162 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 25 01:05:50.720920 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 25 01:05:50.722138 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 25 01:05:50.725904 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 25 01:05:50.726217 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 25 01:05:50.726513 systemd[1]: motdgen.service: Deactivated successfully. Mar 25 01:05:50.728206 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 25 01:05:50.733340 jq[1446]: true Mar 25 01:05:50.741969 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 25 01:05:50.742214 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 25 01:05:50.746149 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1366) Mar 25 01:05:50.762133 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 25 01:05:50.772035 jq[1455]: true Mar 25 01:05:50.775015 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 25 01:05:50.782381 update_engine[1444]: I20250325 01:05:50.774688 1444 main.cc:92] Flatcar Update Engine starting Mar 25 01:05:50.775056 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Mar 25 01:05:50.777645 (ntainerd)[1456]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:05:50.779758 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:05:50.779780 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 25 01:05:50.786113 extend-filesystems[1449]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 25 01:05:50.786113 extend-filesystems[1449]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 25 01:05:50.786113 extend-filesystems[1449]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 25 01:05:50.790657 extend-filesystems[1430]: Resized filesystem in /dev/vda9 Mar 25 01:05:50.788431 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:05:50.795384 tar[1453]: linux-arm64/helm Mar 25 01:05:50.795619 update_engine[1444]: I20250325 01:05:50.794317 1444 update_check_scheduler.cc:74] Next update check in 5m47s Mar 25 01:05:50.790156 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 25 01:05:50.793179 systemd[1]: Started update-engine.service - Update Engine. Mar 25 01:05:50.801443 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:05:50.811284 systemd-logind[1439]: Watching system buttons on /dev/input/event0 (Power Button) Mar 25 01:05:50.811552 systemd-logind[1439]: New seat seat0. Mar 25 01:05:50.814580 systemd[1]: Started systemd-logind.service - User Login Management. Mar 25 01:05:50.893560 locksmithd[1478]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:05:50.966529 bash[1485]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:05:50.968166 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:05:50.973748 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
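[Editor's note] The extend-filesystems entries above grow the root filesystem on /dev/vda9 on-line from 553472 to 1864699 4k blocks using resize2fs 1.47.2. Roughly the same effect by hand (device names taken from the log; the growpart step is an assumption about how the partition itself was enlarged) would be:

    # enlarge partition 9 on /dev/vda, then grow the mounted ext4 filesystem on-line
    growpart /dev/vda 9
    resize2fs /dev/vda9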
Mar 25 01:05:51.007223 containerd[1456]: time="2025-03-25T01:05:51Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:05:51.008438 containerd[1456]: time="2025-03-25T01:05:51.008395513Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:05:51.019041 containerd[1456]: time="2025-03-25T01:05:51.018990683Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.493µs" Mar 25 01:05:51.019041 containerd[1456]: time="2025-03-25T01:05:51.019036298Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:05:51.019154 containerd[1456]: time="2025-03-25T01:05:51.019056823Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:05:51.019268 containerd[1456]: time="2025-03-25T01:05:51.019244475Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:05:51.019301 containerd[1456]: time="2025-03-25T01:05:51.019269448Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:05:51.019301 containerd[1456]: time="2025-03-25T01:05:51.019297660Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:05:51.019368 containerd[1456]: time="2025-03-25T01:05:51.019350026Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:05:51.019391 containerd[1456]: time="2025-03-25T01:05:51.019366024Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:05:51.019711 containerd[1456]: time="2025-03-25T01:05:51.019688922Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:05:51.019742 containerd[1456]: time="2025-03-25T01:05:51.019710969Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:05:51.019742 containerd[1456]: time="2025-03-25T01:05:51.019722441Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:05:51.019742 containerd[1456]: time="2025-03-25T01:05:51.019730479Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:05:51.019819 containerd[1456]: time="2025-03-25T01:05:51.019802550Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:05:51.020042 containerd[1456]: time="2025-03-25T01:05:51.020021262Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:05:51.020073 containerd[1456]: time="2025-03-25T01:05:51.020058683Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 Mar 25 01:05:51.020073 containerd[1456]: time="2025-03-25T01:05:51.020069609Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:05:51.020139 containerd[1456]: time="2025-03-25T01:05:51.020123653Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:05:51.020394 containerd[1456]: time="2025-03-25T01:05:51.020375259Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:05:51.020461 containerd[1456]: time="2025-03-25T01:05:51.020443311Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:05:51.024334 containerd[1456]: time="2025-03-25T01:05:51.024300559Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:05:51.024367 containerd[1456]: time="2025-03-25T01:05:51.024358349Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:05:51.024386 containerd[1456]: time="2025-03-25T01:05:51.024373411Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:05:51.024416 containerd[1456]: time="2025-03-25T01:05:51.024385469Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:05:51.024416 containerd[1456]: time="2025-03-25T01:05:51.024398853Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:05:51.024416 containerd[1456]: time="2025-03-25T01:05:51.024409701Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:05:51.024463 containerd[1456]: time="2025-03-25T01:05:51.024422226Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:05:51.024463 containerd[1456]: time="2025-03-25T01:05:51.024435572Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:05:51.024463 containerd[1456]: time="2025-03-25T01:05:51.024449229Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:05:51.024511 containerd[1456]: time="2025-03-25T01:05:51.024461950Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:05:51.024511 containerd[1456]: time="2025-03-25T01:05:51.024473188Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:05:51.024511 containerd[1456]: time="2025-03-25T01:05:51.024485830Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:05:51.024638 containerd[1456]: time="2025-03-25T01:05:51.024617331Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:05:51.024661 containerd[1456]: time="2025-03-25T01:05:51.024650069Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:05:51.024679 containerd[1456]: time="2025-03-25T01:05:51.024665170Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 01:05:51.024700 containerd[1456]: time="2025-03-25T01:05:51.024677071Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff 
type=io.containerd.grpc.v1 Mar 25 01:05:51.024700 containerd[1456]: time="2025-03-25T01:05:51.024688348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:05:51.024738 containerd[1456]: time="2025-03-25T01:05:51.024698455Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:05:51.024738 containerd[1456]: time="2025-03-25T01:05:51.024710903Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:05:51.024738 containerd[1456]: time="2025-03-25T01:05:51.024721399Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:05:51.024798 containerd[1456]: time="2025-03-25T01:05:51.024742002Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 25 01:05:51.024798 containerd[1456]: time="2025-03-25T01:05:51.024755972Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:05:51.024798 containerd[1456]: time="2025-03-25T01:05:51.024767249Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:05:51.025223 containerd[1456]: time="2025-03-25T01:05:51.025205960Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:05:51.025247 containerd[1456]: time="2025-03-25T01:05:51.025227655Z" level=info msg="Start snapshots syncer" Mar 25 01:05:51.025265 containerd[1456]: time="2025-03-25T01:05:51.025255984Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:05:51.025531 containerd[1456]: time="2025-03-25T01:05:51.025495104Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:05:51.025622 containerd[1456]: time="2025-03-25T01:05:51.025550006Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:05:51.025649 containerd[1456]: time="2025-03-25T01:05:51.025621766Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:05:51.025763 containerd[1456]: time="2025-03-25T01:05:51.025741091Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:05:51.025789 containerd[1456]: time="2025-03-25T01:05:51.025773557Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:05:51.025789 containerd[1456]: time="2025-03-25T01:05:51.025786121Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:05:51.025829 containerd[1456]: time="2025-03-25T01:05:51.025797906Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:05:51.025829 containerd[1456]: time="2025-03-25T01:05:51.025811758Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:05:51.025829 containerd[1456]: time="2025-03-25T01:05:51.025822333Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:05:51.025880 containerd[1456]: time="2025-03-25T01:05:51.025832907Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:05:51.025880 containerd[1456]: time="2025-03-25T01:05:51.025868260Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:05:51.025912 containerd[1456]: 
time="2025-03-25T01:05:51.025880708Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:05:51.025912 containerd[1456]: time="2025-03-25T01:05:51.025891712Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:05:51.025947 containerd[1456]: time="2025-03-25T01:05:51.025916880Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:05:51.025947 containerd[1456]: time="2025-03-25T01:05:51.025931006Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:05:51.025947 containerd[1456]: time="2025-03-25T01:05:51.025940059Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:05:51.025994 containerd[1456]: time="2025-03-25T01:05:51.025949814Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:05:51.025994 containerd[1456]: time="2025-03-25T01:05:51.025958203Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:05:51.025994 containerd[1456]: time="2025-03-25T01:05:51.025968934Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:05:51.025994 containerd[1456]: time="2025-03-25T01:05:51.025979392Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:05:51.026142 containerd[1456]: time="2025-03-25T01:05:51.026125993Z" level=info msg="runtime interface created" Mar 25 01:05:51.026142 containerd[1456]: time="2025-03-25T01:05:51.026138948Z" level=info msg="created NRI interface" Mar 25 01:05:51.026180 containerd[1456]: time="2025-03-25T01:05:51.026153698Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:05:51.026180 containerd[1456]: time="2025-03-25T01:05:51.026166145Z" level=info msg="Connect containerd service" Mar 25 01:05:51.026212 containerd[1456]: time="2025-03-25T01:05:51.026198025Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:05:51.029173 containerd[1456]: time="2025-03-25T01:05:51.029130948Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:05:51.148914 containerd[1456]: time="2025-03-25T01:05:51.148869640Z" level=info msg="Start subscribing containerd event" Mar 25 01:05:51.148995 containerd[1456]: time="2025-03-25T01:05:51.148931371Z" level=info msg="Start recovering state" Mar 25 01:05:51.149038 containerd[1456]: time="2025-03-25T01:05:51.149021665Z" level=info msg="Start event monitor" Mar 25 01:05:51.149062 containerd[1456]: time="2025-03-25T01:05:51.149046990Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:05:51.149062 containerd[1456]: time="2025-03-25T01:05:51.149055965Z" level=info msg="Start streaming server" Mar 25 01:05:51.149102 containerd[1456]: time="2025-03-25T01:05:51.149064705Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 01:05:51.149102 containerd[1456]: 
time="2025-03-25T01:05:51.149073641Z" level=info msg="runtime interface starting up..." Mar 25 01:05:51.149102 containerd[1456]: time="2025-03-25T01:05:51.149079650Z" level=info msg="starting plugins..." Mar 25 01:05:51.149102 containerd[1456]: time="2025-03-25T01:05:51.149093698Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:05:51.149538 containerd[1456]: time="2025-03-25T01:05:51.149510440Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:05:51.149588 containerd[1456]: time="2025-03-25T01:05:51.149575371Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:05:51.152273 systemd[1]: Started containerd.service - containerd container runtime. Mar 25 01:05:51.153760 containerd[1456]: time="2025-03-25T01:05:51.153680947Z" level=info msg="containerd successfully booted in 0.146961s" Mar 25 01:05:51.168541 tar[1453]: linux-arm64/LICENSE Mar 25 01:05:51.168621 tar[1453]: linux-arm64/README.md Mar 25 01:05:51.189554 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:05:51.413328 sshd_keygen[1452]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:05:51.432647 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:05:51.436540 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 25 01:05:51.457290 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:05:51.457484 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:05:51.460727 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 25 01:05:51.482959 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:05:51.485867 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 01:05:51.488154 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 25 01:05:51.489590 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 01:05:51.817249 systemd-networkd[1390]: eth0: Gained IPv6LL Mar 25 01:05:51.819271 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:05:51.821474 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:05:51.824029 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 25 01:05:51.826640 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:05:51.838170 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 25 01:05:51.861172 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 25 01:05:51.863387 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 25 01:05:51.863559 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Mar 25 01:05:51.866216 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 25 01:05:52.296161 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:05:52.297702 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 25 01:05:52.298910 systemd[1]: Startup finished in 580ms (kernel) + 5.214s (initrd) + 3.422s (userspace) = 9.216s. 
Mar 25 01:05:52.300746 (kubelet)[1555]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:05:52.735766 kubelet[1555]: E0325 01:05:52.735664 1555 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:05:52.738360 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:05:52.738504 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:05:52.740269 systemd[1]: kubelet.service: Consumed 817ms CPU time, 241.9M memory peak. Mar 25 01:05:56.471565 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:05:56.472770 systemd[1]: Started sshd@0-10.0.0.8:22-10.0.0.1:54504.service - OpenSSH per-connection server daemon (10.0.0.1:54504). Mar 25 01:05:56.569862 sshd[1569]: Accepted publickey for core from 10.0.0.1 port 54504 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:05:56.572022 sshd-session[1569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:05:56.582204 systemd-logind[1439]: New session 1 of user core. Mar 25 01:05:56.583199 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:05:56.584270 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:05:56.610848 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:05:56.613207 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:05:56.633385 (systemd)[1573]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:05:56.635639 systemd-logind[1439]: New session c1 of user core. Mar 25 01:05:56.746432 systemd[1573]: Queued start job for default target default.target. Mar 25 01:05:56.756127 systemd[1573]: Created slice app.slice - User Application Slice. Mar 25 01:05:56.756156 systemd[1573]: Reached target paths.target - Paths. Mar 25 01:05:56.756196 systemd[1573]: Reached target timers.target - Timers. Mar 25 01:05:56.757539 systemd[1573]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:05:56.767614 systemd[1573]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:05:56.767687 systemd[1573]: Reached target sockets.target - Sockets. Mar 25 01:05:56.767730 systemd[1573]: Reached target basic.target - Basic System. Mar 25 01:05:56.767758 systemd[1573]: Reached target default.target - Main User Target. Mar 25 01:05:56.767785 systemd[1573]: Startup finished in 125ms. Mar 25 01:05:56.767945 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:05:56.769521 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 25 01:05:56.833086 systemd[1]: Started sshd@1-10.0.0.8:22-10.0.0.1:54514.service - OpenSSH per-connection server daemon (10.0.0.1:54514). Mar 25 01:05:56.887118 sshd[1584]: Accepted publickey for core from 10.0.0.1 port 54514 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:05:56.888482 sshd-session[1584]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:05:56.893639 systemd-logind[1439]: New session 2 of user core. 
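The kubelet failure at 01:05:52 looks like the usual Flatcar-plus-kubeadm bootstrap pattern rather than a fault: kubelet.service is enabled before `kubeadm init`/`kubeadm join` has written /var/lib/kubelet/config.yaml, so the early starts exit with status 1 and systemd keeps scheduling restarts until the file exists (the same error recurs below at 01:06:03 and 01:06:13). A quick way to check which side is still missing, shown as an illustrative sketch rather than commands taken from this host:

    ls -l /var/lib/kubelet/config.yaml       # written by kubeadm init/join
    systemctl status kubelet --no-pager      # restart counter and last exit status
    journalctl -u kubelet -b --no-pager | tail -n 20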
Mar 25 01:05:56.906310 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 25 01:05:56.958679 sshd[1586]: Connection closed by 10.0.0.1 port 54514 Mar 25 01:05:56.959327 sshd-session[1584]: pam_unix(sshd:session): session closed for user core Mar 25 01:05:56.974317 systemd[1]: sshd@1-10.0.0.8:22-10.0.0.1:54514.service: Deactivated successfully. Mar 25 01:05:56.977697 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:05:56.980177 systemd-logind[1439]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:05:56.981480 systemd[1]: Started sshd@2-10.0.0.8:22-10.0.0.1:54522.service - OpenSSH per-connection server daemon (10.0.0.1:54522). Mar 25 01:05:56.982847 systemd-logind[1439]: Removed session 2. Mar 25 01:05:57.037414 sshd[1591]: Accepted publickey for core from 10.0.0.1 port 54522 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:05:57.038657 sshd-session[1591]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:05:57.042966 systemd-logind[1439]: New session 3 of user core. Mar 25 01:05:57.055269 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:05:57.102563 sshd[1594]: Connection closed by 10.0.0.1 port 54522 Mar 25 01:05:57.103005 sshd-session[1591]: pam_unix(sshd:session): session closed for user core Mar 25 01:05:57.118227 systemd[1]: sshd@2-10.0.0.8:22-10.0.0.1:54522.service: Deactivated successfully. Mar 25 01:05:57.119782 systemd[1]: session-3.scope: Deactivated successfully. Mar 25 01:05:57.121067 systemd-logind[1439]: Session 3 logged out. Waiting for processes to exit. Mar 25 01:05:57.122369 systemd[1]: Started sshd@3-10.0.0.8:22-10.0.0.1:54532.service - OpenSSH per-connection server daemon (10.0.0.1:54532). Mar 25 01:05:57.123153 systemd-logind[1439]: Removed session 3. Mar 25 01:05:57.165298 sshd[1599]: Accepted publickey for core from 10.0.0.1 port 54532 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:05:57.166608 sshd-session[1599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:05:57.170957 systemd-logind[1439]: New session 4 of user core. Mar 25 01:05:57.182279 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:05:57.233716 sshd[1602]: Connection closed by 10.0.0.1 port 54532 Mar 25 01:05:57.234165 sshd-session[1599]: pam_unix(sshd:session): session closed for user core Mar 25 01:05:57.244232 systemd[1]: sshd@3-10.0.0.8:22-10.0.0.1:54532.service: Deactivated successfully. Mar 25 01:05:57.246580 systemd[1]: session-4.scope: Deactivated successfully. Mar 25 01:05:57.247940 systemd-logind[1439]: Session 4 logged out. Waiting for processes to exit. Mar 25 01:05:57.249091 systemd[1]: Started sshd@4-10.0.0.8:22-10.0.0.1:54540.service - OpenSSH per-connection server daemon (10.0.0.1:54540). Mar 25 01:05:57.249978 systemd-logind[1439]: Removed session 4. Mar 25 01:05:57.301171 sshd[1607]: Accepted publickey for core from 10.0.0.1 port 54540 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:05:57.302690 sshd-session[1607]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:05:57.308091 systemd-logind[1439]: New session 5 of user core. Mar 25 01:05:57.323311 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 25 01:05:57.383416 sudo[1611]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:05:57.383704 sudo[1611]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:05:57.396102 sudo[1611]: pam_unix(sudo:session): session closed for user root Mar 25 01:05:57.397593 sshd[1610]: Connection closed by 10.0.0.1 port 54540 Mar 25 01:05:57.397988 sshd-session[1607]: pam_unix(sshd:session): session closed for user core Mar 25 01:05:57.410323 systemd[1]: sshd@4-10.0.0.8:22-10.0.0.1:54540.service: Deactivated successfully. Mar 25 01:05:57.411937 systemd[1]: session-5.scope: Deactivated successfully. Mar 25 01:05:57.413295 systemd-logind[1439]: Session 5 logged out. Waiting for processes to exit. Mar 25 01:05:57.414692 systemd[1]: Started sshd@5-10.0.0.8:22-10.0.0.1:54554.service - OpenSSH per-connection server daemon (10.0.0.1:54554). Mar 25 01:05:57.415472 systemd-logind[1439]: Removed session 5. Mar 25 01:05:57.470146 sshd[1616]: Accepted publickey for core from 10.0.0.1 port 54554 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:05:57.471491 sshd-session[1616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:05:57.475782 systemd-logind[1439]: New session 6 of user core. Mar 25 01:05:57.487272 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 25 01:05:57.539380 sudo[1621]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:05:57.539673 sudo[1621]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:05:57.542804 sudo[1621]: pam_unix(sudo:session): session closed for user root Mar 25 01:05:57.547811 sudo[1620]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:05:57.548141 sudo[1620]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:05:57.557485 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:05:57.589345 augenrules[1643]: No rules Mar 25 01:05:57.590547 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:05:57.592160 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:05:57.593157 sudo[1620]: pam_unix(sudo:session): session closed for user root Mar 25 01:05:57.594535 sshd[1619]: Connection closed by 10.0.0.1 port 54554 Mar 25 01:05:57.594863 sshd-session[1616]: pam_unix(sshd:session): session closed for user core Mar 25 01:05:57.602381 systemd[1]: sshd@5-10.0.0.8:22-10.0.0.1:54554.service: Deactivated successfully. Mar 25 01:05:57.603970 systemd[1]: session-6.scope: Deactivated successfully. Mar 25 01:05:57.604685 systemd-logind[1439]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:05:57.606581 systemd[1]: Started sshd@6-10.0.0.8:22-10.0.0.1:54566.service - OpenSSH per-connection server daemon (10.0.0.1:54566). Mar 25 01:05:57.607374 systemd-logind[1439]: Removed session 6. Mar 25 01:05:57.658183 sshd[1651]: Accepted publickey for core from 10.0.0.1 port 54566 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:05:57.659438 sshd-session[1651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:05:57.663597 systemd-logind[1439]: New session 7 of user core. Mar 25 01:05:57.670267 systemd[1]: Started session-7.scope - Session 7 of User core. 
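The audit-rules exchange in session 6 above explains the "No rules" line: augenrules assembles /etc/audit/audit.rules from the fragments under /etc/audit/rules.d/, and the preceding `rm -rf` removed both 80-selinux.rules and 99-default.rules, so restarting audit-rules.service leaves the kernel with an empty rule set. Illustrative commands to inspect the result (not taken from this host):

    ls /etc/audit/rules.d/        # fragments augenrules would merge
    augenrules --check            # is /etc/audit/audit.rules in sync with rules.d?
    auditctl -l                   # rules currently loaded in the kernel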
Mar 25 01:05:57.720529 sudo[1655]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:05:57.720793 sudo[1655]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:05:58.055548 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:05:58.070452 (dockerd)[1677]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:05:58.326545 dockerd[1677]: time="2025-03-25T01:05:58.326410001Z" level=info msg="Starting up" Mar 25 01:05:58.327322 dockerd[1677]: time="2025-03-25T01:05:58.327295724Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:05:58.431131 dockerd[1677]: time="2025-03-25T01:05:58.430863117Z" level=info msg="Loading containers: start." Mar 25 01:05:58.608128 kernel: Initializing XFRM netlink socket Mar 25 01:05:58.688759 systemd-networkd[1390]: docker0: Link UP Mar 25 01:05:58.768535 dockerd[1677]: time="2025-03-25T01:05:58.768492335Z" level=info msg="Loading containers: done." Mar 25 01:05:58.782159 dockerd[1677]: time="2025-03-25T01:05:58.782092252Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:05:58.782313 dockerd[1677]: time="2025-03-25T01:05:58.782203096Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:05:58.782413 dockerd[1677]: time="2025-03-25T01:05:58.782386436Z" level=info msg="Daemon has completed initialization" Mar 25 01:05:58.812962 dockerd[1677]: time="2025-03-25T01:05:58.812879702Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:05:58.813049 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:05:59.824968 containerd[1456]: time="2025-03-25T01:05:59.824434941Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 25 01:06:00.518499 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1966899608.mount: Deactivated successfully. 
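From here on, the PullImage/ImageCreate lines come from containerd's CRI plugin acting on behalf of a CRI client (most likely the kubeadm flow kicked off by install.sh), not from Docker; the docker daemon above only brings up docker0 and then sits idle. If a pull ever needs to be reproduced or pre-seeded by hand, the same image can be fetched into the k8s.io namespace the CRI plugin registered earlier (illustrative commands, using the tag being pulled here):

    ctr -n k8s.io images pull registry.k8s.io/kube-apiserver:v1.30.11
    crictl images | grep kube-apiserver   # confirm the CRI side sees it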
Mar 25 01:06:02.209958 containerd[1456]: time="2025-03-25T01:06:02.209873284Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:02.210790 containerd[1456]: time="2025-03-25T01:06:02.210738222Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=29793526" Mar 25 01:06:02.211456 containerd[1456]: time="2025-03-25T01:06:02.211391998Z" level=info msg="ImageCreate event name:\"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:02.214088 containerd[1456]: time="2025-03-25T01:06:02.214025235Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:02.215624 containerd[1456]: time="2025-03-25T01:06:02.215372123Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"29790324\" in 2.390894636s" Mar 25 01:06:02.215624 containerd[1456]: time="2025-03-25T01:06:02.215411221Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\"" Mar 25 01:06:02.231956 containerd[1456]: time="2025-03-25T01:06:02.231913227Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 25 01:06:02.988884 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 25 01:06:02.990347 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:06:03.097192 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:03.101092 (kubelet)[1963]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:06:03.141169 kubelet[1963]: E0325 01:06:03.141091 1963 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:06:03.144494 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:06:03.144767 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:06:03.145318 systemd[1]: kubelet.service: Consumed 141ms CPU time, 97.3M memory peak. 
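The "Referenced but unset environment variable ... KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS" notice printed at each kubelet start is benign: the unit's ExecStart expands those variables, and the files that define them are only written later in the bootstrap (kubeadm typically writes KUBELET_KUBEADM_ARGS to /var/lib/kubelet/kubeadm-flags.env). To see exactly which drop-ins and environment files the unit references, a sketch:

    systemctl cat kubelet.service                        # unit plus drop-ins, incl. EnvironmentFile= lines
    cat /var/lib/kubelet/kubeadm-flags.env 2>/dev/null \
      || echo "kubeadm has not written its flags file yet"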
Mar 25 01:06:04.489735 containerd[1456]: time="2025-03-25T01:06:04.489673504Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:04.490242 containerd[1456]: time="2025-03-25T01:06:04.490198390Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=26861169" Mar 25 01:06:04.491339 containerd[1456]: time="2025-03-25T01:06:04.490790743Z" level=info msg="ImageCreate event name:\"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:04.493439 containerd[1456]: time="2025-03-25T01:06:04.493406688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:04.494497 containerd[1456]: time="2025-03-25T01:06:04.494467691Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"28301963\" in 2.262510739s" Mar 25 01:06:04.494548 containerd[1456]: time="2025-03-25T01:06:04.494501823Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\"" Mar 25 01:06:04.510357 containerd[1456]: time="2025-03-25T01:06:04.510321279Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 25 01:06:05.925707 containerd[1456]: time="2025-03-25T01:06:05.925647403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:05.926190 containerd[1456]: time="2025-03-25T01:06:05.926123040Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=16264638" Mar 25 01:06:05.926868 containerd[1456]: time="2025-03-25T01:06:05.926814859Z" level=info msg="ImageCreate event name:\"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:05.929269 containerd[1456]: time="2025-03-25T01:06:05.929237041Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:05.931132 containerd[1456]: time="2025-03-25T01:06:05.931083604Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"17705450\" in 1.420723607s" Mar 25 01:06:05.931198 containerd[1456]: time="2025-03-25T01:06:05.931136125Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\"" Mar 25 01:06:05.947436 
containerd[1456]: time="2025-03-25T01:06:05.947368655Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 25 01:06:07.227459 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1402786374.mount: Deactivated successfully. Mar 25 01:06:07.433518 containerd[1456]: time="2025-03-25T01:06:07.433471085Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:07.434095 containerd[1456]: time="2025-03-25T01:06:07.434054634Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=25771850" Mar 25 01:06:07.435010 containerd[1456]: time="2025-03-25T01:06:07.434963958Z" level=info msg="ImageCreate event name:\"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:07.437256 containerd[1456]: time="2025-03-25T01:06:07.437222651Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:07.438248 containerd[1456]: time="2025-03-25T01:06:07.438162487Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"25770867\" in 1.490756965s" Mar 25 01:06:07.438248 containerd[1456]: time="2025-03-25T01:06:07.438198543Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\"" Mar 25 01:06:07.453069 containerd[1456]: time="2025-03-25T01:06:07.453039764Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 25 01:06:08.033647 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1137700404.mount: Deactivated successfully. 
Mar 25 01:06:09.187615 containerd[1456]: time="2025-03-25T01:06:09.187550055Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:09.188300 containerd[1456]: time="2025-03-25T01:06:09.188247987Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485383" Mar 25 01:06:09.189168 containerd[1456]: time="2025-03-25T01:06:09.189126759Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:09.191762 containerd[1456]: time="2025-03-25T01:06:09.191716695Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:09.193208 containerd[1456]: time="2025-03-25T01:06:09.193172945Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.740094804s" Mar 25 01:06:09.193248 containerd[1456]: time="2025-03-25T01:06:09.193208546Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" Mar 25 01:06:09.208083 containerd[1456]: time="2025-03-25T01:06:09.208044085Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 25 01:06:09.696419 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount752314710.mount: Deactivated successfully. 
Mar 25 01:06:09.701896 containerd[1456]: time="2025-03-25T01:06:09.701346123Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:09.701977 containerd[1456]: time="2025-03-25T01:06:09.701930946Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268823" Mar 25 01:06:09.702940 containerd[1456]: time="2025-03-25T01:06:09.702911052Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:09.705122 containerd[1456]: time="2025-03-25T01:06:09.704793437Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:09.705625 containerd[1456]: time="2025-03-25T01:06:09.705414420Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 497.331222ms" Mar 25 01:06:09.705625 containerd[1456]: time="2025-03-25T01:06:09.705447148Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\"" Mar 25 01:06:09.720416 containerd[1456]: time="2025-03-25T01:06:09.720375641Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 25 01:06:10.299798 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2887027658.mount: Deactivated successfully. Mar 25 01:06:13.395197 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 25 01:06:13.397258 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:06:13.537793 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:13.541755 (kubelet)[2131]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:06:13.608489 kubelet[2131]: E0325 01:06:13.608437 2131 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:06:13.611125 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:06:13.611270 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:06:13.611573 systemd[1]: kubelet.service: Consumed 148ms CPU time, 101.3M memory peak. 
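The images fetched between 01:05:59 and 01:06:13 (kube-apiserver, kube-controller-manager, kube-scheduler and kube-proxy at v1.30.11, coredns v1.11.1, pause 3.9 and etcd 3.5.12-0, whose pull completes just below) match the control-plane set kubeadm pulls for this Kubernetes version. The same list can be printed or pre-pulled ahead of time with:

    kubeadm config images list --kubernetes-version v1.30.11
    kubeadm config images pull --kubernetes-version v1.30.11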
Mar 25 01:06:13.685441 containerd[1456]: time="2025-03-25T01:06:13.685304292Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:13.687168 containerd[1456]: time="2025-03-25T01:06:13.687093688Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=66191474" Mar 25 01:06:13.690945 containerd[1456]: time="2025-03-25T01:06:13.690738834Z" level=info msg="ImageCreate event name:\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:13.693884 containerd[1456]: time="2025-03-25T01:06:13.693844001Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:13.695702 containerd[1456]: time="2025-03-25T01:06:13.695665355Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"66189079\" in 3.975250195s" Mar 25 01:06:13.695748 containerd[1456]: time="2025-03-25T01:06:13.695712614Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\"" Mar 25 01:06:20.079486 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:20.079626 systemd[1]: kubelet.service: Consumed 148ms CPU time, 101.3M memory peak. Mar 25 01:06:20.081775 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:06:20.112203 systemd[1]: Reload requested from client PID 2241 ('systemctl') (unit session-7.scope)... Mar 25 01:06:20.112222 systemd[1]: Reloading... Mar 25 01:06:20.191166 zram_generator::config[2290]: No configuration found. Mar 25 01:06:20.297014 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:06:20.370456 systemd[1]: Reloading finished in 257 ms. Mar 25 01:06:20.413194 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:20.416426 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:06:20.416849 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:06:20.417087 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:20.417179 systemd[1]: kubelet.service: Consumed 86ms CPU time, 82.5M memory peak. Mar 25 01:06:20.419804 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:06:20.527593 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:20.531199 (kubelet)[2331]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:06:20.581169 kubelet[2331]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
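This and the two deprecation warnings that follow refer to flags still passed on the kubelet command line whose replacements live in the KubeletConfiguration file the kubelet is now loading (the /var/lib/kubelet/config.yaml that was missing earlier). As a sketch of the corresponding fields, with values inferred from this log rather than dumped from the actual file:

    # /var/lib/kubelet/config.yaml (illustrative excerpt)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/
    cgroupDriver: systemd

(--pod-infra-container-image has no config-file equivalent; as the next warning notes, the sandbox image is taken from the CRI runtime instead.)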
Mar 25 01:06:20.581169 kubelet[2331]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:06:20.581169 kubelet[2331]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:06:20.581169 kubelet[2331]: I0325 01:06:20.579727 2331 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:06:22.331362 kubelet[2331]: I0325 01:06:22.331317 2331 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 25 01:06:22.331362 kubelet[2331]: I0325 01:06:22.331350 2331 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:06:22.331691 kubelet[2331]: I0325 01:06:22.331545 2331 server.go:927] "Client rotation is on, will bootstrap in background" Mar 25 01:06:22.377576 kubelet[2331]: E0325 01:06:22.377540 2331 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.8:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.377648 kubelet[2331]: I0325 01:06:22.377615 2331 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:06:22.386626 kubelet[2331]: I0325 01:06:22.386601 2331 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:06:22.387676 kubelet[2331]: I0325 01:06:22.387634 2331 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:06:22.387858 kubelet[2331]: I0325 01:06:22.387684 2331 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 25 01:06:22.387942 kubelet[2331]: I0325 01:06:22.387870 2331 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:06:22.387942 kubelet[2331]: I0325 01:06:22.387881 2331 container_manager_linux.go:301] "Creating device plugin manager" Mar 25 01:06:22.388168 kubelet[2331]: I0325 01:06:22.388151 2331 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:06:22.389758 kubelet[2331]: I0325 01:06:22.389738 2331 kubelet.go:400] "Attempting to sync node with API server" Mar 25 01:06:22.389803 kubelet[2331]: I0325 01:06:22.389761 2331 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:06:22.390496 kubelet[2331]: I0325 01:06:22.389950 2331 kubelet.go:312] "Adding apiserver pod source" Mar 25 01:06:22.390496 kubelet[2331]: I0325 01:06:22.390145 2331 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:06:22.391549 kubelet[2331]: W0325 01:06:22.391497 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.8:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.391601 kubelet[2331]: E0325 01:06:22.391554 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.8:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.391722 kubelet[2331]: W0325 01:06:22.391685 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://10.0.0.8:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.391762 kubelet[2331]: E0325 01:06:22.391730 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.8:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.392085 kubelet[2331]: I0325 01:06:22.392064 2331 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:06:22.392444 kubelet[2331]: I0325 01:06:22.392422 2331 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:06:22.392491 kubelet[2331]: W0325 01:06:22.392465 2331 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 01:06:22.393990 kubelet[2331]: I0325 01:06:22.393491 2331 server.go:1264] "Started kubelet" Mar 25 01:06:22.394294 kubelet[2331]: I0325 01:06:22.394232 2331 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:06:22.394907 kubelet[2331]: I0325 01:06:22.394519 2331 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:06:22.394907 kubelet[2331]: I0325 01:06:22.394786 2331 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:06:22.399133 kubelet[2331]: I0325 01:06:22.397206 2331 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:06:22.399133 kubelet[2331]: I0325 01:06:22.397898 2331 server.go:455] "Adding debug handlers to kubelet server" Mar 25 01:06:22.400200 kubelet[2331]: E0325 01:06:22.399501 2331 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.8:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.8:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182fe6491619d686 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-25 01:06:22.393464454 +0000 UTC m=+1.859254248,LastTimestamp:2025-03-25 01:06:22.393464454 +0000 UTC m=+1.859254248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 25 01:06:22.400200 kubelet[2331]: I0325 01:06:22.399910 2331 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 25 01:06:22.400200 kubelet[2331]: I0325 01:06:22.399991 2331 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:06:22.400927 kubelet[2331]: I0325 01:06:22.400887 2331 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:06:22.402130 kubelet[2331]: W0325 01:06:22.401204 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.8:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.402130 kubelet[2331]: E0325 01:06:22.401254 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to 
watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.8:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.402130 kubelet[2331]: E0325 01:06:22.401862 2331 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.8:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.8:6443: connect: connection refused" interval="200ms" Mar 25 01:06:22.402369 kubelet[2331]: E0325 01:06:22.402352 2331 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:06:22.402741 kubelet[2331]: I0325 01:06:22.402705 2331 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:06:22.402862 kubelet[2331]: I0325 01:06:22.402831 2331 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:06:22.404096 kubelet[2331]: I0325 01:06:22.404067 2331 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:06:22.411919 kubelet[2331]: I0325 01:06:22.411876 2331 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:06:22.412893 kubelet[2331]: I0325 01:06:22.412864 2331 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:06:22.413031 kubelet[2331]: I0325 01:06:22.413020 2331 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:06:22.413066 kubelet[2331]: I0325 01:06:22.413039 2331 kubelet.go:2337] "Starting kubelet main sync loop" Mar 25 01:06:22.413089 kubelet[2331]: E0325 01:06:22.413074 2331 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:06:22.416628 kubelet[2331]: W0325 01:06:22.416518 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.8:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.416628 kubelet[2331]: E0325 01:06:22.416567 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.8:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:22.416628 kubelet[2331]: I0325 01:06:22.416625 2331 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:06:22.416628 kubelet[2331]: I0325 01:06:22.416633 2331 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:06:22.416764 kubelet[2331]: I0325 01:06:22.416649 2331 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:06:22.483851 kubelet[2331]: I0325 01:06:22.483821 2331 policy_none.go:49] "None policy: Start" Mar 25 01:06:22.484564 kubelet[2331]: I0325 01:06:22.484531 2331 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:06:22.484564 kubelet[2331]: I0325 01:06:22.484559 2331 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:06:22.492311 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
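The kubepods slice created here shows the systemd cgroup driver in action: the node config above reports "CgroupDriver":"systemd", matching the "SystemdCgroup":true set for the runc runtime in containerd's CRI config at 01:05:51, which is the alignment Kubernetes expects between kubelet and runtime. The resulting hierarchy can be inspected with, for example:

    systemd-cgls --no-pager kubepods.slice   # kubelet-managed pod cgroups under systemd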
Mar 25 01:06:22.500647 kubelet[2331]: I0325 01:06:22.500620 2331 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 25 01:06:22.500968 kubelet[2331]: E0325 01:06:22.500925 2331 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.8:6443/api/v1/nodes\": dial tcp 10.0.0.8:6443: connect: connection refused" node="localhost" Mar 25 01:06:22.502449 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 25 01:06:22.505800 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 01:06:22.513337 kubelet[2331]: E0325 01:06:22.513302 2331 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 25 01:06:22.515842 kubelet[2331]: I0325 01:06:22.515816 2331 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:06:22.515842 kubelet[2331]: I0325 01:06:22.515994 2331 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:06:22.515842 kubelet[2331]: I0325 01:06:22.516101 2331 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:06:22.518144 kubelet[2331]: E0325 01:06:22.518031 2331 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 25 01:06:22.602917 kubelet[2331]: E0325 01:06:22.602810 2331 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.8:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.8:6443: connect: connection refused" interval="400ms" Mar 25 01:06:22.703124 kubelet[2331]: I0325 01:06:22.703071 2331 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 25 01:06:22.703522 kubelet[2331]: E0325 01:06:22.703484 2331 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.8:6443/api/v1/nodes\": dial tcp 10.0.0.8:6443: connect: connection refused" node="localhost" Mar 25 01:06:22.713617 kubelet[2331]: I0325 01:06:22.713567 2331 topology_manager.go:215] "Topology Admit Handler" podUID="ea1c6d3298545caa5fde17122e60796f" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 25 01:06:22.714599 kubelet[2331]: I0325 01:06:22.714568 2331 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 25 01:06:22.716141 kubelet[2331]: I0325 01:06:22.715607 2331 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 25 01:06:22.721357 systemd[1]: Created slice kubepods-burstable-podea1c6d3298545caa5fde17122e60796f.slice - libcontainer container kubepods-burstable-podea1c6d3298545caa5fde17122e60796f.slice. Mar 25 01:06:22.747924 systemd[1]: Created slice kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice - libcontainer container kubepods-burstable-pod23a18e2dc14f395c5f1bea711a5a9344.slice. Mar 25 01:06:22.763393 systemd[1]: Created slice kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice - libcontainer container kubepods-burstable-podd79ab404294384d4bcc36fb5b5509bbb.slice. 
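The three Topology Admit Handler entries and the kubepods-burstable slices correspond to the control-plane static pods the kubelet found under its static pod path ("Adding static pod path" path="/etc/kubernetes/manifests" above). That also appears to explain the repeated "connect: connection refused" errors against https://10.0.0.8:6443: the kube-apiserver the kubelet is trying to reach is itself one of these static pods, so the errors are expected until its sandbox and container come up below. A sketch of how to watch that from the node:

    ls /etc/kubernetes/manifests/    # kube-apiserver.yaml, kube-controller-manager.yaml, kube-scheduler.yaml (and etcd.yaml on a typical kubeadm control plane)
    crictl pods                      # sandboxes created by the RunPodSandbox calls below
    crictl ps -a                     # control-plane containers as they start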
Mar 25 01:06:22.803277 kubelet[2331]: I0325 01:06:22.803042 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ea1c6d3298545caa5fde17122e60796f-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"ea1c6d3298545caa5fde17122e60796f\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:22.803277 kubelet[2331]: I0325 01:06:22.803290 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ea1c6d3298545caa5fde17122e60796f-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"ea1c6d3298545caa5fde17122e60796f\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:22.803423 kubelet[2331]: I0325 01:06:22.803310 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:22.803423 kubelet[2331]: I0325 01:06:22.803326 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 25 01:06:22.803423 kubelet[2331]: I0325 01:06:22.803341 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ea1c6d3298545caa5fde17122e60796f-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"ea1c6d3298545caa5fde17122e60796f\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:22.803423 kubelet[2331]: I0325 01:06:22.803357 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:22.803423 kubelet[2331]: I0325 01:06:22.803371 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:22.803528 kubelet[2331]: I0325 01:06:22.803387 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:22.803528 kubelet[2331]: I0325 01:06:22.803404 2331 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " 
pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:23.003503 kubelet[2331]: E0325 01:06:23.003381 2331 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.8:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.8:6443: connect: connection refused" interval="800ms" Mar 25 01:06:23.046457 containerd[1456]: time="2025-03-25T01:06:23.046361448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:ea1c6d3298545caa5fde17122e60796f,Namespace:kube-system,Attempt:0,}" Mar 25 01:06:23.062139 containerd[1456]: time="2025-03-25T01:06:23.062005766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,}" Mar 25 01:06:23.065844 containerd[1456]: time="2025-03-25T01:06:23.065652528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,}" Mar 25 01:06:23.105256 kubelet[2331]: I0325 01:06:23.105224 2331 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 25 01:06:23.105597 kubelet[2331]: E0325 01:06:23.105557 2331 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.8:6443/api/v1/nodes\": dial tcp 10.0.0.8:6443: connect: connection refused" node="localhost" Mar 25 01:06:23.244657 kubelet[2331]: W0325 01:06:23.244598 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.8:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:23.244657 kubelet[2331]: E0325 01:06:23.244641 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.8:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:23.306434 kubelet[2331]: W0325 01:06:23.306313 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.8:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:23.306434 kubelet[2331]: E0325 01:06:23.306356 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.8:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:23.408952 kubelet[2331]: W0325 01:06:23.408901 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.8:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:23.409305 kubelet[2331]: E0325 01:06:23.408968 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.8:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:23.598688 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount184374623.mount: Deactivated successfully. 
Mar 25 01:06:23.604172 containerd[1456]: time="2025-03-25T01:06:23.603755166Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:06:23.605003 containerd[1456]: time="2025-03-25T01:06:23.604945713Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Mar 25 01:06:23.608060 containerd[1456]: time="2025-03-25T01:06:23.608023521Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:06:23.609265 containerd[1456]: time="2025-03-25T01:06:23.609231109Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:06:23.610676 containerd[1456]: time="2025-03-25T01:06:23.610620174Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 25 01:06:23.611127 containerd[1456]: time="2025-03-25T01:06:23.611086410Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:06:23.612364 containerd[1456]: time="2025-03-25T01:06:23.612326237Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:06:23.612622 containerd[1456]: time="2025-03-25T01:06:23.612574754Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Mar 25 01:06:23.614450 containerd[1456]: time="2025-03-25T01:06:23.614243657Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 550.357791ms" Mar 25 01:06:23.614729 containerd[1456]: time="2025-03-25T01:06:23.614700292Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 547.446861ms" Mar 25 01:06:23.617868 containerd[1456]: time="2025-03-25T01:06:23.617830819Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 569.168795ms" Mar 25 01:06:23.636661 containerd[1456]: time="2025-03-25T01:06:23.636611784Z" level=info msg="connecting to shim 21b4281c158b2d30f932e27db1c49337d6bf754e3e01e6089e68675e44c46c15" address="unix:///run/containerd/s/a28fd37e1de972a7d96b8b17ffdb380935fda905a31139e6bafa1e9c86499abe" namespace=k8s.io protocol=ttrpc version=3 Mar 
25 01:06:23.644665 containerd[1456]: time="2025-03-25T01:06:23.644191025Z" level=info msg="connecting to shim cd1484b9e3e2b85a52eb2e67cb87763dea554d38308c5149bb50fa5fe976bf04" address="unix:///run/containerd/s/a47c33a70f72b44a7362e41de07f94f7a727a4ef86e067b590b5583481e3a2dd" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:06:23.650193 containerd[1456]: time="2025-03-25T01:06:23.650146483Z" level=info msg="connecting to shim 415f533548b6518db03b6a591b8b1021271bf2a483a265a1fbc42c974c05109d" address="unix:///run/containerd/s/6bd8ef3c50813eaed96e30339941307b64bf636c4e994dbe96a8cac078252382" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:06:23.673321 systemd[1]: Started cri-containerd-21b4281c158b2d30f932e27db1c49337d6bf754e3e01e6089e68675e44c46c15.scope - libcontainer container 21b4281c158b2d30f932e27db1c49337d6bf754e3e01e6089e68675e44c46c15. Mar 25 01:06:23.674863 systemd[1]: Started cri-containerd-cd1484b9e3e2b85a52eb2e67cb87763dea554d38308c5149bb50fa5fe976bf04.scope - libcontainer container cd1484b9e3e2b85a52eb2e67cb87763dea554d38308c5149bb50fa5fe976bf04. Mar 25 01:06:23.681394 systemd[1]: Started cri-containerd-415f533548b6518db03b6a591b8b1021271bf2a483a265a1fbc42c974c05109d.scope - libcontainer container 415f533548b6518db03b6a591b8b1021271bf2a483a265a1fbc42c974c05109d. Mar 25 01:06:23.726796 containerd[1456]: time="2025-03-25T01:06:23.726750885Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"21b4281c158b2d30f932e27db1c49337d6bf754e3e01e6089e68675e44c46c15\"" Mar 25 01:06:23.728001 containerd[1456]: time="2025-03-25T01:06:23.727194361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,} returns sandbox id \"cd1484b9e3e2b85a52eb2e67cb87763dea554d38308c5149bb50fa5fe976bf04\"" Mar 25 01:06:23.731191 containerd[1456]: time="2025-03-25T01:06:23.730816003Z" level=info msg="CreateContainer within sandbox \"21b4281c158b2d30f932e27db1c49337d6bf754e3e01e6089e68675e44c46c15\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:06:23.731464 containerd[1456]: time="2025-03-25T01:06:23.731431357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:ea1c6d3298545caa5fde17122e60796f,Namespace:kube-system,Attempt:0,} returns sandbox id \"415f533548b6518db03b6a591b8b1021271bf2a483a265a1fbc42c974c05109d\"" Mar 25 01:06:23.732005 containerd[1456]: time="2025-03-25T01:06:23.731974231Z" level=info msg="CreateContainer within sandbox \"cd1484b9e3e2b85a52eb2e67cb87763dea554d38308c5149bb50fa5fe976bf04\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:06:23.734802 containerd[1456]: time="2025-03-25T01:06:23.734770842Z" level=info msg="CreateContainer within sandbox \"415f533548b6518db03b6a591b8b1021271bf2a483a265a1fbc42c974c05109d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:06:23.740877 containerd[1456]: time="2025-03-25T01:06:23.740820299Z" level=info msg="Container 68a13d0cff16a21ec96666baf048ec300023b4364cd74cf01a7c5129b12207a8: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:06:23.751705 containerd[1456]: time="2025-03-25T01:06:23.751482668Z" level=info msg="Container 32720e50a09cfcd32801133825ffdd756ed34cb39686235b1fcd3a9a1d4f8a68: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:06:23.751705 containerd[1456]: 
time="2025-03-25T01:06:23.751547867Z" level=info msg="CreateContainer within sandbox \"21b4281c158b2d30f932e27db1c49337d6bf754e3e01e6089e68675e44c46c15\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"68a13d0cff16a21ec96666baf048ec300023b4364cd74cf01a7c5129b12207a8\"" Mar 25 01:06:23.752672 containerd[1456]: time="2025-03-25T01:06:23.752444858Z" level=info msg="StartContainer for \"68a13d0cff16a21ec96666baf048ec300023b4364cd74cf01a7c5129b12207a8\"" Mar 25 01:06:23.753362 containerd[1456]: time="2025-03-25T01:06:23.753332129Z" level=info msg="Container 15b0857433ca42e65ddbbd4032ed2102bdc0556717ef5b83fb2db9fe3fbef658: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:06:23.753608 containerd[1456]: time="2025-03-25T01:06:23.753581246Z" level=info msg="connecting to shim 68a13d0cff16a21ec96666baf048ec300023b4364cd74cf01a7c5129b12207a8" address="unix:///run/containerd/s/a28fd37e1de972a7d96b8b17ffdb380935fda905a31139e6bafa1e9c86499abe" protocol=ttrpc version=3 Mar 25 01:06:23.762069 containerd[1456]: time="2025-03-25T01:06:23.762014078Z" level=info msg="CreateContainer within sandbox \"cd1484b9e3e2b85a52eb2e67cb87763dea554d38308c5149bb50fa5fe976bf04\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"32720e50a09cfcd32801133825ffdd756ed34cb39686235b1fcd3a9a1d4f8a68\"" Mar 25 01:06:23.762230 containerd[1456]: time="2025-03-25T01:06:23.762041918Z" level=info msg="CreateContainer within sandbox \"415f533548b6518db03b6a591b8b1021271bf2a483a265a1fbc42c974c05109d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"15b0857433ca42e65ddbbd4032ed2102bdc0556717ef5b83fb2db9fe3fbef658\"" Mar 25 01:06:23.762586 containerd[1456]: time="2025-03-25T01:06:23.762560073Z" level=info msg="StartContainer for \"32720e50a09cfcd32801133825ffdd756ed34cb39686235b1fcd3a9a1d4f8a68\"" Mar 25 01:06:23.764371 containerd[1456]: time="2025-03-25T01:06:23.762840230Z" level=info msg="StartContainer for \"15b0857433ca42e65ddbbd4032ed2102bdc0556717ef5b83fb2db9fe3fbef658\"" Mar 25 01:06:23.764371 containerd[1456]: time="2025-03-25T01:06:23.763910899Z" level=info msg="connecting to shim 15b0857433ca42e65ddbbd4032ed2102bdc0556717ef5b83fb2db9fe3fbef658" address="unix:///run/containerd/s/6bd8ef3c50813eaed96e30339941307b64bf636c4e994dbe96a8cac078252382" protocol=ttrpc version=3 Mar 25 01:06:23.765663 containerd[1456]: time="2025-03-25T01:06:23.765627201Z" level=info msg="connecting to shim 32720e50a09cfcd32801133825ffdd756ed34cb39686235b1fcd3a9a1d4f8a68" address="unix:///run/containerd/s/a47c33a70f72b44a7362e41de07f94f7a727a4ef86e067b590b5583481e3a2dd" protocol=ttrpc version=3 Mar 25 01:06:23.777305 systemd[1]: Started cri-containerd-68a13d0cff16a21ec96666baf048ec300023b4364cd74cf01a7c5129b12207a8.scope - libcontainer container 68a13d0cff16a21ec96666baf048ec300023b4364cd74cf01a7c5129b12207a8. Mar 25 01:06:23.791290 systemd[1]: Started cri-containerd-15b0857433ca42e65ddbbd4032ed2102bdc0556717ef5b83fb2db9fe3fbef658.scope - libcontainer container 15b0857433ca42e65ddbbd4032ed2102bdc0556717ef5b83fb2db9fe3fbef658. Mar 25 01:06:23.792295 systemd[1]: Started cri-containerd-32720e50a09cfcd32801133825ffdd756ed34cb39686235b1fcd3a9a1d4f8a68.scope - libcontainer container 32720e50a09cfcd32801133825ffdd756ed34cb39686235b1fcd3a9a1d4f8a68. 
Mar 25 01:06:23.804615 kubelet[2331]: E0325 01:06:23.804535 2331 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.8:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.8:6443: connect: connection refused" interval="1.6s" Mar 25 01:06:23.834246 containerd[1456]: time="2025-03-25T01:06:23.834203087Z" level=info msg="StartContainer for \"68a13d0cff16a21ec96666baf048ec300023b4364cd74cf01a7c5129b12207a8\" returns successfully" Mar 25 01:06:23.875385 containerd[1456]: time="2025-03-25T01:06:23.875269059Z" level=info msg="StartContainer for \"15b0857433ca42e65ddbbd4032ed2102bdc0556717ef5b83fb2db9fe3fbef658\" returns successfully" Mar 25 01:06:23.881583 containerd[1456]: time="2025-03-25T01:06:23.881546514Z" level=info msg="StartContainer for \"32720e50a09cfcd32801133825ffdd756ed34cb39686235b1fcd3a9a1d4f8a68\" returns successfully" Mar 25 01:06:23.907851 kubelet[2331]: I0325 01:06:23.907820 2331 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 25 01:06:23.908585 kubelet[2331]: E0325 01:06:23.908552 2331 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.8:6443/api/v1/nodes\": dial tcp 10.0.0.8:6443: connect: connection refused" node="localhost" Mar 25 01:06:23.915185 kubelet[2331]: W0325 01:06:23.915085 2331 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.8:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:23.915185 kubelet[2331]: E0325 01:06:23.915166 2331 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.8:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.8:6443: connect: connection refused Mar 25 01:06:25.510566 kubelet[2331]: I0325 01:06:25.510522 2331 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 25 01:06:25.589904 kubelet[2331]: E0325 01:06:25.589844 2331 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 25 01:06:25.660007 kubelet[2331]: I0325 01:06:25.659832 2331 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 25 01:06:25.672259 kubelet[2331]: E0325 01:06:25.672207 2331 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:06:25.773102 kubelet[2331]: E0325 01:06:25.772971 2331 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:06:25.873505 kubelet[2331]: E0325 01:06:25.873457 2331 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:06:25.974437 kubelet[2331]: E0325 01:06:25.974394 2331 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:06:26.075222 kubelet[2331]: E0325 01:06:26.074875 2331 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:06:26.394246 kubelet[2331]: I0325 01:06:26.393935 2331 apiserver.go:52] "Watching apiserver" Mar 25 01:06:26.400435 kubelet[2331]: I0325 01:06:26.400392 2331 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:06:26.458212 
kubelet[2331]: E0325 01:06:26.458175 2331 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:27.521801 systemd[1]: Reload requested from client PID 2609 ('systemctl') (unit session-7.scope)... Mar 25 01:06:27.521818 systemd[1]: Reloading... Mar 25 01:06:27.584141 zram_generator::config[2656]: No configuration found. Mar 25 01:06:27.737293 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:06:27.820360 systemd[1]: Reloading finished in 298 ms. Mar 25 01:06:27.840132 kubelet[2331]: I0325 01:06:27.839920 2331 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:06:27.840064 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:06:27.856196 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:06:27.856739 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:27.856799 systemd[1]: kubelet.service: Consumed 2.220s CPU time, 114.4M memory peak. Mar 25 01:06:27.859066 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:06:27.977123 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:06:27.981906 (kubelet)[2695]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:06:28.022239 kubelet[2695]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:06:28.023129 kubelet[2695]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:06:28.023129 kubelet[2695]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:06:28.023129 kubelet[2695]: I0325 01:06:28.022619 2695 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:06:28.026394 kubelet[2695]: I0325 01:06:28.026369 2695 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 25 01:06:28.026581 kubelet[2695]: I0325 01:06:28.026490 2695 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:06:28.026777 kubelet[2695]: I0325 01:06:28.026758 2695 server.go:927] "Client rotation is on, will bootstrap in background" Mar 25 01:06:28.030523 kubelet[2695]: I0325 01:06:28.030489 2695 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:06:28.031791 kubelet[2695]: I0325 01:06:28.031759 2695 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:06:28.036658 kubelet[2695]: I0325 01:06:28.036632 2695 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:06:28.036832 kubelet[2695]: I0325 01:06:28.036792 2695 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:06:28.036984 kubelet[2695]: I0325 01:06:28.036821 2695 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 25 01:06:28.036984 kubelet[2695]: I0325 01:06:28.036984 2695 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:06:28.037088 kubelet[2695]: I0325 01:06:28.036992 2695 container_manager_linux.go:301] "Creating device plugin manager" Mar 25 01:06:28.037088 kubelet[2695]: I0325 01:06:28.037025 2695 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:06:28.037172 kubelet[2695]: I0325 01:06:28.037160 2695 kubelet.go:400] "Attempting to sync node with API server" Mar 25 01:06:28.037202 kubelet[2695]: I0325 01:06:28.037177 2695 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:06:28.037228 kubelet[2695]: I0325 01:06:28.037203 2695 kubelet.go:312] "Adding apiserver pod source" Mar 25 01:06:28.037228 kubelet[2695]: I0325 01:06:28.037216 2695 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:06:28.038137 kubelet[2695]: I0325 01:06:28.038088 2695 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:06:28.038412 kubelet[2695]: I0325 01:06:28.038398 2695 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:06:28.042151 kubelet[2695]: I0325 01:06:28.040819 2695 server.go:1264] "Started kubelet" Mar 25 01:06:28.042151 kubelet[2695]: I0325 01:06:28.041346 2695 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:06:28.042245 kubelet[2695]: I0325 01:06:28.042154 2695 server.go:455] "Adding debug handlers to kubelet server" Mar 25 01:06:28.047120 kubelet[2695]: I0325 01:06:28.044192 2695 ratelimit.go:55] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:06:28.048704 kubelet[2695]: I0325 01:06:28.048676 2695 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:06:28.050597 kubelet[2695]: I0325 01:06:28.050553 2695 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 25 01:06:28.052278 kubelet[2695]: I0325 01:06:28.052255 2695 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 25 01:06:28.052810 kubelet[2695]: I0325 01:06:28.052490 2695 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:06:28.052985 kubelet[2695]: I0325 01:06:28.052565 2695 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:06:28.055170 kubelet[2695]: I0325 01:06:28.055151 2695 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:06:28.057560 kubelet[2695]: I0325 01:06:28.055256 2695 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:06:28.063198 kubelet[2695]: I0325 01:06:28.063169 2695 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:06:28.064263 kubelet[2695]: E0325 01:06:28.064227 2695 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:06:28.069418 kubelet[2695]: I0325 01:06:28.069379 2695 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:06:28.070739 kubelet[2695]: I0325 01:06:28.070567 2695 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:06:28.070739 kubelet[2695]: I0325 01:06:28.070634 2695 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:06:28.070739 kubelet[2695]: I0325 01:06:28.070655 2695 kubelet.go:2337] "Starting kubelet main sync loop" Mar 25 01:06:28.071512 kubelet[2695]: E0325 01:06:28.070693 2695 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:06:28.093415 kubelet[2695]: I0325 01:06:28.093376 2695 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:06:28.093415 kubelet[2695]: I0325 01:06:28.093401 2695 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:06:28.093415 kubelet[2695]: I0325 01:06:28.093427 2695 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:06:28.093660 kubelet[2695]: I0325 01:06:28.093626 2695 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:06:28.093660 kubelet[2695]: I0325 01:06:28.093646 2695 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:06:28.093660 kubelet[2695]: I0325 01:06:28.093665 2695 policy_none.go:49] "None policy: Start" Mar 25 01:06:28.094267 kubelet[2695]: I0325 01:06:28.094249 2695 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:06:28.094306 kubelet[2695]: I0325 01:06:28.094273 2695 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:06:28.094408 kubelet[2695]: I0325 01:06:28.094394 2695 state_mem.go:75] "Updated machine memory state" Mar 25 01:06:28.101201 kubelet[2695]: I0325 01:06:28.101176 2695 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:06:28.101782 
kubelet[2695]: I0325 01:06:28.101475 2695 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:06:28.101782 kubelet[2695]: I0325 01:06:28.101589 2695 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:06:28.152259 kubelet[2695]: I0325 01:06:28.152232 2695 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 25 01:06:28.157670 kubelet[2695]: I0325 01:06:28.157643 2695 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Mar 25 01:06:28.157773 kubelet[2695]: I0325 01:06:28.157719 2695 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 25 01:06:28.172485 kubelet[2695]: I0325 01:06:28.172442 2695 topology_manager.go:215] "Topology Admit Handler" podUID="ea1c6d3298545caa5fde17122e60796f" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 25 01:06:28.172605 kubelet[2695]: I0325 01:06:28.172558 2695 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 25 01:06:28.172605 kubelet[2695]: I0325 01:06:28.172596 2695 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 25 01:06:28.254380 kubelet[2695]: I0325 01:06:28.254313 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ea1c6d3298545caa5fde17122e60796f-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"ea1c6d3298545caa5fde17122e60796f\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:28.254380 kubelet[2695]: I0325 01:06:28.254378 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:28.254548 kubelet[2695]: I0325 01:06:28.254411 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 25 01:06:28.254548 kubelet[2695]: I0325 01:06:28.254441 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ea1c6d3298545caa5fde17122e60796f-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"ea1c6d3298545caa5fde17122e60796f\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:28.254548 kubelet[2695]: I0325 01:06:28.254467 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ea1c6d3298545caa5fde17122e60796f-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"ea1c6d3298545caa5fde17122e60796f\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:28.254548 kubelet[2695]: I0325 01:06:28.254489 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:28.254548 kubelet[2695]: I0325 01:06:28.254504 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:28.254655 kubelet[2695]: I0325 01:06:28.254524 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:28.254655 kubelet[2695]: I0325 01:06:28.254543 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:06:29.038238 kubelet[2695]: I0325 01:06:29.038185 2695 apiserver.go:52] "Watching apiserver" Mar 25 01:06:29.053063 kubelet[2695]: I0325 01:06:29.053020 2695 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 25 01:06:29.129545 kubelet[2695]: I0325 01:06:29.129472 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.129428051 podStartE2EDuration="1.129428051s" podCreationTimestamp="2025-03-25 01:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:06:29.129206693 +0000 UTC m=+1.144264377" watchObservedRunningTime="2025-03-25 01:06:29.129428051 +0000 UTC m=+1.144485735" Mar 25 01:06:29.129750 kubelet[2695]: E0325 01:06:29.129718 2695 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 25 01:06:29.160811 kubelet[2695]: I0325 01:06:29.160754 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.1607386179999999 podStartE2EDuration="1.160738618s" podCreationTimestamp="2025-03-25 01:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:06:29.149596021 +0000 UTC m=+1.164653705" watchObservedRunningTime="2025-03-25 01:06:29.160738618 +0000 UTC m=+1.175796302" Mar 25 01:06:29.172460 kubelet[2695]: I0325 01:06:29.172393 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.1723778519999999 podStartE2EDuration="1.172377852s" podCreationTimestamp="2025-03-25 01:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:06:29.161409893 +0000 UTC m=+1.176467577" watchObservedRunningTime="2025-03-25 
01:06:29.172377852 +0000 UTC m=+1.187435496" Mar 25 01:06:32.776310 sudo[1655]: pam_unix(sudo:session): session closed for user root Mar 25 01:06:32.777771 sshd[1654]: Connection closed by 10.0.0.1 port 54566 Mar 25 01:06:32.778310 sshd-session[1651]: pam_unix(sshd:session): session closed for user core Mar 25 01:06:32.782680 systemd[1]: sshd@6-10.0.0.8:22-10.0.0.1:54566.service: Deactivated successfully. Mar 25 01:06:32.784475 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:06:32.784666 systemd[1]: session-7.scope: Consumed 8.245s CPU time, 240.4M memory peak. Mar 25 01:06:32.785452 systemd-logind[1439]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:06:32.786423 systemd-logind[1439]: Removed session 7. Mar 25 01:06:36.082970 update_engine[1444]: I20250325 01:06:36.082883 1444 update_attempter.cc:509] Updating boot flags... Mar 25 01:06:36.127137 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2791) Mar 25 01:06:36.170259 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2795) Mar 25 01:06:44.231611 kubelet[2695]: I0325 01:06:44.231566 2695 topology_manager.go:215] "Topology Admit Handler" podUID="5b610744-7dec-4af0-be3f-78ea299e5651" podNamespace="kube-system" podName="kube-proxy-dwd6m" Mar 25 01:06:44.244524 systemd[1]: Created slice kubepods-besteffort-pod5b610744_7dec_4af0_be3f_78ea299e5651.slice - libcontainer container kubepods-besteffort-pod5b610744_7dec_4af0_be3f_78ea299e5651.slice. Mar 25 01:06:44.256438 kubelet[2695]: I0325 01:06:44.256390 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5b610744-7dec-4af0-be3f-78ea299e5651-xtables-lock\") pod \"kube-proxy-dwd6m\" (UID: \"5b610744-7dec-4af0-be3f-78ea299e5651\") " pod="kube-system/kube-proxy-dwd6m" Mar 25 01:06:44.256438 kubelet[2695]: I0325 01:06:44.256427 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5b610744-7dec-4af0-be3f-78ea299e5651-lib-modules\") pod \"kube-proxy-dwd6m\" (UID: \"5b610744-7dec-4af0-be3f-78ea299e5651\") " pod="kube-system/kube-proxy-dwd6m" Mar 25 01:06:44.256438 kubelet[2695]: I0325 01:06:44.256445 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/5b610744-7dec-4af0-be3f-78ea299e5651-kube-proxy\") pod \"kube-proxy-dwd6m\" (UID: \"5b610744-7dec-4af0-be3f-78ea299e5651\") " pod="kube-system/kube-proxy-dwd6m" Mar 25 01:06:44.256755 kubelet[2695]: I0325 01:06:44.256465 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxv2\" (UniqueName: \"kubernetes.io/projected/5b610744-7dec-4af0-be3f-78ea299e5651-kube-api-access-hxxv2\") pod \"kube-proxy-dwd6m\" (UID: \"5b610744-7dec-4af0-be3f-78ea299e5651\") " pod="kube-system/kube-proxy-dwd6m" Mar 25 01:06:44.258427 kubelet[2695]: I0325 01:06:44.257982 2695 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:06:44.261864 containerd[1456]: time="2025-03-25T01:06:44.261678865Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
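Once the node is registered, the kubelet receives the node's pod CIDR and pushes it to the runtime ("Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" above). A small worked example of what that range yields, using only the standard library; the CIDR value is the one from the log, everything else is illustrative.

import ipaddress

cidr = ipaddress.ip_network("192.168.0.0/24")     # value reported by the kubelet above
print(cidr.num_addresses)                         # 256 addresses in the block
print(sum(1 for _ in cidr.hosts()))               # 254 host addresses (network/broadcast excluded)
print(list(cidr.hosts())[:3])                     # first assignable IPs: 192.168.0.1, .2, .3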
Mar 25 01:06:44.262794 kubelet[2695]: I0325 01:06:44.262317 2695 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:06:44.370250 kubelet[2695]: I0325 01:06:44.370203 2695 topology_manager.go:215] "Topology Admit Handler" podUID="c1746632-e25e-4008-af48-18f1b58bf9f5" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-6t74z" Mar 25 01:06:44.382283 systemd[1]: Created slice kubepods-besteffort-podc1746632_e25e_4008_af48_18f1b58bf9f5.slice - libcontainer container kubepods-besteffort-podc1746632_e25e_4008_af48_18f1b58bf9f5.slice. Mar 25 01:06:44.456944 kubelet[2695]: I0325 01:06:44.456883 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77hrh\" (UniqueName: \"kubernetes.io/projected/c1746632-e25e-4008-af48-18f1b58bf9f5-kube-api-access-77hrh\") pod \"tigera-operator-6479d6dc54-6t74z\" (UID: \"c1746632-e25e-4008-af48-18f1b58bf9f5\") " pod="tigera-operator/tigera-operator-6479d6dc54-6t74z" Mar 25 01:06:44.457174 kubelet[2695]: I0325 01:06:44.457142 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c1746632-e25e-4008-af48-18f1b58bf9f5-var-lib-calico\") pod \"tigera-operator-6479d6dc54-6t74z\" (UID: \"c1746632-e25e-4008-af48-18f1b58bf9f5\") " pod="tigera-operator/tigera-operator-6479d6dc54-6t74z" Mar 25 01:06:44.561613 containerd[1456]: time="2025-03-25T01:06:44.561569163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dwd6m,Uid:5b610744-7dec-4af0-be3f-78ea299e5651,Namespace:kube-system,Attempt:0,}" Mar 25 01:06:44.590147 containerd[1456]: time="2025-03-25T01:06:44.589804143Z" level=info msg="connecting to shim 5bccbbf80a78bc3bf9cc3ad6767a460a29af9af744d535edcac64fcdde3c33b9" address="unix:///run/containerd/s/412da4151134347e24ae084f9bffd1a91884bad9a975037df37a1213e1356bfe" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:06:44.610291 systemd[1]: Started cri-containerd-5bccbbf80a78bc3bf9cc3ad6767a460a29af9af744d535edcac64fcdde3c33b9.scope - libcontainer container 5bccbbf80a78bc3bf9cc3ad6767a460a29af9af744d535edcac64fcdde3c33b9. 
Mar 25 01:06:44.632273 containerd[1456]: time="2025-03-25T01:06:44.632235232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dwd6m,Uid:5b610744-7dec-4af0-be3f-78ea299e5651,Namespace:kube-system,Attempt:0,} returns sandbox id \"5bccbbf80a78bc3bf9cc3ad6767a460a29af9af744d535edcac64fcdde3c33b9\"" Mar 25 01:06:44.636145 containerd[1456]: time="2025-03-25T01:06:44.635135102Z" level=info msg="CreateContainer within sandbox \"5bccbbf80a78bc3bf9cc3ad6767a460a29af9af744d535edcac64fcdde3c33b9\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:06:44.642263 containerd[1456]: time="2025-03-25T01:06:44.642225917Z" level=info msg="Container 9caed3215969c57d88d97feff80b60878337edffe8d0f0b8aa9166d7822bc86f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:06:44.658354 containerd[1456]: time="2025-03-25T01:06:44.658298340Z" level=info msg="CreateContainer within sandbox \"5bccbbf80a78bc3bf9cc3ad6767a460a29af9af744d535edcac64fcdde3c33b9\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9caed3215969c57d88d97feff80b60878337edffe8d0f0b8aa9166d7822bc86f\"" Mar 25 01:06:44.661556 containerd[1456]: time="2025-03-25T01:06:44.661379569Z" level=info msg="StartContainer for \"9caed3215969c57d88d97feff80b60878337edffe8d0f0b8aa9166d7822bc86f\"" Mar 25 01:06:44.663234 containerd[1456]: time="2025-03-25T01:06:44.663196243Z" level=info msg="connecting to shim 9caed3215969c57d88d97feff80b60878337edffe8d0f0b8aa9166d7822bc86f" address="unix:///run/containerd/s/412da4151134347e24ae084f9bffd1a91884bad9a975037df37a1213e1356bfe" protocol=ttrpc version=3 Mar 25 01:06:44.685435 containerd[1456]: time="2025-03-25T01:06:44.685191245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-6t74z,Uid:c1746632-e25e-4008-af48-18f1b58bf9f5,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:06:44.692304 systemd[1]: Started cri-containerd-9caed3215969c57d88d97feff80b60878337edffe8d0f0b8aa9166d7822bc86f.scope - libcontainer container 9caed3215969c57d88d97feff80b60878337edffe8d0f0b8aa9166d7822bc86f. Mar 25 01:06:44.728162 containerd[1456]: time="2025-03-25T01:06:44.728120493Z" level=info msg="StartContainer for \"9caed3215969c57d88d97feff80b60878337edffe8d0f0b8aa9166d7822bc86f\" returns successfully" Mar 25 01:06:44.757385 containerd[1456]: time="2025-03-25T01:06:44.756620352Z" level=info msg="connecting to shim 646d73c5318545e1ac2b9d80925a6d63a7cdad018a222bbdd33c5f9112a63005" address="unix:///run/containerd/s/2b411b4270e543556ab4d60f611358e48a284bf76053ae04d6bee54ffdc0ddec" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:06:44.790298 systemd[1]: Started cri-containerd-646d73c5318545e1ac2b9d80925a6d63a7cdad018a222bbdd33c5f9112a63005.scope - libcontainer container 646d73c5318545e1ac2b9d80925a6d63a7cdad018a222bbdd33c5f9112a63005. 
Mar 25 01:06:44.825363 containerd[1456]: time="2025-03-25T01:06:44.825115389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-6t74z,Uid:c1746632-e25e-4008-af48-18f1b58bf9f5,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"646d73c5318545e1ac2b9d80925a6d63a7cdad018a222bbdd33c5f9112a63005\"" Mar 25 01:06:44.827774 containerd[1456]: time="2025-03-25T01:06:44.827651700Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:06:45.136922 kubelet[2695]: I0325 01:06:45.136762 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-dwd6m" podStartSLOduration=1.136744065 podStartE2EDuration="1.136744065s" podCreationTimestamp="2025-03-25 01:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:06:45.136567586 +0000 UTC m=+17.151625270" watchObservedRunningTime="2025-03-25 01:06:45.136744065 +0000 UTC m=+17.151801749" Mar 25 01:06:46.126368 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount139120447.mount: Deactivated successfully. Mar 25 01:06:46.423713 containerd[1456]: time="2025-03-25T01:06:46.423604077Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:46.424636 containerd[1456]: time="2025-03-25T01:06:46.424036075Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 25 01:06:46.424939 containerd[1456]: time="2025-03-25T01:06:46.424910712Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:46.427192 containerd[1456]: time="2025-03-25T01:06:46.427159585Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:46.427868 containerd[1456]: time="2025-03-25T01:06:46.427836743Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 1.600137003s" Mar 25 01:06:46.427910 containerd[1456]: time="2025-03-25T01:06:46.427868943Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 25 01:06:46.440743 containerd[1456]: time="2025-03-25T01:06:46.440493782Z" level=info msg="CreateContainer within sandbox \"646d73c5318545e1ac2b9d80925a6d63a7cdad018a222bbdd33c5f9112a63005\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:06:46.461237 containerd[1456]: time="2025-03-25T01:06:46.460374157Z" level=info msg="Container 22c1cd556ed8884e4ce238613f568f94a04eb7500e9bb45bc55cd1f34b06bdaf: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:06:46.478938 containerd[1456]: time="2025-03-25T01:06:46.478689937Z" level=info msg="CreateContainer within sandbox \"646d73c5318545e1ac2b9d80925a6d63a7cdad018a222bbdd33c5f9112a63005\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id 
\"22c1cd556ed8884e4ce238613f568f94a04eb7500e9bb45bc55cd1f34b06bdaf\"" Mar 25 01:06:46.479497 containerd[1456]: time="2025-03-25T01:06:46.479459135Z" level=info msg="StartContainer for \"22c1cd556ed8884e4ce238613f568f94a04eb7500e9bb45bc55cd1f34b06bdaf\"" Mar 25 01:06:46.480352 containerd[1456]: time="2025-03-25T01:06:46.480327332Z" level=info msg="connecting to shim 22c1cd556ed8884e4ce238613f568f94a04eb7500e9bb45bc55cd1f34b06bdaf" address="unix:///run/containerd/s/2b411b4270e543556ab4d60f611358e48a284bf76053ae04d6bee54ffdc0ddec" protocol=ttrpc version=3 Mar 25 01:06:46.528274 systemd[1]: Started cri-containerd-22c1cd556ed8884e4ce238613f568f94a04eb7500e9bb45bc55cd1f34b06bdaf.scope - libcontainer container 22c1cd556ed8884e4ce238613f568f94a04eb7500e9bb45bc55cd1f34b06bdaf. Mar 25 01:06:46.596030 containerd[1456]: time="2025-03-25T01:06:46.595920236Z" level=info msg="StartContainer for \"22c1cd556ed8884e4ce238613f568f94a04eb7500e9bb45bc55cd1f34b06bdaf\" returns successfully" Mar 25 01:06:47.149375 kubelet[2695]: I0325 01:06:47.149150 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-6t74z" podStartSLOduration=1.540591519 podStartE2EDuration="3.149135615s" podCreationTimestamp="2025-03-25 01:06:44 +0000 UTC" firstStartedPulling="2025-03-25 01:06:44.826954942 +0000 UTC m=+16.842012586" lastFinishedPulling="2025-03-25 01:06:46.435498998 +0000 UTC m=+18.450556682" observedRunningTime="2025-03-25 01:06:47.148980415 +0000 UTC m=+19.164038099" watchObservedRunningTime="2025-03-25 01:06:47.149135615 +0000 UTC m=+19.164193339" Mar 25 01:06:50.330689 kubelet[2695]: I0325 01:06:50.330624 2695 topology_manager.go:215] "Topology Admit Handler" podUID="f709e83d-70e0-4ab2-acda-55b66daea5b0" podNamespace="calico-system" podName="calico-typha-77f4c89b4d-96cq9" Mar 25 01:06:50.353054 systemd[1]: Created slice kubepods-besteffort-podf709e83d_70e0_4ab2_acda_55b66daea5b0.slice - libcontainer container kubepods-besteffort-podf709e83d_70e0_4ab2_acda_55b66daea5b0.slice. 
Mar 25 01:06:50.401603 kubelet[2695]: I0325 01:06:50.401547 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f709e83d-70e0-4ab2-acda-55b66daea5b0-tigera-ca-bundle\") pod \"calico-typha-77f4c89b4d-96cq9\" (UID: \"f709e83d-70e0-4ab2-acda-55b66daea5b0\") " pod="calico-system/calico-typha-77f4c89b4d-96cq9" Mar 25 01:06:50.401603 kubelet[2695]: I0325 01:06:50.401598 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8sgp\" (UniqueName: \"kubernetes.io/projected/f709e83d-70e0-4ab2-acda-55b66daea5b0-kube-api-access-n8sgp\") pod \"calico-typha-77f4c89b4d-96cq9\" (UID: \"f709e83d-70e0-4ab2-acda-55b66daea5b0\") " pod="calico-system/calico-typha-77f4c89b4d-96cq9" Mar 25 01:06:50.401779 kubelet[2695]: I0325 01:06:50.401666 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f709e83d-70e0-4ab2-acda-55b66daea5b0-typha-certs\") pod \"calico-typha-77f4c89b4d-96cq9\" (UID: \"f709e83d-70e0-4ab2-acda-55b66daea5b0\") " pod="calico-system/calico-typha-77f4c89b4d-96cq9" Mar 25 01:06:50.517122 kubelet[2695]: I0325 01:06:50.515892 2695 topology_manager.go:215] "Topology Admit Handler" podUID="510b38ef-e7f7-4d2c-b60c-0e0e347c70fe" podNamespace="calico-system" podName="calico-node-klxhh" Mar 25 01:06:50.528633 systemd[1]: Created slice kubepods-besteffort-pod510b38ef_e7f7_4d2c_b60c_0e0e347c70fe.slice - libcontainer container kubepods-besteffort-pod510b38ef_e7f7_4d2c_b60c_0e0e347c70fe.slice. Mar 25 01:06:50.602665 kubelet[2695]: I0325 01:06:50.602483 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-lib-modules\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602665 kubelet[2695]: I0325 01:06:50.602533 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-policysync\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602665 kubelet[2695]: I0325 01:06:50.602556 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-cni-log-dir\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602665 kubelet[2695]: I0325 01:06:50.602576 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-node-certs\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602665 kubelet[2695]: I0325 01:06:50.602593 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-var-lib-calico\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " 
pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602887 kubelet[2695]: I0325 01:06:50.602610 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-cni-net-dir\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602887 kubelet[2695]: I0325 01:06:50.602624 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44sxw\" (UniqueName: \"kubernetes.io/projected/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-kube-api-access-44sxw\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602887 kubelet[2695]: I0325 01:06:50.602644 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-xtables-lock\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602887 kubelet[2695]: I0325 01:06:50.602669 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-var-run-calico\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602887 kubelet[2695]: I0325 01:06:50.602683 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-cni-bin-dir\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602987 kubelet[2695]: I0325 01:06:50.602696 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-flexvol-driver-host\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.602987 kubelet[2695]: I0325 01:06:50.602713 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/510b38ef-e7f7-4d2c-b60c-0e0e347c70fe-tigera-ca-bundle\") pod \"calico-node-klxhh\" (UID: \"510b38ef-e7f7-4d2c-b60c-0e0e347c70fe\") " pod="calico-system/calico-node-klxhh" Mar 25 01:06:50.658323 containerd[1456]: time="2025-03-25T01:06:50.658278405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77f4c89b4d-96cq9,Uid:f709e83d-70e0-4ab2-acda-55b66daea5b0,Namespace:calico-system,Attempt:0,}" Mar 25 01:06:50.676911 containerd[1456]: time="2025-03-25T01:06:50.676778073Z" level=info msg="connecting to shim eaf3ed447b2bb729a0420e7d0fb620894978cb714bb798cbc0a83ac003ff6e33" address="unix:///run/containerd/s/45a35b1c9e33eaebcd5b9a95f00a33f5ab5b8555a2ead2959507e141648a3621" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:06:50.713291 kubelet[2695]: E0325 01:06:50.713260 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.713291 kubelet[2695]: W0325 
01:06:50.713285 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.713514 kubelet[2695]: E0325 01:06:50.713308 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.719515 systemd[1]: Started cri-containerd-eaf3ed447b2bb729a0420e7d0fb620894978cb714bb798cbc0a83ac003ff6e33.scope - libcontainer container eaf3ed447b2bb729a0420e7d0fb620894978cb714bb798cbc0a83ac003ff6e33. Mar 25 01:06:50.725014 kubelet[2695]: I0325 01:06:50.724652 2695 topology_manager.go:215] "Topology Admit Handler" podUID="0fe46c57-cacb-46aa-b01f-e805370dad30" podNamespace="calico-system" podName="csi-node-driver-ggjvq" Mar 25 01:06:50.725014 kubelet[2695]: E0325 01:06:50.724993 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.725184 kubelet[2695]: W0325 01:06:50.725027 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.725184 kubelet[2695]: E0325 01:06:50.725153 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.726621 kubelet[2695]: E0325 01:06:50.726556 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ggjvq" podUID="0fe46c57-cacb-46aa-b01f-e805370dad30" Mar 25 01:06:50.785607 containerd[1456]: time="2025-03-25T01:06:50.785534771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-77f4c89b4d-96cq9,Uid:f709e83d-70e0-4ab2-acda-55b66daea5b0,Namespace:calico-system,Attempt:0,} returns sandbox id \"eaf3ed447b2bb729a0420e7d0fb620894978cb714bb798cbc0a83ac003ff6e33\"" Mar 25 01:06:50.787801 containerd[1456]: time="2025-03-25T01:06:50.787759245Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 01:06:50.791946 kubelet[2695]: E0325 01:06:50.791634 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.792075 kubelet[2695]: W0325 01:06:50.791948 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.792075 kubelet[2695]: E0325 01:06:50.791976 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.792717 kubelet[2695]: E0325 01:06:50.792680 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.792717 kubelet[2695]: W0325 01:06:50.792700 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.792717 kubelet[2695]: E0325 01:06:50.792715 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.793322 kubelet[2695]: E0325 01:06:50.793294 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.793322 kubelet[2695]: W0325 01:06:50.793313 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.793408 kubelet[2695]: E0325 01:06:50.793327 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.793921 kubelet[2695]: E0325 01:06:50.793890 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.793921 kubelet[2695]: W0325 01:06:50.793910 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.793921 kubelet[2695]: E0325 01:06:50.793923 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.794180 kubelet[2695]: E0325 01:06:50.794165 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.794180 kubelet[2695]: W0325 01:06:50.794179 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.794246 kubelet[2695]: E0325 01:06:50.794190 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.794387 kubelet[2695]: E0325 01:06:50.794375 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.794387 kubelet[2695]: W0325 01:06:50.794386 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.794437 kubelet[2695]: E0325 01:06:50.794395 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.794567 kubelet[2695]: E0325 01:06:50.794554 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.794567 kubelet[2695]: W0325 01:06:50.794566 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.794627 kubelet[2695]: E0325 01:06:50.794574 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.795857 kubelet[2695]: E0325 01:06:50.795824 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.795857 kubelet[2695]: W0325 01:06:50.795843 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.795932 kubelet[2695]: E0325 01:06:50.795858 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.796864 kubelet[2695]: E0325 01:06:50.796836 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.796864 kubelet[2695]: W0325 01:06:50.796855 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.796941 kubelet[2695]: E0325 01:06:50.796869 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.799665 kubelet[2695]: E0325 01:06:50.799624 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.799665 kubelet[2695]: W0325 01:06:50.799652 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.799665 kubelet[2695]: E0325 01:06:50.799672 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.800018 kubelet[2695]: E0325 01:06:50.799992 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.800018 kubelet[2695]: W0325 01:06:50.800006 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.800018 kubelet[2695]: E0325 01:06:50.800016 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.800227 kubelet[2695]: E0325 01:06:50.800206 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.800227 kubelet[2695]: W0325 01:06:50.800219 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.800294 kubelet[2695]: E0325 01:06:50.800232 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.800551 kubelet[2695]: E0325 01:06:50.800432 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.800551 kubelet[2695]: W0325 01:06:50.800445 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.800551 kubelet[2695]: E0325 01:06:50.800457 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.800812 kubelet[2695]: E0325 01:06:50.800674 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.800812 kubelet[2695]: W0325 01:06:50.800689 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.800812 kubelet[2695]: E0325 01:06:50.800699 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.800908 kubelet[2695]: E0325 01:06:50.800832 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.800908 kubelet[2695]: W0325 01:06:50.800839 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.800908 kubelet[2695]: E0325 01:06:50.800846 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.800976 kubelet[2695]: E0325 01:06:50.800969 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.800998 kubelet[2695]: W0325 01:06:50.800975 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.800998 kubelet[2695]: E0325 01:06:50.800984 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.801253 kubelet[2695]: E0325 01:06:50.801238 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.801253 kubelet[2695]: W0325 01:06:50.801249 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.801253 kubelet[2695]: E0325 01:06:50.801257 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.801395 kubelet[2695]: E0325 01:06:50.801383 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.801395 kubelet[2695]: W0325 01:06:50.801392 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.801446 kubelet[2695]: E0325 01:06:50.801402 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.802333 kubelet[2695]: E0325 01:06:50.802292 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.802333 kubelet[2695]: W0325 01:06:50.802313 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.802333 kubelet[2695]: E0325 01:06:50.802326 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.803008 kubelet[2695]: E0325 01:06:50.802808 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.803008 kubelet[2695]: W0325 01:06:50.803004 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.803185 kubelet[2695]: E0325 01:06:50.803022 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.805605 kubelet[2695]: E0325 01:06:50.805576 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.805605 kubelet[2695]: W0325 01:06:50.805595 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.805605 kubelet[2695]: E0325 01:06:50.805608 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.805711 kubelet[2695]: I0325 01:06:50.805637 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0fe46c57-cacb-46aa-b01f-e805370dad30-kubelet-dir\") pod \"csi-node-driver-ggjvq\" (UID: \"0fe46c57-cacb-46aa-b01f-e805370dad30\") " pod="calico-system/csi-node-driver-ggjvq" Mar 25 01:06:50.806321 kubelet[2695]: E0325 01:06:50.805841 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.806321 kubelet[2695]: W0325 01:06:50.805860 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.806321 kubelet[2695]: E0325 01:06:50.805870 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.806321 kubelet[2695]: I0325 01:06:50.805885 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0fe46c57-cacb-46aa-b01f-e805370dad30-socket-dir\") pod \"csi-node-driver-ggjvq\" (UID: \"0fe46c57-cacb-46aa-b01f-e805370dad30\") " pod="calico-system/csi-node-driver-ggjvq" Mar 25 01:06:50.806321 kubelet[2695]: E0325 01:06:50.806076 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.806321 kubelet[2695]: W0325 01:06:50.806087 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.806321 kubelet[2695]: E0325 01:06:50.806102 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.806321 kubelet[2695]: I0325 01:06:50.806146 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/0fe46c57-cacb-46aa-b01f-e805370dad30-varrun\") pod \"csi-node-driver-ggjvq\" (UID: \"0fe46c57-cacb-46aa-b01f-e805370dad30\") " pod="calico-system/csi-node-driver-ggjvq" Mar 25 01:06:50.806585 kubelet[2695]: E0325 01:06:50.806350 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.806585 kubelet[2695]: W0325 01:06:50.806364 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.806585 kubelet[2695]: E0325 01:06:50.806379 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.806585 kubelet[2695]: I0325 01:06:50.806394 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0fe46c57-cacb-46aa-b01f-e805370dad30-registration-dir\") pod \"csi-node-driver-ggjvq\" (UID: \"0fe46c57-cacb-46aa-b01f-e805370dad30\") " pod="calico-system/csi-node-driver-ggjvq" Mar 25 01:06:50.806766 kubelet[2695]: E0325 01:06:50.806631 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.806766 kubelet[2695]: W0325 01:06:50.806648 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.806766 kubelet[2695]: E0325 01:06:50.806667 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.807410 kubelet[2695]: E0325 01:06:50.806858 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.807410 kubelet[2695]: W0325 01:06:50.806869 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.807410 kubelet[2695]: E0325 01:06:50.806878 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.807410 kubelet[2695]: E0325 01:06:50.807386 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.807410 kubelet[2695]: W0325 01:06:50.807400 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.807533 kubelet[2695]: E0325 01:06:50.807417 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.809187 kubelet[2695]: E0325 01:06:50.807852 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.809187 kubelet[2695]: W0325 01:06:50.807869 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.809187 kubelet[2695]: E0325 01:06:50.807902 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.809187 kubelet[2695]: E0325 01:06:50.808120 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.809187 kubelet[2695]: W0325 01:06:50.808130 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.809977 kubelet[2695]: E0325 01:06:50.808186 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.810041 kubelet[2695]: E0325 01:06:50.808295 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.810089 kubelet[2695]: W0325 01:06:50.810035 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.810564 kubelet[2695]: E0325 01:06:50.810325 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.810564 kubelet[2695]: I0325 01:06:50.810380 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bdb6\" (UniqueName: \"kubernetes.io/projected/0fe46c57-cacb-46aa-b01f-e805370dad30-kube-api-access-9bdb6\") pod \"csi-node-driver-ggjvq\" (UID: \"0fe46c57-cacb-46aa-b01f-e805370dad30\") " pod="calico-system/csi-node-driver-ggjvq" Mar 25 01:06:50.810983 kubelet[2695]: E0325 01:06:50.810831 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.810983 kubelet[2695]: W0325 01:06:50.810856 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.810983 kubelet[2695]: E0325 01:06:50.810902 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.812960 kubelet[2695]: E0325 01:06:50.812931 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.813144 kubelet[2695]: W0325 01:06:50.812998 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.813144 kubelet[2695]: E0325 01:06:50.813022 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.815094 kubelet[2695]: E0325 01:06:50.815068 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.815951 kubelet[2695]: W0325 01:06:50.815091 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.815951 kubelet[2695]: E0325 01:06:50.815912 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.816204 kubelet[2695]: E0325 01:06:50.816187 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.816204 kubelet[2695]: W0325 01:06:50.816205 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.816276 kubelet[2695]: E0325 01:06:50.816215 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.816567 kubelet[2695]: E0325 01:06:50.816536 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.816567 kubelet[2695]: W0325 01:06:50.816555 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.816617 kubelet[2695]: E0325 01:06:50.816569 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.831542 containerd[1456]: time="2025-03-25T01:06:50.831481244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-klxhh,Uid:510b38ef-e7f7-4d2c-b60c-0e0e347c70fe,Namespace:calico-system,Attempt:0,}" Mar 25 01:06:50.917796 kubelet[2695]: E0325 01:06:50.917617 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.917796 kubelet[2695]: W0325 01:06:50.917639 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.917796 kubelet[2695]: E0325 01:06:50.917658 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.917945 kubelet[2695]: E0325 01:06:50.917856 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.917945 kubelet[2695]: W0325 01:06:50.917865 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.917945 kubelet[2695]: E0325 01:06:50.917874 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.918241 kubelet[2695]: E0325 01:06:50.918100 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.918241 kubelet[2695]: W0325 01:06:50.918130 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.918241 kubelet[2695]: E0325 01:06:50.918145 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.918573 kubelet[2695]: E0325 01:06:50.918306 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.918573 kubelet[2695]: W0325 01:06:50.918314 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.918573 kubelet[2695]: E0325 01:06:50.918329 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.918782 kubelet[2695]: E0325 01:06:50.918766 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.919014 kubelet[2695]: W0325 01:06:50.918839 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.919014 kubelet[2695]: E0325 01:06:50.918919 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.919267 kubelet[2695]: E0325 01:06:50.919254 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.919322 kubelet[2695]: W0325 01:06:50.919311 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.919465 kubelet[2695]: E0325 01:06:50.919452 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.919692 kubelet[2695]: E0325 01:06:50.919658 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.919692 kubelet[2695]: W0325 01:06:50.919675 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.919692 kubelet[2695]: E0325 01:06:50.919693 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.919941 kubelet[2695]: E0325 01:06:50.919835 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.919941 kubelet[2695]: W0325 01:06:50.919842 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.919941 kubelet[2695]: E0325 01:06:50.919871 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.920023 kubelet[2695]: E0325 01:06:50.919971 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.920023 kubelet[2695]: W0325 01:06:50.919979 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.920073 kubelet[2695]: E0325 01:06:50.920027 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.920358 kubelet[2695]: E0325 01:06:50.920119 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.920358 kubelet[2695]: W0325 01:06:50.920127 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.920358 kubelet[2695]: E0325 01:06:50.920158 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.920358 kubelet[2695]: E0325 01:06:50.920256 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.920358 kubelet[2695]: W0325 01:06:50.920263 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.920358 kubelet[2695]: E0325 01:06:50.920286 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.920667 kubelet[2695]: E0325 01:06:50.920393 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.920667 kubelet[2695]: W0325 01:06:50.920400 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.920667 kubelet[2695]: E0325 01:06:50.920409 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.920667 kubelet[2695]: E0325 01:06:50.920596 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.920667 kubelet[2695]: W0325 01:06:50.920603 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.920667 kubelet[2695]: E0325 01:06:50.920616 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.920870 kubelet[2695]: E0325 01:06:50.920760 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.920870 kubelet[2695]: W0325 01:06:50.920768 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.920870 kubelet[2695]: E0325 01:06:50.920782 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.922204 kubelet[2695]: E0325 01:06:50.922186 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.922204 kubelet[2695]: W0325 01:06:50.922201 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.922302 kubelet[2695]: E0325 01:06:50.922217 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.922421 kubelet[2695]: E0325 01:06:50.922401 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.922421 kubelet[2695]: W0325 01:06:50.922415 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.922421 kubelet[2695]: E0325 01:06:50.922424 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.923166 kubelet[2695]: E0325 01:06:50.922642 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.923166 kubelet[2695]: W0325 01:06:50.922654 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.923166 kubelet[2695]: E0325 01:06:50.922692 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.923166 kubelet[2695]: E0325 01:06:50.922832 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.923166 kubelet[2695]: W0325 01:06:50.922841 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.923166 kubelet[2695]: E0325 01:06:50.922923 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.923166 kubelet[2695]: E0325 01:06:50.923042 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.923166 kubelet[2695]: W0325 01:06:50.923051 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.923166 kubelet[2695]: E0325 01:06:50.923077 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.923405 kubelet[2695]: E0325 01:06:50.923262 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.923405 kubelet[2695]: W0325 01:06:50.923272 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.923405 kubelet[2695]: E0325 01:06:50.923355 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.923469 kubelet[2695]: E0325 01:06:50.923454 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.923469 kubelet[2695]: W0325 01:06:50.923462 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.923523 kubelet[2695]: E0325 01:06:50.923471 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:50.923737 kubelet[2695]: E0325 01:06:50.923628 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.923737 kubelet[2695]: W0325 01:06:50.923642 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.923737 kubelet[2695]: E0325 01:06:50.923653 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.924256 kubelet[2695]: E0325 01:06:50.923826 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.924256 kubelet[2695]: W0325 01:06:50.923838 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.924256 kubelet[2695]: E0325 01:06:50.923855 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.924256 kubelet[2695]: E0325 01:06:50.924053 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.924256 kubelet[2695]: W0325 01:06:50.924062 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.924256 kubelet[2695]: E0325 01:06:50.924072 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.924577 kubelet[2695]: E0325 01:06:50.924422 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.924577 kubelet[2695]: W0325 01:06:50.924435 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.924577 kubelet[2695]: E0325 01:06:50.924446 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:50.958812 kubelet[2695]: E0325 01:06:50.958787 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:50.959006 kubelet[2695]: W0325 01:06:50.958948 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:50.959006 kubelet[2695]: E0325 01:06:50.958975 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:51.127512 containerd[1456]: time="2025-03-25T01:06:51.127446715Z" level=info msg="connecting to shim e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6" address="unix:///run/containerd/s/96ef2d36d799f04579b68f7294a7f6bd31dbf4f2d2ec3431eee000179b3d6338" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:06:51.153286 systemd[1]: Started cri-containerd-e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6.scope - libcontainer container e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6. Mar 25 01:06:51.188294 containerd[1456]: time="2025-03-25T01:06:51.188065712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-klxhh,Uid:510b38ef-e7f7-4d2c-b60c-0e0e347c70fe,Namespace:calico-system,Attempt:0,} returns sandbox id \"e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6\"" Mar 25 01:06:53.071784 kubelet[2695]: E0325 01:06:53.071738 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ggjvq" podUID="0fe46c57-cacb-46aa-b01f-e805370dad30" Mar 25 01:06:53.439071 containerd[1456]: time="2025-03-25T01:06:53.438956586Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:53.439686 containerd[1456]: time="2025-03-25T01:06:53.439630224Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 25 01:06:53.440469 containerd[1456]: time="2025-03-25T01:06:53.440434182Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:53.442185 containerd[1456]: time="2025-03-25T01:06:53.442152978Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:53.443009 containerd[1456]: time="2025-03-25T01:06:53.442976576Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 2.655180811s" Mar 25 01:06:53.443053 containerd[1456]: time="2025-03-25T01:06:53.443008335Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 25 01:06:53.445180 containerd[1456]: time="2025-03-25T01:06:53.444291052Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 01:06:53.460488 containerd[1456]: time="2025-03-25T01:06:53.460445332Z" level=info msg="CreateContainer within sandbox \"eaf3ed447b2bb729a0420e7d0fb620894978cb714bb798cbc0a83ac003ff6e33\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:06:53.468512 containerd[1456]: time="2025-03-25T01:06:53.467038076Z" level=info msg="Container a2c9c7e7b718f49bc3a1665f4183fe665d6d102bf3798f9bf89e10e5a7d50a45: CDI devices from CRI Config.CDIDevices: 
[]" Mar 25 01:06:53.475414 containerd[1456]: time="2025-03-25T01:06:53.475372935Z" level=info msg="CreateContainer within sandbox \"eaf3ed447b2bb729a0420e7d0fb620894978cb714bb798cbc0a83ac003ff6e33\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a2c9c7e7b718f49bc3a1665f4183fe665d6d102bf3798f9bf89e10e5a7d50a45\"" Mar 25 01:06:53.475957 containerd[1456]: time="2025-03-25T01:06:53.475933053Z" level=info msg="StartContainer for \"a2c9c7e7b718f49bc3a1665f4183fe665d6d102bf3798f9bf89e10e5a7d50a45\"" Mar 25 01:06:53.477081 containerd[1456]: time="2025-03-25T01:06:53.477044251Z" level=info msg="connecting to shim a2c9c7e7b718f49bc3a1665f4183fe665d6d102bf3798f9bf89e10e5a7d50a45" address="unix:///run/containerd/s/45a35b1c9e33eaebcd5b9a95f00a33f5ab5b8555a2ead2959507e141648a3621" protocol=ttrpc version=3 Mar 25 01:06:53.497290 systemd[1]: Started cri-containerd-a2c9c7e7b718f49bc3a1665f4183fe665d6d102bf3798f9bf89e10e5a7d50a45.scope - libcontainer container a2c9c7e7b718f49bc3a1665f4183fe665d6d102bf3798f9bf89e10e5a7d50a45. Mar 25 01:06:53.541124 containerd[1456]: time="2025-03-25T01:06:53.540496693Z" level=info msg="StartContainer for \"a2c9c7e7b718f49bc3a1665f4183fe665d6d102bf3798f9bf89e10e5a7d50a45\" returns successfully" Mar 25 01:06:54.166786 kubelet[2695]: I0325 01:06:54.166603 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-77f4c89b4d-96cq9" podStartSLOduration=1.510365738 podStartE2EDuration="4.166586506s" podCreationTimestamp="2025-03-25 01:06:50 +0000 UTC" firstStartedPulling="2025-03-25 01:06:50.787549726 +0000 UTC m=+22.802607370" lastFinishedPulling="2025-03-25 01:06:53.443770454 +0000 UTC m=+25.458828138" observedRunningTime="2025-03-25 01:06:54.166465707 +0000 UTC m=+26.181523391" watchObservedRunningTime="2025-03-25 01:06:54.166586506 +0000 UTC m=+26.181644190" Mar 25 01:06:54.227520 kubelet[2695]: E0325 01:06:54.227476 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.227520 kubelet[2695]: W0325 01:06:54.227506 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.227520 kubelet[2695]: E0325 01:06:54.227526 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.227711 kubelet[2695]: E0325 01:06:54.227686 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.227711 kubelet[2695]: W0325 01:06:54.227695 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.227711 kubelet[2695]: E0325 01:06:54.227704 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:54.227882 kubelet[2695]: E0325 01:06:54.227853 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.227882 kubelet[2695]: W0325 01:06:54.227862 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.227882 kubelet[2695]: E0325 01:06:54.227870 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.228826 kubelet[2695]: E0325 01:06:54.228804 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.228826 kubelet[2695]: W0325 01:06:54.228818 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.228826 kubelet[2695]: E0325 01:06:54.228828 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.229137 kubelet[2695]: E0325 01:06:54.229056 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.229137 kubelet[2695]: W0325 01:06:54.229070 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.229227 kubelet[2695]: E0325 01:06:54.229142 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.229372 kubelet[2695]: E0325 01:06:54.229342 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.229372 kubelet[2695]: W0325 01:06:54.229357 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.229372 kubelet[2695]: E0325 01:06:54.229367 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.229644 kubelet[2695]: E0325 01:06:54.229566 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.229644 kubelet[2695]: W0325 01:06:54.229577 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.229644 kubelet[2695]: E0325 01:06:54.229588 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:54.229795 kubelet[2695]: E0325 01:06:54.229776 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.229795 kubelet[2695]: W0325 01:06:54.229792 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.229881 kubelet[2695]: E0325 01:06:54.229800 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.230002 kubelet[2695]: E0325 01:06:54.229975 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.230002 kubelet[2695]: W0325 01:06:54.229988 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.230002 kubelet[2695]: E0325 01:06:54.229996 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.230246 kubelet[2695]: E0325 01:06:54.230132 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.230246 kubelet[2695]: W0325 01:06:54.230140 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.230246 kubelet[2695]: E0325 01:06:54.230147 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.230334 kubelet[2695]: E0325 01:06:54.230295 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.230334 kubelet[2695]: W0325 01:06:54.230302 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.230334 kubelet[2695]: E0325 01:06:54.230309 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.230444 kubelet[2695]: E0325 01:06:54.230428 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.230444 kubelet[2695]: W0325 01:06:54.230439 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.230504 kubelet[2695]: E0325 01:06:54.230446 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:54.230618 kubelet[2695]: E0325 01:06:54.230595 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.230618 kubelet[2695]: W0325 01:06:54.230606 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.230618 kubelet[2695]: E0325 01:06:54.230614 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.230745 kubelet[2695]: E0325 01:06:54.230734 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.230745 kubelet[2695]: W0325 01:06:54.230744 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.230789 kubelet[2695]: E0325 01:06:54.230751 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.230880 kubelet[2695]: E0325 01:06:54.230871 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.230908 kubelet[2695]: W0325 01:06:54.230880 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.230908 kubelet[2695]: E0325 01:06:54.230887 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.244748 kubelet[2695]: E0325 01:06:54.244723 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.244748 kubelet[2695]: W0325 01:06:54.244744 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.244748 kubelet[2695]: E0325 01:06:54.244759 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.244995 kubelet[2695]: E0325 01:06:54.244982 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.244995 kubelet[2695]: W0325 01:06:54.244994 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.245059 kubelet[2695]: E0325 01:06:54.245007 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:54.245226 kubelet[2695]: E0325 01:06:54.245211 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.245226 kubelet[2695]: W0325 01:06:54.245224 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.245302 kubelet[2695]: E0325 01:06:54.245237 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.245418 kubelet[2695]: E0325 01:06:54.245405 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.245418 kubelet[2695]: W0325 01:06:54.245416 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.245477 kubelet[2695]: E0325 01:06:54.245428 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.245591 kubelet[2695]: E0325 01:06:54.245579 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.245591 kubelet[2695]: W0325 01:06:54.245589 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.245646 kubelet[2695]: E0325 01:06:54.245601 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.245752 kubelet[2695]: E0325 01:06:54.245741 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.245752 kubelet[2695]: W0325 01:06:54.245752 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.245819 kubelet[2695]: E0325 01:06:54.245760 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.245928 kubelet[2695]: E0325 01:06:54.245918 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.245959 kubelet[2695]: W0325 01:06:54.245930 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.245959 kubelet[2695]: E0325 01:06:54.245942 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:54.246097 kubelet[2695]: E0325 01:06:54.246087 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.246097 kubelet[2695]: W0325 01:06:54.246097 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.246173 kubelet[2695]: E0325 01:06:54.246119 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.246440 kubelet[2695]: E0325 01:06:54.246425 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.246440 kubelet[2695]: W0325 01:06:54.246439 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.246517 kubelet[2695]: E0325 01:06:54.246479 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.246661 kubelet[2695]: E0325 01:06:54.246647 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.246661 kubelet[2695]: W0325 01:06:54.246658 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.246796 kubelet[2695]: E0325 01:06:54.246726 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.246881 kubelet[2695]: E0325 01:06:54.246866 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.246881 kubelet[2695]: W0325 01:06:54.246876 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.246941 kubelet[2695]: E0325 01:06:54.246887 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.247037 kubelet[2695]: E0325 01:06:54.247022 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.247037 kubelet[2695]: W0325 01:06:54.247033 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.247137 kubelet[2695]: E0325 01:06:54.247044 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:54.247209 kubelet[2695]: E0325 01:06:54.247198 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.247209 kubelet[2695]: W0325 01:06:54.247208 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.247280 kubelet[2695]: E0325 01:06:54.247222 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.247410 kubelet[2695]: E0325 01:06:54.247392 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.247438 kubelet[2695]: W0325 01:06:54.247411 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.247438 kubelet[2695]: E0325 01:06:54.247429 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.247624 kubelet[2695]: E0325 01:06:54.247613 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.247650 kubelet[2695]: W0325 01:06:54.247624 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.247650 kubelet[2695]: E0325 01:06:54.247637 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.247796 kubelet[2695]: E0325 01:06:54.247786 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.247826 kubelet[2695]: W0325 01:06:54.247797 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.247826 kubelet[2695]: E0325 01:06:54.247809 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.248020 kubelet[2695]: E0325 01:06:54.248004 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.248020 kubelet[2695]: W0325 01:06:54.248019 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.248069 kubelet[2695]: E0325 01:06:54.248029 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:06:54.248263 kubelet[2695]: E0325 01:06:54.248250 2695 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:06:54.248298 kubelet[2695]: W0325 01:06:54.248263 2695 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:06:54.248298 kubelet[2695]: E0325 01:06:54.248272 2695 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:06:54.541942 containerd[1456]: time="2025-03-25T01:06:54.541888043Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:54.542914 containerd[1456]: time="2025-03-25T01:06:54.542858440Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 25 01:06:54.544124 containerd[1456]: time="2025-03-25T01:06:54.544070957Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:54.546384 containerd[1456]: time="2025-03-25T01:06:54.546347872Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.10202182s" Mar 25 01:06:54.546455 containerd[1456]: time="2025-03-25T01:06:54.546386552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 25 01:06:54.547045 containerd[1456]: time="2025-03-25T01:06:54.546726431Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:54.548733 containerd[1456]: time="2025-03-25T01:06:54.548660106Z" level=info msg="CreateContainer within sandbox \"e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:06:54.556732 containerd[1456]: time="2025-03-25T01:06:54.555690089Z" level=info msg="Container 19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:06:54.563806 containerd[1456]: time="2025-03-25T01:06:54.563741750Z" level=info msg="CreateContainer within sandbox \"e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af\"" Mar 25 01:06:54.564400 containerd[1456]: time="2025-03-25T01:06:54.564368188Z" level=info msg="StartContainer for \"19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af\"" Mar 25 01:06:54.565958 containerd[1456]: time="2025-03-25T01:06:54.565903065Z" level=info msg="connecting to 
shim 19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af" address="unix:///run/containerd/s/96ef2d36d799f04579b68f7294a7f6bd31dbf4f2d2ec3431eee000179b3d6338" protocol=ttrpc version=3 Mar 25 01:06:54.595265 systemd[1]: Started cri-containerd-19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af.scope - libcontainer container 19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af. Mar 25 01:06:54.632657 containerd[1456]: time="2025-03-25T01:06:54.632616504Z" level=info msg="StartContainer for \"19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af\" returns successfully" Mar 25 01:06:54.656650 systemd[1]: cri-containerd-19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af.scope: Deactivated successfully. Mar 25 01:06:54.684285 containerd[1456]: time="2025-03-25T01:06:54.684143340Z" level=info msg="received exit event container_id:\"19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af\" id:\"19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af\" pid:3358 exited_at:{seconds:1742864814 nanos:669267376}" Mar 25 01:06:54.684285 containerd[1456]: time="2025-03-25T01:06:54.684238900Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af\" id:\"19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af\" pid:3358 exited_at:{seconds:1742864814 nanos:669267376}" Mar 25 01:06:54.712652 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-19ac694c9a94a2ea4ff0439bbf4a8d2e1ac8b4da11ba0aaca55b63112d1005af-rootfs.mount: Deactivated successfully. Mar 25 01:06:55.072811 kubelet[2695]: E0325 01:06:55.071689 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ggjvq" podUID="0fe46c57-cacb-46aa-b01f-e805370dad30" Mar 25 01:06:55.160404 containerd[1456]: time="2025-03-25T01:06:55.160371286Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:06:55.164977 kubelet[2695]: I0325 01:06:55.164943 2695 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:06:56.438970 systemd[1]: Started sshd@7-10.0.0.8:22-10.0.0.1:33084.service - OpenSSH per-connection server daemon (10.0.0.1:33084). Mar 25 01:06:56.502772 sshd[3398]: Accepted publickey for core from 10.0.0.1 port 33084 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:06:56.504303 sshd-session[3398]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:06:56.510556 systemd-logind[1439]: New session 8 of user core. Mar 25 01:06:56.515264 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 25 01:06:56.642140 sshd[3400]: Connection closed by 10.0.0.1 port 33084 Mar 25 01:06:56.642742 sshd-session[3398]: pam_unix(sshd:session): session closed for user core Mar 25 01:06:56.649014 systemd[1]: sshd@7-10.0.0.8:22-10.0.0.1:33084.service: Deactivated successfully. Mar 25 01:06:56.650763 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:06:56.652397 systemd-logind[1439]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:06:56.654558 systemd-logind[1439]: Removed session 8. 
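The repeated driver-call.go and plugins.go messages above come from the kubelet's FlexVolume prober: it executes each driver binary found under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/<vendor~driver>/ with the single argument init and expects a JSON status object on stdout. Because the nodeagent~uds/uds binary is not present on this node, the call produces no output and the JSON decode fails with "unexpected end of JSON input". Below is a minimal Go sketch of that call-and-decode pattern; callDriver and DriverStatus are illustrative names of my own, not kubelet code, and only the driver path and the general FlexVolume JSON convention are taken from the log.

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// DriverStatus mirrors the JSON a FlexVolume driver is expected to print,
// e.g. {"status":"Success","capabilities":{"attach":false}}.
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// callDriver runs "<driver> init" and decodes its stdout. Both failure modes
// from the journal are reachable here: an exec error when the binary is
// missing, and "unexpected end of JSON input" when stdout is empty.
func callDriver(driver string) (*DriverStatus, error) {
	out, err := exec.Command(driver, "init").Output()
	if err != nil {
		return nil, fmt.Errorf("driver call failed: %w, output: %q", err, out)
	}
	var st DriverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output %q: %w", out, err)
	}
	return &st, nil
}

func main() {
	_, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	fmt.Println(err) // fails on a node where the uds binary does not exist
}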
Mar 25 01:06:57.071373 kubelet[2695]: E0325 01:06:57.071320 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ggjvq" podUID="0fe46c57-cacb-46aa-b01f-e805370dad30" Mar 25 01:06:59.071434 kubelet[2695]: E0325 01:06:59.071392 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ggjvq" podUID="0fe46c57-cacb-46aa-b01f-e805370dad30" Mar 25 01:06:59.583601 containerd[1456]: time="2025-03-25T01:06:59.583556837Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:59.584553 containerd[1456]: time="2025-03-25T01:06:59.584507475Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 25 01:06:59.585464 containerd[1456]: time="2025-03-25T01:06:59.585443513Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:59.587331 containerd[1456]: time="2025-03-25T01:06:59.587299589Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:06:59.587905 containerd[1456]: time="2025-03-25T01:06:59.587869268Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 4.426399305s" Mar 25 01:06:59.587940 containerd[1456]: time="2025-03-25T01:06:59.587900988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 25 01:06:59.589874 containerd[1456]: time="2025-03-25T01:06:59.589844864Z" level=info msg="CreateContainer within sandbox \"e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:06:59.599509 containerd[1456]: time="2025-03-25T01:06:59.599463124Z" level=info msg="Container ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:06:59.607273 containerd[1456]: time="2025-03-25T01:06:59.607240028Z" level=info msg="CreateContainer within sandbox \"e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d\"" Mar 25 01:06:59.607788 containerd[1456]: time="2025-03-25T01:06:59.607701107Z" level=info msg="StartContainer for \"ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d\"" Mar 25 01:06:59.609586 containerd[1456]: time="2025-03-25T01:06:59.609544903Z" level=info msg="connecting to shim 
ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d" address="unix:///run/containerd/s/96ef2d36d799f04579b68f7294a7f6bd31dbf4f2d2ec3431eee000179b3d6338" protocol=ttrpc version=3 Mar 25 01:06:59.632282 systemd[1]: Started cri-containerd-ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d.scope - libcontainer container ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d. Mar 25 01:06:59.670401 containerd[1456]: time="2025-03-25T01:06:59.670356658Z" level=info msg="StartContainer for \"ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d\" returns successfully" Mar 25 01:07:00.196006 systemd[1]: cri-containerd-ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d.scope: Deactivated successfully. Mar 25 01:07:00.196330 systemd[1]: cri-containerd-ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d.scope: Consumed 441ms CPU time, 159.6M memory peak, 4K read from disk, 150.3M written to disk. Mar 25 01:07:00.198137 containerd[1456]: time="2025-03-25T01:07:00.198080100Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d\" id:\"ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d\" pid:3434 exited_at:{seconds:1742864820 nanos:197758261}" Mar 25 01:07:00.198137 containerd[1456]: time="2025-03-25T01:07:00.198084780Z" level=info msg="received exit event container_id:\"ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d\" id:\"ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d\" pid:3434 exited_at:{seconds:1742864820 nanos:197758261}" Mar 25 01:07:00.218549 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ecdbe14d775d047433b31797024bbf33ba9a8e69947fb6f033aef4cf499fd39d-rootfs.mount: Deactivated successfully. Mar 25 01:07:00.260463 kubelet[2695]: I0325 01:07:00.260433 2695 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 25 01:07:00.282785 kubelet[2695]: I0325 01:07:00.282724 2695 topology_manager.go:215] "Topology Admit Handler" podUID="26e4919a-ab6f-4279-a456-a216a730af58" podNamespace="kube-system" podName="coredns-7db6d8ff4d-p2l2k" Mar 25 01:07:00.285957 kubelet[2695]: I0325 01:07:00.285922 2695 topology_manager.go:215] "Topology Admit Handler" podUID="09129f1d-9047-499c-9939-6616b096e952" podNamespace="calico-system" podName="calico-kube-controllers-7fcd8cd96b-w98xb" Mar 25 01:07:00.286743 kubelet[2695]: I0325 01:07:00.286589 2695 topology_manager.go:215] "Topology Admit Handler" podUID="f25e2e69-f991-42f2-b563-8cc9510c735f" podNamespace="kube-system" podName="coredns-7db6d8ff4d-vl8nc" Mar 25 01:07:00.287853 kubelet[2695]: I0325 01:07:00.287822 2695 topology_manager.go:215] "Topology Admit Handler" podUID="8f6521f3-e126-490b-8750-3f1707506fb2" podNamespace="calico-apiserver" podName="calico-apiserver-5558c45bbd-zxfm7" Mar 25 01:07:00.290213 kubelet[2695]: I0325 01:07:00.288843 2695 topology_manager.go:215] "Topology Admit Handler" podUID="200784d1-a1d0-438e-9ee6-bc1706e134c5" podNamespace="calico-apiserver" podName="calico-apiserver-5558c45bbd-dzxf2" Mar 25 01:07:00.294588 systemd[1]: Created slice kubepods-burstable-pod26e4919a_ab6f_4279_a456_a216a730af58.slice - libcontainer container kubepods-burstable-pod26e4919a_ab6f_4279_a456_a216a730af58.slice. Mar 25 01:07:00.304658 systemd[1]: Created slice kubepods-besteffort-pod09129f1d_9047_499c_9939_6616b096e952.slice - libcontainer container kubepods-besteffort-pod09129f1d_9047_499c_9939_6616b096e952.slice. 
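The containerd TaskExit events above carry the container's exit time as a protobuf-style pair, e.g. exited_at:{seconds:1742864820 nanos:197758261} for the install-cni container. A tiny illustrative Go snippet converts that pair back to a wall-clock timestamp:

package main

import (
	"fmt"
	"time"
)

func main() {
	// seconds/nanos copied from the exited_at field of the install-cni TaskExit event.
	t := time.Unix(1742864820, 197758261).UTC()
	fmt.Println(t.Format(time.RFC3339Nano)) // 2025-03-25T01:07:00.197758261Z
}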
Mar 25 01:07:00.319318 systemd[1]: Created slice kubepods-burstable-podf25e2e69_f991_42f2_b563_8cc9510c735f.slice - libcontainer container kubepods-burstable-podf25e2e69_f991_42f2_b563_8cc9510c735f.slice. Mar 25 01:07:00.326678 systemd[1]: Created slice kubepods-besteffort-pod8f6521f3_e126_490b_8750_3f1707506fb2.slice - libcontainer container kubepods-besteffort-pod8f6521f3_e126_490b_8750_3f1707506fb2.slice. Mar 25 01:07:00.332337 systemd[1]: Created slice kubepods-besteffort-pod200784d1_a1d0_438e_9ee6_bc1706e134c5.slice - libcontainer container kubepods-besteffort-pod200784d1_a1d0_438e_9ee6_bc1706e134c5.slice. Mar 25 01:07:00.380201 kubelet[2695]: I0325 01:07:00.380127 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlrgr\" (UniqueName: \"kubernetes.io/projected/8f6521f3-e126-490b-8750-3f1707506fb2-kube-api-access-dlrgr\") pod \"calico-apiserver-5558c45bbd-zxfm7\" (UID: \"8f6521f3-e126-490b-8750-3f1707506fb2\") " pod="calico-apiserver/calico-apiserver-5558c45bbd-zxfm7" Mar 25 01:07:00.380201 kubelet[2695]: I0325 01:07:00.380170 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bqf9\" (UniqueName: \"kubernetes.io/projected/200784d1-a1d0-438e-9ee6-bc1706e134c5-kube-api-access-7bqf9\") pod \"calico-apiserver-5558c45bbd-dzxf2\" (UID: \"200784d1-a1d0-438e-9ee6-bc1706e134c5\") " pod="calico-apiserver/calico-apiserver-5558c45bbd-dzxf2" Mar 25 01:07:00.380201 kubelet[2695]: I0325 01:07:00.380190 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67lgv\" (UniqueName: \"kubernetes.io/projected/f25e2e69-f991-42f2-b563-8cc9510c735f-kube-api-access-67lgv\") pod \"coredns-7db6d8ff4d-vl8nc\" (UID: \"f25e2e69-f991-42f2-b563-8cc9510c735f\") " pod="kube-system/coredns-7db6d8ff4d-vl8nc" Mar 25 01:07:00.380201 kubelet[2695]: I0325 01:07:00.380220 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8f6521f3-e126-490b-8750-3f1707506fb2-calico-apiserver-certs\") pod \"calico-apiserver-5558c45bbd-zxfm7\" (UID: \"8f6521f3-e126-490b-8750-3f1707506fb2\") " pod="calico-apiserver/calico-apiserver-5558c45bbd-zxfm7" Mar 25 01:07:00.380201 kubelet[2695]: I0325 01:07:00.380239 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/26e4919a-ab6f-4279-a456-a216a730af58-config-volume\") pod \"coredns-7db6d8ff4d-p2l2k\" (UID: \"26e4919a-ab6f-4279-a456-a216a730af58\") " pod="kube-system/coredns-7db6d8ff4d-p2l2k" Mar 25 01:07:00.380698 kubelet[2695]: I0325 01:07:00.380262 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhh7l\" (UniqueName: \"kubernetes.io/projected/26e4919a-ab6f-4279-a456-a216a730af58-kube-api-access-dhh7l\") pod \"coredns-7db6d8ff4d-p2l2k\" (UID: \"26e4919a-ab6f-4279-a456-a216a730af58\") " pod="kube-system/coredns-7db6d8ff4d-p2l2k" Mar 25 01:07:00.380698 kubelet[2695]: I0325 01:07:00.380289 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/200784d1-a1d0-438e-9ee6-bc1706e134c5-calico-apiserver-certs\") pod \"calico-apiserver-5558c45bbd-dzxf2\" (UID: \"200784d1-a1d0-438e-9ee6-bc1706e134c5\") " 
pod="calico-apiserver/calico-apiserver-5558c45bbd-dzxf2" Mar 25 01:07:00.380698 kubelet[2695]: I0325 01:07:00.380360 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09129f1d-9047-499c-9939-6616b096e952-tigera-ca-bundle\") pod \"calico-kube-controllers-7fcd8cd96b-w98xb\" (UID: \"09129f1d-9047-499c-9939-6616b096e952\") " pod="calico-system/calico-kube-controllers-7fcd8cd96b-w98xb" Mar 25 01:07:00.380698 kubelet[2695]: I0325 01:07:00.380400 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29j9\" (UniqueName: \"kubernetes.io/projected/09129f1d-9047-499c-9939-6616b096e952-kube-api-access-g29j9\") pod \"calico-kube-controllers-7fcd8cd96b-w98xb\" (UID: \"09129f1d-9047-499c-9939-6616b096e952\") " pod="calico-system/calico-kube-controllers-7fcd8cd96b-w98xb" Mar 25 01:07:00.380698 kubelet[2695]: I0325 01:07:00.380426 2695 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f25e2e69-f991-42f2-b563-8cc9510c735f-config-volume\") pod \"coredns-7db6d8ff4d-vl8nc\" (UID: \"f25e2e69-f991-42f2-b563-8cc9510c735f\") " pod="kube-system/coredns-7db6d8ff4d-vl8nc" Mar 25 01:07:00.599044 containerd[1456]: time="2025-03-25T01:07:00.598992896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-p2l2k,Uid:26e4919a-ab6f-4279-a456-a216a730af58,Namespace:kube-system,Attempt:0,}" Mar 25 01:07:00.608143 containerd[1456]: time="2025-03-25T01:07:00.608020598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7fcd8cd96b-w98xb,Uid:09129f1d-9047-499c-9939-6616b096e952,Namespace:calico-system,Attempt:0,}" Mar 25 01:07:00.623118 containerd[1456]: time="2025-03-25T01:07:00.622994448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vl8nc,Uid:f25e2e69-f991-42f2-b563-8cc9510c735f,Namespace:kube-system,Attempt:0,}" Mar 25 01:07:00.637757 containerd[1456]: time="2025-03-25T01:07:00.630606033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-zxfm7,Uid:8f6521f3-e126-490b-8750-3f1707506fb2,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:07:00.644702 containerd[1456]: time="2025-03-25T01:07:00.643479727Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-dzxf2,Uid:200784d1-a1d0-438e-9ee6-bc1706e134c5,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:07:01.075279 containerd[1456]: time="2025-03-25T01:07:01.075217784Z" level=error msg="Failed to destroy network for sandbox \"0f89770cf518924b77a718d238031f894a4c1b0350e865ca78a2e592057d4527\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.075843 containerd[1456]: time="2025-03-25T01:07:01.075539744Z" level=error msg="Failed to destroy network for sandbox \"4d154203f2e116962f748cd3b37b32a49be723c1a8e15da27009b4d72b24ac87\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.076163 containerd[1456]: time="2025-03-25T01:07:01.076127263Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-vl8nc,Uid:f25e2e69-f991-42f2-b563-8cc9510c735f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f89770cf518924b77a718d238031f894a4c1b0350e865ca78a2e592057d4527\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.076870 kubelet[2695]: E0325 01:07:01.076829 2695 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f89770cf518924b77a718d238031f894a4c1b0350e865ca78a2e592057d4527\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.076943 kubelet[2695]: E0325 01:07:01.076896 2695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f89770cf518924b77a718d238031f894a4c1b0350e865ca78a2e592057d4527\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-vl8nc" Mar 25 01:07:01.076943 kubelet[2695]: E0325 01:07:01.076917 2695 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0f89770cf518924b77a718d238031f894a4c1b0350e865ca78a2e592057d4527\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-vl8nc" Mar 25 01:07:01.076996 containerd[1456]: time="2025-03-25T01:07:01.076882941Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7fcd8cd96b-w98xb,Uid:09129f1d-9047-499c-9939-6616b096e952,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d154203f2e116962f748cd3b37b32a49be723c1a8e15da27009b4d72b24ac87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.077041 kubelet[2695]: E0325 01:07:01.076957 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-vl8nc_kube-system(f25e2e69-f991-42f2-b563-8cc9510c735f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-vl8nc_kube-system(f25e2e69-f991-42f2-b563-8cc9510c735f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0f89770cf518924b77a718d238031f894a4c1b0350e865ca78a2e592057d4527\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-vl8nc" podUID="f25e2e69-f991-42f2-b563-8cc9510c735f" Mar 25 01:07:01.078563 kubelet[2695]: E0325 01:07:01.078349 2695 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4d154203f2e116962f748cd3b37b32a49be723c1a8e15da27009b4d72b24ac87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.078563 kubelet[2695]: E0325 01:07:01.078392 2695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d154203f2e116962f748cd3b37b32a49be723c1a8e15da27009b4d72b24ac87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7fcd8cd96b-w98xb" Mar 25 01:07:01.078563 kubelet[2695]: E0325 01:07:01.078408 2695 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d154203f2e116962f748cd3b37b32a49be723c1a8e15da27009b4d72b24ac87\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7fcd8cd96b-w98xb" Mar 25 01:07:01.078683 kubelet[2695]: E0325 01:07:01.078438 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7fcd8cd96b-w98xb_calico-system(09129f1d-9047-499c-9939-6616b096e952)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7fcd8cd96b-w98xb_calico-system(09129f1d-9047-499c-9939-6616b096e952)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4d154203f2e116962f748cd3b37b32a49be723c1a8e15da27009b4d72b24ac87\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7fcd8cd96b-w98xb" podUID="09129f1d-9047-499c-9939-6616b096e952" Mar 25 01:07:01.082335 containerd[1456]: time="2025-03-25T01:07:01.082270251Z" level=error msg="Failed to destroy network for sandbox \"8a314a9b72fa43b9fb71fdb5334d394bf5b25a082351c81fe4707149452c868b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.084243 containerd[1456]: time="2025-03-25T01:07:01.083619728Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-dzxf2,Uid:200784d1-a1d0-438e-9ee6-bc1706e134c5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a314a9b72fa43b9fb71fdb5334d394bf5b25a082351c81fe4707149452c868b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.084990 kubelet[2695]: E0325 01:07:01.084644 2695 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a314a9b72fa43b9fb71fdb5334d394bf5b25a082351c81fe4707149452c868b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
Mar 25 01:07:01.084990 kubelet[2695]: E0325 01:07:01.084685 2695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a314a9b72fa43b9fb71fdb5334d394bf5b25a082351c81fe4707149452c868b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5558c45bbd-dzxf2" Mar 25 01:07:01.084990 kubelet[2695]: E0325 01:07:01.084701 2695 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a314a9b72fa43b9fb71fdb5334d394bf5b25a082351c81fe4707149452c868b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5558c45bbd-dzxf2" Mar 25 01:07:01.085127 kubelet[2695]: E0325 01:07:01.084736 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5558c45bbd-dzxf2_calico-apiserver(200784d1-a1d0-438e-9ee6-bc1706e134c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5558c45bbd-dzxf2_calico-apiserver(200784d1-a1d0-438e-9ee6-bc1706e134c5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a314a9b72fa43b9fb71fdb5334d394bf5b25a082351c81fe4707149452c868b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5558c45bbd-dzxf2" podUID="200784d1-a1d0-438e-9ee6-bc1706e134c5" Mar 25 01:07:01.085422 systemd[1]: Created slice kubepods-besteffort-pod0fe46c57_cacb_46aa_b01f_e805370dad30.slice - libcontainer container kubepods-besteffort-pod0fe46c57_cacb_46aa_b01f_e805370dad30.slice. 
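The "Created slice kubepods-besteffort-pod0fe46c57_cacb_46aa_b01f_e805370dad30.slice" entries show the kubelet's systemd cgroup driver naming pod cgroups from the QoS class plus the pod UID, with the dashes in the UID turned into underscores so the slice name is a valid unit name. A short Go sketch of that naming rule, inferred from the unit names in this journal (podSlice is an illustrative helper, not the kubelet's implementation):

package main

import (
	"fmt"
	"strings"
)

// podSlice builds a pod slice name from an optional QoS segment
// ("besteffort" or "burstable") and the pod UID with dashes
// replaced by underscores, matching the units created above.
func podSlice(qos, uid string) string {
	prefix := "kubepods"
	if qos != "" {
		prefix += "-" + qos
	}
	return fmt.Sprintf("%s-pod%s.slice", prefix, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	fmt.Println(podSlice("besteffort", "0fe46c57-cacb-46aa-b01f-e805370dad30"))
	// kubepods-besteffort-pod0fe46c57_cacb_46aa_b01f_e805370dad30.slice
	fmt.Println(podSlice("burstable", "26e4919a-ab6f-4279-a456-a216a730af58"))
	// kubepods-burstable-pod26e4919a_ab6f_4279_a456_a216a730af58.slice
}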
Mar 25 01:07:01.087229 containerd[1456]: time="2025-03-25T01:07:01.087030601Z" level=error msg="Failed to destroy network for sandbox \"282caa1c65c446d588581a799f59fb9e89871d24a598e28b92fa52183e057dfa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.087929 containerd[1456]: time="2025-03-25T01:07:01.087895440Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-p2l2k,Uid:26e4919a-ab6f-4279-a456-a216a730af58,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"282caa1c65c446d588581a799f59fb9e89871d24a598e28b92fa52183e057dfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.088701 kubelet[2695]: E0325 01:07:01.088060 2695 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"282caa1c65c446d588581a799f59fb9e89871d24a598e28b92fa52183e057dfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.088701 kubelet[2695]: E0325 01:07:01.088099 2695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"282caa1c65c446d588581a799f59fb9e89871d24a598e28b92fa52183e057dfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-p2l2k" Mar 25 01:07:01.088701 kubelet[2695]: E0325 01:07:01.088134 2695 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"282caa1c65c446d588581a799f59fb9e89871d24a598e28b92fa52183e057dfa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-p2l2k" Mar 25 01:07:01.088843 containerd[1456]: time="2025-03-25T01:07:01.088469759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ggjvq,Uid:0fe46c57-cacb-46aa-b01f-e805370dad30,Namespace:calico-system,Attempt:0,}" Mar 25 01:07:01.088871 kubelet[2695]: E0325 01:07:01.088163 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-p2l2k_kube-system(26e4919a-ab6f-4279-a456-a216a730af58)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-p2l2k_kube-system(26e4919a-ab6f-4279-a456-a216a730af58)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"282caa1c65c446d588581a799f59fb9e89871d24a598e28b92fa52183e057dfa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-p2l2k" podUID="26e4919a-ab6f-4279-a456-a216a730af58" Mar 25 01:07:01.092537 containerd[1456]: time="2025-03-25T01:07:01.092477951Z" level=error msg="Failed to destroy 
network for sandbox \"bb9e7acc0e530b962176445fd2453d7fcf9dca40975f343ba26e7b5002d8e1c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.093356 containerd[1456]: time="2025-03-25T01:07:01.093318949Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-zxfm7,Uid:8f6521f3-e126-490b-8750-3f1707506fb2,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb9e7acc0e530b962176445fd2453d7fcf9dca40975f343ba26e7b5002d8e1c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.094060 kubelet[2695]: E0325 01:07:01.094025 2695 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb9e7acc0e530b962176445fd2453d7fcf9dca40975f343ba26e7b5002d8e1c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.094150 kubelet[2695]: E0325 01:07:01.094080 2695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb9e7acc0e530b962176445fd2453d7fcf9dca40975f343ba26e7b5002d8e1c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5558c45bbd-zxfm7" Mar 25 01:07:01.094150 kubelet[2695]: E0325 01:07:01.094102 2695 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb9e7acc0e530b962176445fd2453d7fcf9dca40975f343ba26e7b5002d8e1c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5558c45bbd-zxfm7" Mar 25 01:07:01.094208 kubelet[2695]: E0325 01:07:01.094148 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5558c45bbd-zxfm7_calico-apiserver(8f6521f3-e126-490b-8750-3f1707506fb2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5558c45bbd-zxfm7_calico-apiserver(8f6521f3-e126-490b-8750-3f1707506fb2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bb9e7acc0e530b962176445fd2453d7fcf9dca40975f343ba26e7b5002d8e1c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5558c45bbd-zxfm7" podUID="8f6521f3-e126-490b-8750-3f1707506fb2" Mar 25 01:07:01.132937 containerd[1456]: time="2025-03-25T01:07:01.132888752Z" level=error msg="Failed to destroy network for sandbox \"aa87571984b775f20ceca56e2ec846997d430e91f6a60ac095739e07e9e4e269\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 25 01:07:01.133828 containerd[1456]: time="2025-03-25T01:07:01.133785190Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ggjvq,Uid:0fe46c57-cacb-46aa-b01f-e805370dad30,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa87571984b775f20ceca56e2ec846997d430e91f6a60ac095739e07e9e4e269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.135133 kubelet[2695]: E0325 01:07:01.134067 2695 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa87571984b775f20ceca56e2ec846997d430e91f6a60ac095739e07e9e4e269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:07:01.135133 kubelet[2695]: E0325 01:07:01.134129 2695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa87571984b775f20ceca56e2ec846997d430e91f6a60ac095739e07e9e4e269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ggjvq" Mar 25 01:07:01.135133 kubelet[2695]: E0325 01:07:01.134156 2695 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa87571984b775f20ceca56e2ec846997d430e91f6a60ac095739e07e9e4e269\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ggjvq" Mar 25 01:07:01.135279 kubelet[2695]: E0325 01:07:01.134200 2695 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ggjvq_calico-system(0fe46c57-cacb-46aa-b01f-e805370dad30)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ggjvq_calico-system(0fe46c57-cacb-46aa-b01f-e805370dad30)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa87571984b775f20ceca56e2ec846997d430e91f6a60ac095739e07e9e4e269\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ggjvq" podUID="0fe46c57-cacb-46aa-b01f-e805370dad30" Mar 25 01:07:01.188833 containerd[1456]: time="2025-03-25T01:07:01.188793483Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:07:01.599728 systemd[1]: run-netns-cni\x2d5f00ccaf\x2d6ce7\x2dcc26\x2d8149\x2d6c041ba1d8bd.mount: Deactivated successfully. Mar 25 01:07:01.599829 systemd[1]: run-netns-cni\x2d95245e38\x2d2dd9\x2d2c20\x2dbc37\x2dafd3771ceb1e.mount: Deactivated successfully. Mar 25 01:07:01.599879 systemd[1]: run-netns-cni\x2d5b7d44c7\x2d883c\x2d2c8a\x2d3642\x2de208dcaf5795.mount: Deactivated successfully. Mar 25 01:07:01.599926 systemd[1]: run-netns-cni\x2df8b83338\x2d6355\x2d8670\x2d231b\x2d00eb5ec6e299.mount: Deactivated successfully. 
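The run-netns-cni\x2d… mount units deactivated above appear to be the per-sandbox network namespaces under /run/netns/ being torn down after the failed sandboxes; systemd derives such a unit name from the mount path by turning "/" into "-" and escaping literal dashes as \x2d. A simplified Go sketch of that escaping follows (the real rules are those of systemd-escape and cover more characters; mountUnit is my own helper and handles only what appears here):

package main

import (
	"fmt"
	"strings"
)

// mountUnit applies the two escaping rules visible in the journal: literal
// dashes become \x2d, then path separators become dashes.
func mountUnit(path string) string {
	p := strings.TrimPrefix(path, "/")
	p = strings.ReplaceAll(p, "-", `\x2d`)
	p = strings.ReplaceAll(p, "/", "-")
	return p + ".mount"
}

func main() {
	fmt.Println(mountUnit("/run/netns/cni-5f00ccaf-6ce7-cc26-8149-6c041ba1d8bd"))
	// run-netns-cni\x2d5f00ccaf\x2d6ce7\x2dcc26\x2d8149\x2d6c041ba1d8bd.mount
}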
Mar 25 01:07:01.658145 systemd[1]: Started sshd@8-10.0.0.8:22-10.0.0.1:33088.service - OpenSSH per-connection server daemon (10.0.0.1:33088). Mar 25 01:07:01.706914 sshd[3696]: Accepted publickey for core from 10.0.0.1 port 33088 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:01.708265 sshd-session[3696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:01.711990 systemd-logind[1439]: New session 9 of user core. Mar 25 01:07:01.724536 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:07:01.832360 sshd[3698]: Connection closed by 10.0.0.1 port 33088 Mar 25 01:07:01.832687 sshd-session[3696]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:01.838402 systemd[1]: sshd@8-10.0.0.8:22-10.0.0.1:33088.service: Deactivated successfully. Mar 25 01:07:01.839950 systemd[1]: session-9.scope: Deactivated successfully. Mar 25 01:07:01.841079 systemd-logind[1439]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:07:01.841980 systemd-logind[1439]: Removed session 9. Mar 25 01:07:04.889496 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2271919959.mount: Deactivated successfully. Mar 25 01:07:05.217421 containerd[1456]: time="2025-03-25T01:07:05.217284220Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:05.218383 containerd[1456]: time="2025-03-25T01:07:05.218327618Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 25 01:07:05.220840 containerd[1456]: time="2025-03-25T01:07:05.220803333Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:05.224704 containerd[1456]: time="2025-03-25T01:07:05.224660567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:05.226111 containerd[1456]: time="2025-03-25T01:07:05.226064604Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 4.037228801s" Mar 25 01:07:05.226160 containerd[1456]: time="2025-03-25T01:07:05.226123604Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 25 01:07:05.235628 containerd[1456]: time="2025-03-25T01:07:05.235589347Z" level=info msg="CreateContainer within sandbox \"e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:07:05.244151 containerd[1456]: time="2025-03-25T01:07:05.243785493Z" level=info msg="Container 4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:05.265169 containerd[1456]: time="2025-03-25T01:07:05.264452296Z" level=info msg="CreateContainer within sandbox \"e42e4c4ae172f60b07aabfe7ee21df42032fa6a1718cacf530d78018abe65ae6\" for 
&ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81\"" Mar 25 01:07:05.267051 containerd[1456]: time="2025-03-25T01:07:05.267007211Z" level=info msg="StartContainer for \"4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81\"" Mar 25 01:07:05.268695 containerd[1456]: time="2025-03-25T01:07:05.268665489Z" level=info msg="connecting to shim 4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81" address="unix:///run/containerd/s/96ef2d36d799f04579b68f7294a7f6bd31dbf4f2d2ec3431eee000179b3d6338" protocol=ttrpc version=3 Mar 25 01:07:05.291319 systemd[1]: Started cri-containerd-4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81.scope - libcontainer container 4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81. Mar 25 01:07:05.334888 containerd[1456]: time="2025-03-25T01:07:05.334768331Z" level=info msg="StartContainer for \"4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81\" returns successfully" Mar 25 01:07:05.508625 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:07:05.508770 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 25 01:07:06.215702 kubelet[2695]: I0325 01:07:06.215629 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-klxhh" podStartSLOduration=2.180618518 podStartE2EDuration="16.215611379s" podCreationTimestamp="2025-03-25 01:06:50 +0000 UTC" firstStartedPulling="2025-03-25 01:06:51.191793702 +0000 UTC m=+23.206851346" lastFinishedPulling="2025-03-25 01:07:05.226786523 +0000 UTC m=+37.241844207" observedRunningTime="2025-03-25 01:07:06.215391099 +0000 UTC m=+38.230448783" watchObservedRunningTime="2025-03-25 01:07:06.215611379 +0000 UTC m=+38.230669063" Mar 25 01:07:06.850984 systemd[1]: Started sshd@9-10.0.0.8:22-10.0.0.1:39084.service - OpenSSH per-connection server daemon (10.0.0.1:39084). Mar 25 01:07:06.915747 sshd[3880]: Accepted publickey for core from 10.0.0.1 port 39084 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:06.917330 sshd-session[3880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:06.921199 systemd-logind[1439]: New session 10 of user core. Mar 25 01:07:06.932300 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:07:07.050296 sshd[3885]: Connection closed by 10.0.0.1 port 39084 Mar 25 01:07:07.050880 sshd-session[3880]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:07.060748 systemd[1]: sshd@9-10.0.0.8:22-10.0.0.1:39084.service: Deactivated successfully. Mar 25 01:07:07.064061 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:07:07.065136 systemd-logind[1439]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:07:07.068649 systemd[1]: Started sshd@10-10.0.0.8:22-10.0.0.1:39098.service - OpenSSH per-connection server daemon (10.0.0.1:39098). Mar 25 01:07:07.070665 systemd-logind[1439]: Removed session 10. Mar 25 01:07:07.121241 sshd[3901]: Accepted publickey for core from 10.0.0.1 port 39098 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:07.122283 sshd-session[3901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:07.126404 systemd-logind[1439]: New session 11 of user core. Mar 25 01:07:07.134299 systemd[1]: Started session-11.scope - Session 11 of User core. 
Mar 25 01:07:07.209432 kubelet[2695]: I0325 01:07:07.209401 2695 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:07:07.289752 sshd[3904]: Connection closed by 10.0.0.1 port 39098 Mar 25 01:07:07.290507 sshd-session[3901]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:07.307778 systemd[1]: sshd@10-10.0.0.8:22-10.0.0.1:39098.service: Deactivated successfully. Mar 25 01:07:07.314034 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:07:07.322146 systemd-logind[1439]: Session 11 logged out. Waiting for processes to exit. Mar 25 01:07:07.325405 systemd[1]: Started sshd@11-10.0.0.8:22-10.0.0.1:39102.service - OpenSSH per-connection server daemon (10.0.0.1:39102). Mar 25 01:07:07.329704 systemd-logind[1439]: Removed session 11. Mar 25 01:07:07.387780 sshd[3915]: Accepted publickey for core from 10.0.0.1 port 39102 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:07.389365 sshd-session[3915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:07.395212 systemd-logind[1439]: New session 12 of user core. Mar 25 01:07:07.403318 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 25 01:07:07.516348 sshd[3918]: Connection closed by 10.0.0.1 port 39102 Mar 25 01:07:07.516876 sshd-session[3915]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:07.520234 systemd[1]: sshd@11-10.0.0.8:22-10.0.0.1:39102.service: Deactivated successfully. Mar 25 01:07:07.522066 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:07:07.522823 systemd-logind[1439]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:07:07.523724 systemd-logind[1439]: Removed session 12. Mar 25 01:07:10.039497 kubelet[2695]: I0325 01:07:10.039452 2695 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:07:11.065161 kernel: bpftool[4049]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:07:11.228510 systemd-networkd[1390]: vxlan.calico: Link UP Mar 25 01:07:11.228516 systemd-networkd[1390]: vxlan.calico: Gained carrier Mar 25 01:07:12.396678 systemd-networkd[1390]: vxlan.calico: Gained IPv6LL Mar 25 01:07:12.534515 systemd[1]: Started sshd@12-10.0.0.8:22-10.0.0.1:55344.service - OpenSSH per-connection server daemon (10.0.0.1:55344). Mar 25 01:07:12.593662 sshd[4140]: Accepted publickey for core from 10.0.0.1 port 55344 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:12.595170 sshd-session[4140]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:12.598925 systemd-logind[1439]: New session 13 of user core. Mar 25 01:07:12.605272 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 25 01:07:12.744590 sshd[4142]: Connection closed by 10.0.0.1 port 55344 Mar 25 01:07:12.744913 sshd-session[4140]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:12.748531 systemd[1]: sshd@12-10.0.0.8:22-10.0.0.1:55344.service: Deactivated successfully. Mar 25 01:07:12.752360 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:07:12.753025 systemd-logind[1439]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:07:12.753840 systemd-logind[1439]: Removed session 13. 
Mar 25 01:07:14.071827 containerd[1456]: time="2025-03-25T01:07:14.071620309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-dzxf2,Uid:200784d1-a1d0-438e-9ee6-bc1706e134c5,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:07:14.071827 containerd[1456]: time="2025-03-25T01:07:14.071633109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7fcd8cd96b-w98xb,Uid:09129f1d-9047-499c-9939-6616b096e952,Namespace:calico-system,Attempt:0,}" Mar 25 01:07:14.072256 containerd[1456]: time="2025-03-25T01:07:14.071795749Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vl8nc,Uid:f25e2e69-f991-42f2-b563-8cc9510c735f,Namespace:kube-system,Attempt:0,}" Mar 25 01:07:14.072256 containerd[1456]: time="2025-03-25T01:07:14.072218028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-p2l2k,Uid:26e4919a-ab6f-4279-a456-a216a730af58,Namespace:kube-system,Attempt:0,}" Mar 25 01:07:14.470581 systemd-networkd[1390]: caliaef712da230: Link UP Mar 25 01:07:14.471816 systemd-networkd[1390]: caliaef712da230: Gained carrier Mar 25 01:07:14.483940 containerd[1456]: 2025-03-25 01:07:14.183 [INFO][4173] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0 coredns-7db6d8ff4d- kube-system f25e2e69-f991-42f2-b563-8cc9510c735f 731 0 2025-03-25 01:06:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-vl8nc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliaef712da230 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-" Mar 25 01:07:14.483940 containerd[1456]: 2025-03-25 01:07:14.183 [INFO][4173] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" Mar 25 01:07:14.483940 containerd[1456]: 2025-03-25 01:07:14.415 [INFO][4219] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" HandleID="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Workload="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.436 [INFO][4219] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" HandleID="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Workload="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000593940), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-vl8nc", "timestamp":"2025-03-25 01:07:14.415271513 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.436 [INFO][4219] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.436 [INFO][4219] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.436 [INFO][4219] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.439 [INFO][4219] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" host="localhost" Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.445 [INFO][4219] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.449 [INFO][4219] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.451 [INFO][4219] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.453 [INFO][4219] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.485094 containerd[1456]: 2025-03-25 01:07:14.453 [INFO][4219] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" host="localhost" Mar 25 01:07:14.485520 containerd[1456]: 2025-03-25 01:07:14.455 [INFO][4219] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063 Mar 25 01:07:14.485520 containerd[1456]: 2025-03-25 01:07:14.458 [INFO][4219] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" host="localhost" Mar 25 01:07:14.485520 containerd[1456]: 2025-03-25 01:07:14.463 [INFO][4219] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" host="localhost" Mar 25 01:07:14.485520 containerd[1456]: 2025-03-25 01:07:14.463 [INFO][4219] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" host="localhost" Mar 25 01:07:14.485520 containerd[1456]: 2025-03-25 01:07:14.463 [INFO][4219] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:07:14.485520 containerd[1456]: 2025-03-25 01:07:14.463 [INFO][4219] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" HandleID="k8s-pod-network.c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Workload="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" Mar 25 01:07:14.485658 containerd[1456]: 2025-03-25 01:07:14.465 [INFO][4173] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f25e2e69-f991-42f2-b563-8cc9510c735f", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-vl8nc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliaef712da230", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.485715 containerd[1456]: 2025-03-25 01:07:14.466 [INFO][4173] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" Mar 25 01:07:14.485715 containerd[1456]: 2025-03-25 01:07:14.466 [INFO][4173] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaef712da230 ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" Mar 25 01:07:14.485715 containerd[1456]: 2025-03-25 01:07:14.470 [INFO][4173] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" Mar 25 01:07:14.485781 containerd[1456]: 2025-03-25 01:07:14.471 
[INFO][4173] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"f25e2e69-f991-42f2-b563-8cc9510c735f", ResourceVersion:"731", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063", Pod:"coredns-7db6d8ff4d-vl8nc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliaef712da230", MAC:"a6:07:b4:cc:a0:56", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.485781 containerd[1456]: 2025-03-25 01:07:14.481 [INFO][4173] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" Namespace="kube-system" Pod="coredns-7db6d8ff4d-vl8nc" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--vl8nc-eth0" Mar 25 01:07:14.509204 systemd-networkd[1390]: calib0802aac4cd: Link UP Mar 25 01:07:14.510133 systemd-networkd[1390]: calib0802aac4cd: Gained carrier Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.181 [INFO][4157] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0 calico-apiserver-5558c45bbd- calico-apiserver 200784d1-a1d0-438e-9ee6-bc1706e134c5 732 0 2025-03-25 01:06:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5558c45bbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5558c45bbd-dzxf2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib0802aac4cd [] []}} ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.181 [INFO][4157] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.415 [INFO][4224] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" HandleID="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Workload="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.435 [INFO][4224] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" HandleID="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Workload="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000684730), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5558c45bbd-dzxf2", "timestamp":"2025-03-25 01:07:14.415269633 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.436 [INFO][4224] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.463 [INFO][4224] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.463 [INFO][4224] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.466 [INFO][4224] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.474 [INFO][4224] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.480 [INFO][4224] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.483 [INFO][4224] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.491 [INFO][4224] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.491 [INFO][4224] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.493 [INFO][4224] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221 Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.498 [INFO][4224] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.503 [INFO][4224] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.503 [INFO][4224] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" host="localhost" Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.503 [INFO][4224] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:07:14.522975 containerd[1456]: 2025-03-25 01:07:14.503 [INFO][4224] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" HandleID="k8s-pod-network.670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Workload="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" Mar 25 01:07:14.523864 containerd[1456]: 2025-03-25 01:07:14.506 [INFO][4157] cni-plugin/k8s.go 386: Populated endpoint ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0", GenerateName:"calico-apiserver-5558c45bbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"200784d1-a1d0-438e-9ee6-bc1706e134c5", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5558c45bbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5558c45bbd-dzxf2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib0802aac4cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.523864 containerd[1456]: 2025-03-25 01:07:14.506 [INFO][4157] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" Mar 25 01:07:14.523864 containerd[1456]: 2025-03-25 01:07:14.506 [INFO][4157] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib0802aac4cd ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" Mar 25 01:07:14.523864 containerd[1456]: 2025-03-25 01:07:14.510 [INFO][4157] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" Mar 25 01:07:14.523864 containerd[1456]: 2025-03-25 01:07:14.511 [INFO][4157] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0", GenerateName:"calico-apiserver-5558c45bbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"200784d1-a1d0-438e-9ee6-bc1706e134c5", ResourceVersion:"732", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5558c45bbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221", Pod:"calico-apiserver-5558c45bbd-dzxf2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib0802aac4cd", MAC:"e2:ac:cd:3a:73:1e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.523864 containerd[1456]: 2025-03-25 01:07:14.519 [INFO][4157] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-dzxf2" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--dzxf2-eth0" Mar 25 01:07:14.569299 systemd-networkd[1390]: calie20225c3011: Link UP Mar 25 01:07:14.570250 systemd-networkd[1390]: calie20225c3011: Gained carrier Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.181 [INFO][4178] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0 coredns-7db6d8ff4d- kube-system 26e4919a-ab6f-4279-a456-a216a730af58 727 0 2025-03-25 01:06:44 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-p2l2k eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie20225c3011 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.181 [INFO][4178] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 
01:07:14.415 [INFO][4223] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" HandleID="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Workload="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.439 [INFO][4223] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" HandleID="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Workload="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002897c0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-p2l2k", "timestamp":"2025-03-25 01:07:14.415281553 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.439 [INFO][4223] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.503 [INFO][4223] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.503 [INFO][4223] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.506 [INFO][4223] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.511 [INFO][4223] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.519 [INFO][4223] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.523 [INFO][4223] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.528 [INFO][4223] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.528 [INFO][4223] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.530 [INFO][4223] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753 Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.539 [INFO][4223] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.563 [INFO][4223] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.563 [INFO][4223] ipam/ipam.go 
847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" host="localhost" Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.563 [INFO][4223] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:07:14.714646 containerd[1456]: 2025-03-25 01:07:14.563 [INFO][4223] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" HandleID="k8s-pod-network.4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Workload="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" Mar 25 01:07:14.715690 containerd[1456]: 2025-03-25 01:07:14.565 [INFO][4178] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"26e4919a-ab6f-4279-a456-a216a730af58", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-p2l2k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie20225c3011", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.715690 containerd[1456]: 2025-03-25 01:07:14.565 [INFO][4178] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" Mar 25 01:07:14.715690 containerd[1456]: 2025-03-25 01:07:14.565 [INFO][4178] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie20225c3011 ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" Mar 25 01:07:14.715690 containerd[1456]: 2025-03-25 01:07:14.570 [INFO][4178] cni-plugin/dataplane_linux.go 
508: Disabling IPv4 forwarding ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" Mar 25 01:07:14.715690 containerd[1456]: 2025-03-25 01:07:14.571 [INFO][4178] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"26e4919a-ab6f-4279-a456-a216a730af58", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753", Pod:"coredns-7db6d8ff4d-p2l2k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie20225c3011", MAC:"92:64:cb:54:2c:ed", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.715690 containerd[1456]: 2025-03-25 01:07:14.711 [INFO][4178] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" Namespace="kube-system" Pod="coredns-7db6d8ff4d-p2l2k" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--p2l2k-eth0" Mar 25 01:07:14.842395 systemd-networkd[1390]: cali8d20dd4ef9a: Link UP Mar 25 01:07:14.842600 systemd-networkd[1390]: cali8d20dd4ef9a: Gained carrier Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.182 [INFO][4163] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0 calico-kube-controllers-7fcd8cd96b- calico-system 09129f1d-9047-499c-9939-6616b096e952 730 0 2025-03-25 01:06:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7fcd8cd96b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost 
calico-kube-controllers-7fcd8cd96b-w98xb eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8d20dd4ef9a [] []}} ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.182 [INFO][4163] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.415 [INFO][4221] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" HandleID="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Workload="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.442 [INFO][4221] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" HandleID="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Workload="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004618d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7fcd8cd96b-w98xb", "timestamp":"2025-03-25 01:07:14.415268433 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.442 [INFO][4221] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.563 [INFO][4221] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.563 [INFO][4221] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.565 [INFO][4221] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.570 [INFO][4221] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.577 [INFO][4221] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.712 [INFO][4221] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.715 [INFO][4221] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.715 [INFO][4221] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.717 [INFO][4221] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427 Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.801 [INFO][4221] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.836 [INFO][4221] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.836 [INFO][4221] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" host="localhost" Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.836 [INFO][4221] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:07:14.883687 containerd[1456]: 2025-03-25 01:07:14.836 [INFO][4221] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" HandleID="k8s-pod-network.4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Workload="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" Mar 25 01:07:14.884281 containerd[1456]: 2025-03-25 01:07:14.839 [INFO][4163] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0", GenerateName:"calico-kube-controllers-7fcd8cd96b-", Namespace:"calico-system", SelfLink:"", UID:"09129f1d-9047-499c-9939-6616b096e952", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7fcd8cd96b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7fcd8cd96b-w98xb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8d20dd4ef9a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.884281 containerd[1456]: 2025-03-25 01:07:14.839 [INFO][4163] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" Mar 25 01:07:14.884281 containerd[1456]: 2025-03-25 01:07:14.839 [INFO][4163] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8d20dd4ef9a ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" Mar 25 01:07:14.884281 containerd[1456]: 2025-03-25 01:07:14.842 [INFO][4163] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" Mar 25 01:07:14.884281 containerd[1456]: 2025-03-25 01:07:14.843 [INFO][4163] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID 
to endpoint ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0", GenerateName:"calico-kube-controllers-7fcd8cd96b-", Namespace:"calico-system", SelfLink:"", UID:"09129f1d-9047-499c-9939-6616b096e952", ResourceVersion:"730", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7fcd8cd96b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427", Pod:"calico-kube-controllers-7fcd8cd96b-w98xb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8d20dd4ef9a", MAC:"ae:69:8c:3e:65:ba", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:14.884281 containerd[1456]: 2025-03-25 01:07:14.880 [INFO][4163] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" Namespace="calico-system" Pod="calico-kube-controllers-7fcd8cd96b-w98xb" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7fcd8cd96b--w98xb-eth0" Mar 25 01:07:14.980841 containerd[1456]: time="2025-03-25T01:07:14.980785384Z" level=info msg="connecting to shim 4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427" address="unix:///run/containerd/s/3fd35187f6f3bb21092bab8c83ca3a7b6837209cca640b8934140e8db9b8755f" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:07:14.982488 containerd[1456]: time="2025-03-25T01:07:14.982456502Z" level=info msg="connecting to shim c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063" address="unix:///run/containerd/s/cff0444f2ccf9030104d790cdb53cf99cf9b459a5977d07aa2325707fad53c9c" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:07:14.985813 containerd[1456]: time="2025-03-25T01:07:14.985774297Z" level=info msg="connecting to shim 4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753" address="unix:///run/containerd/s/bcceb8dbb9031eb25084b8ac584031d757510aa02374c15eac857418d8f3897e" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:07:14.986150 containerd[1456]: time="2025-03-25T01:07:14.986086056Z" level=info msg="connecting to shim 670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221" address="unix:///run/containerd/s/46884131b9ea7714d3878aef35f4f434cf446811c5083e0252f6b0eb9a44424a" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:07:15.011294 systemd[1]: Started 
cri-containerd-4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427.scope - libcontainer container 4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427. Mar 25 01:07:15.032560 systemd-resolved[1323]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:07:15.040285 systemd[1]: Started cri-containerd-4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753.scope - libcontainer container 4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753. Mar 25 01:07:15.041939 systemd[1]: Started cri-containerd-670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221.scope - libcontainer container 670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221. Mar 25 01:07:15.043931 systemd[1]: Started cri-containerd-c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063.scope - libcontainer container c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063. Mar 25 01:07:15.064474 systemd-resolved[1323]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:07:15.065162 systemd-resolved[1323]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:07:15.073733 containerd[1456]: time="2025-03-25T01:07:15.071997649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ggjvq,Uid:0fe46c57-cacb-46aa-b01f-e805370dad30,Namespace:calico-system,Attempt:0,}" Mar 25 01:07:15.086133 containerd[1456]: time="2025-03-25T01:07:15.085686829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-zxfm7,Uid:8f6521f3-e126-490b-8750-3f1707506fb2,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:07:15.112293 systemd-resolved[1323]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:07:15.114081 containerd[1456]: time="2025-03-25T01:07:15.114046107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7fcd8cd96b-w98xb,Uid:09129f1d-9047-499c-9939-6616b096e952,Namespace:calico-system,Attempt:0,} returns sandbox id \"4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427\"" Mar 25 01:07:15.117320 containerd[1456]: time="2025-03-25T01:07:15.117284302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-vl8nc,Uid:f25e2e69-f991-42f2-b563-8cc9510c735f,Namespace:kube-system,Attempt:0,} returns sandbox id \"c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063\"" Mar 25 01:07:15.121832 containerd[1456]: time="2025-03-25T01:07:15.121640015Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:07:15.124651 containerd[1456]: time="2025-03-25T01:07:15.124571051Z" level=info msg="CreateContainer within sandbox \"c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:07:15.136432 containerd[1456]: time="2025-03-25T01:07:15.136377434Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-p2l2k,Uid:26e4919a-ab6f-4279-a456-a216a730af58,Namespace:kube-system,Attempt:0,} returns sandbox id \"4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753\"" Mar 25 01:07:15.141250 containerd[1456]: time="2025-03-25T01:07:15.141196466Z" level=info msg="CreateContainer within sandbox \"4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753\" for container 
&ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:07:15.153575 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount171815551.mount: Deactivated successfully. Mar 25 01:07:15.157232 containerd[1456]: time="2025-03-25T01:07:15.157193403Z" level=info msg="Container 223a2f9731c47c0c45cbf42fe206bc52e27a0dbb1acf8a42caadda8142d03752: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:15.161004 containerd[1456]: time="2025-03-25T01:07:15.160962277Z" level=info msg="Container 7d5b58be6b277f0fe4dabfd615bdc6b516538e735d610890b16b184fe903d447: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:15.167295 containerd[1456]: time="2025-03-25T01:07:15.167256468Z" level=info msg="CreateContainer within sandbox \"c17aa3bd6eff395ac558d67502e848ea897409b32e7cfc81aab41e9cdb359063\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"223a2f9731c47c0c45cbf42fe206bc52e27a0dbb1acf8a42caadda8142d03752\"" Mar 25 01:07:15.173436 containerd[1456]: time="2025-03-25T01:07:15.173391979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-dzxf2,Uid:200784d1-a1d0-438e-9ee6-bc1706e134c5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221\"" Mar 25 01:07:15.173941 containerd[1456]: time="2025-03-25T01:07:15.173908418Z" level=info msg="StartContainer for \"223a2f9731c47c0c45cbf42fe206bc52e27a0dbb1acf8a42caadda8142d03752\"" Mar 25 01:07:15.174831 containerd[1456]: time="2025-03-25T01:07:15.174777017Z" level=info msg="CreateContainer within sandbox \"4fe150aced9a82e6f4f96e1f2caa6f0e86d160a59599a4067fe868b3414ae753\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7d5b58be6b277f0fe4dabfd615bdc6b516538e735d610890b16b184fe903d447\"" Mar 25 01:07:15.176603 containerd[1456]: time="2025-03-25T01:07:15.176572294Z" level=info msg="connecting to shim 223a2f9731c47c0c45cbf42fe206bc52e27a0dbb1acf8a42caadda8142d03752" address="unix:///run/containerd/s/cff0444f2ccf9030104d790cdb53cf99cf9b459a5977d07aa2325707fad53c9c" protocol=ttrpc version=3 Mar 25 01:07:15.177641 containerd[1456]: time="2025-03-25T01:07:15.176702654Z" level=info msg="StartContainer for \"7d5b58be6b277f0fe4dabfd615bdc6b516538e735d610890b16b184fe903d447\"" Mar 25 01:07:15.179159 containerd[1456]: time="2025-03-25T01:07:15.179127770Z" level=info msg="connecting to shim 7d5b58be6b277f0fe4dabfd615bdc6b516538e735d610890b16b184fe903d447" address="unix:///run/containerd/s/bcceb8dbb9031eb25084b8ac584031d757510aa02374c15eac857418d8f3897e" protocol=ttrpc version=3 Mar 25 01:07:15.208329 systemd[1]: Started cri-containerd-223a2f9731c47c0c45cbf42fe206bc52e27a0dbb1acf8a42caadda8142d03752.scope - libcontainer container 223a2f9731c47c0c45cbf42fe206bc52e27a0dbb1acf8a42caadda8142d03752. Mar 25 01:07:15.209854 systemd[1]: Started cri-containerd-7d5b58be6b277f0fe4dabfd615bdc6b516538e735d610890b16b184fe903d447.scope - libcontainer container 7d5b58be6b277f0fe4dabfd615bdc6b516538e735d610890b16b184fe903d447. 
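Annotation: the repeated "connecting to shim … address=unix:///run/containerd/s/… protocol=ttrpc" entries show containerd dialing each container shim's unix socket and then speaking its lightweight ttrpc protocol over that stream. Below is a minimal, hedged Go sketch of just the dial step (the ttrpc layering is omitted); the socket path is copied from the 223a2f97… shim entry above purely for illustration and exists only on the node while that shim is running.

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Path taken from the "connecting to shim" log entry above; on any
	// other host this dial is expected to fail.
	const shimSocket = "/run/containerd/s/cff0444f2ccf9030104d790cdb53cf99cf9b459a5977d07aa2325707fad53c9c"

	conn, err := net.DialTimeout("unix", shimSocket, 2*time.Second)
	if err != nil {
		fmt.Println("dial failed (expected off-node):", err)
		return
	}
	defer conn.Close()

	// containerd layers ttrpc request/response framing on top of this raw
	// stream; that part is not reproduced here.
	fmt.Println("connected to shim socket:", shimSocket)
}
```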
Mar 25 01:07:15.265276 containerd[1456]: time="2025-03-25T01:07:15.265233443Z" level=info msg="StartContainer for \"7d5b58be6b277f0fe4dabfd615bdc6b516538e735d610890b16b184fe903d447\" returns successfully" Mar 25 01:07:15.266060 systemd-networkd[1390]: cali6314993728b: Link UP Mar 25 01:07:15.266244 systemd-networkd[1390]: cali6314993728b: Gained carrier Mar 25 01:07:15.273741 containerd[1456]: time="2025-03-25T01:07:15.273644710Z" level=info msg="StartContainer for \"223a2f9731c47c0c45cbf42fe206bc52e27a0dbb1acf8a42caadda8142d03752\" returns successfully" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.140 [INFO][4469] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--ggjvq-eth0 csi-node-driver- calico-system 0fe46c57-cacb-46aa-b01f-e805370dad30 605 0 2025-03-25 01:06:50 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-ggjvq eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali6314993728b [] []}} ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.141 [INFO][4469] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-eth0" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.179 [INFO][4521] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" HandleID="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Workload="localhost-k8s-csi--node--driver--ggjvq-eth0" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.192 [INFO][4521] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" HandleID="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Workload="localhost-k8s-csi--node--driver--ggjvq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003e00a0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-ggjvq", "timestamp":"2025-03-25 01:07:15.17964045 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.192 [INFO][4521] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.192 [INFO][4521] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
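Annotation: the systemd-networkd "cali6314993728b: Link UP … Gained carrier" entries report the pod's host-side veth coming administratively up and getting carrier. As a small stdlib-only sketch (interface name copied from the log; it only exists on the node while that pod's veth is plumbed), the same two conditions can be checked from Go:

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	// Interface name from the "Link UP" entry above.
	const ifName = "cali6314993728b"

	iface, err := net.InterfaceByName(ifName)
	if err != nil {
		fmt.Println("interface not present on this host:", err)
		return
	}

	up := iface.Flags&net.FlagUp != 0
	// net.FlagRunning (Go 1.20+) corresponds to the "carrier"/oper-up state.
	carrier := iface.Flags&net.FlagRunning != 0
	fmt.Printf("%s: admin-up=%v carrier=%v mtu=%d\n", iface.Name, up, carrier, iface.MTU)
}
```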
Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.192 [INFO][4521] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.197 [INFO][4521] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.204 [INFO][4521] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.213 [INFO][4521] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.217 [INFO][4521] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.223 [INFO][4521] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.223 [INFO][4521] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.231 [INFO][4521] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.241 [INFO][4521] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4521] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4521] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" host="localhost" Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4521] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
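Annotation: the ipam/ipam.go entries above trace Calico's allocation path end to end: acquire the host-wide IPAM lock, look up the host's block affinity, load the affine block (192.168.88.128/26), claim the next free address, write the block back, release the lock. The sketch below is not Calico's code; it is a deliberately simplified, in-memory model of "hand out the lowest free address from an affine /26", using only the Go standard library, with the earlier addresses pre-marked as used so the result matches the .133 assigned to csi-node-driver-ggjvq above.

```go
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// block is a toy model of one affine CIDR block with a used-address set,
// loosely mirroring what the ipam log entries describe.
type block struct {
	mu    sync.Mutex // stands in for the host-wide IPAM lock
	cidr  netip.Prefix
	inUse map[netip.Addr]bool
}

func newBlock(cidr string) *block {
	return &block{cidr: netip.MustParsePrefix(cidr), inUse: map[netip.Addr]bool{}}
}

// assign claims the lowest free address in the block, as in
// "Attempting to assign 1 addresses from block".
func (b *block) assign() (netip.Addr, bool) {
	b.mu.Lock()
	defer b.mu.Unlock()
	for a := b.cidr.Addr(); b.cidr.Contains(a); a = a.Next() {
		if !b.inUse[a] {
			b.inUse[a] = true
			return a, true
		}
	}
	return netip.Addr{}, false
}

func main() {
	b := newBlock("192.168.88.128/26")
	// Pretend .128 through .132 were already claimed by the earlier pods in the log.
	for a := netip.MustParseAddr("192.168.88.128"); a.Less(netip.MustParseAddr("192.168.88.133")); a = a.Next() {
		b.inUse[a] = true
	}
	if ip, ok := b.assign(); ok {
		fmt.Println("claimed", ip) // 192.168.88.133, matching csi-node-driver-ggjvq above
	}
}
```

The real allocator persists blocks in the Calico datastore and confirms affinities before claiming, which is why the log shows the extra "Looking up existing affinities" and "Writing block in order to claim IPs" steps around the in-memory part modelled here.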
Mar 25 01:07:15.286826 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4521] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" HandleID="k8s-pod-network.e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Workload="localhost-k8s-csi--node--driver--ggjvq-eth0" Mar 25 01:07:15.287357 containerd[1456]: 2025-03-25 01:07:15.258 [INFO][4469] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ggjvq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0fe46c57-cacb-46aa-b01f-e805370dad30", ResourceVersion:"605", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-ggjvq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6314993728b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:15.287357 containerd[1456]: 2025-03-25 01:07:15.259 [INFO][4469] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-eth0" Mar 25 01:07:15.287357 containerd[1456]: 2025-03-25 01:07:15.259 [INFO][4469] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6314993728b ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-eth0" Mar 25 01:07:15.287357 containerd[1456]: 2025-03-25 01:07:15.264 [INFO][4469] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-eth0" Mar 25 01:07:15.287357 containerd[1456]: 2025-03-25 01:07:15.264 [INFO][4469] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--ggjvq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"0fe46c57-cacb-46aa-b01f-e805370dad30", ResourceVersion:"605", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d", Pod:"csi-node-driver-ggjvq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali6314993728b", MAC:"8e:e4:1a:3d:d7:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:15.287357 containerd[1456]: 2025-03-25 01:07:15.280 [INFO][4469] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" Namespace="calico-system" Pod="csi-node-driver-ggjvq" WorkloadEndpoint="localhost-k8s-csi--node--driver--ggjvq-eth0" Mar 25 01:07:15.322332 systemd-networkd[1390]: cali7f7da876c6f: Link UP Mar 25 01:07:15.323555 systemd-networkd[1390]: cali7f7da876c6f: Gained carrier Mar 25 01:07:15.333354 containerd[1456]: time="2025-03-25T01:07:15.333312022Z" level=info msg="connecting to shim e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d" address="unix:///run/containerd/s/a9181c83f20e4597ab27d9796ffaf61f104ac4d420559cfe29c30bf47aa0ba0a" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.183 [INFO][4497] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0 calico-apiserver-5558c45bbd- calico-apiserver 8f6521f3-e126-490b-8750-3f1707506fb2 733 0 2025-03-25 01:06:49 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5558c45bbd projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-5558c45bbd-zxfm7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7f7da876c6f [] []}} ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.183 [INFO][4497] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" 
Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.231 [INFO][4556] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" HandleID="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Workload="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4556] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" HandleID="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Workload="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400011d510), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-5558c45bbd-zxfm7", "timestamp":"2025-03-25 01:07:15.231182093 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4556] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4556] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.252 [INFO][4556] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.255 [INFO][4556] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.267 [INFO][4556] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.281 [INFO][4556] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.292 [INFO][4556] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.296 [INFO][4556] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.296 [INFO][4556] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.299 [INFO][4556] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.305 [INFO][4556] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.313 [INFO][4556] ipam/ipam.go 1216: Successfully claimed IPs: 
[192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.313 [INFO][4556] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" host="localhost" Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.313 [INFO][4556] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:07:15.339260 containerd[1456]: 2025-03-25 01:07:15.313 [INFO][4556] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" HandleID="k8s-pod-network.a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Workload="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" Mar 25 01:07:15.340071 containerd[1456]: 2025-03-25 01:07:15.316 [INFO][4497] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0", GenerateName:"calico-apiserver-5558c45bbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"8f6521f3-e126-490b-8750-3f1707506fb2", ResourceVersion:"733", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5558c45bbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-5558c45bbd-zxfm7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7f7da876c6f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:15.340071 containerd[1456]: 2025-03-25 01:07:15.317 [INFO][4497] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" Mar 25 01:07:15.340071 containerd[1456]: 2025-03-25 01:07:15.317 [INFO][4497] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7f7da876c6f ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" Mar 25 01:07:15.340071 containerd[1456]: 2025-03-25 
01:07:15.323 [INFO][4497] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" Mar 25 01:07:15.340071 containerd[1456]: 2025-03-25 01:07:15.323 [INFO][4497] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0", GenerateName:"calico-apiserver-5558c45bbd-", Namespace:"calico-apiserver", SelfLink:"", UID:"8f6521f3-e126-490b-8750-3f1707506fb2", ResourceVersion:"733", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 6, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5558c45bbd", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e", Pod:"calico-apiserver-5558c45bbd-zxfm7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7f7da876c6f", MAC:"42:ca:44:98:af:ad", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:07:15.340071 containerd[1456]: 2025-03-25 01:07:15.337 [INFO][4497] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" Namespace="calico-apiserver" Pod="calico-apiserver-5558c45bbd-zxfm7" WorkloadEndpoint="localhost-k8s-calico--apiserver--5558c45bbd--zxfm7-eth0" Mar 25 01:07:15.368575 systemd[1]: Started cri-containerd-e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d.scope - libcontainer container e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d. 
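Annotation: the endpoint=&v3.WorkloadEndpoint{…} dumps above are Go struct literals printed by the CNI plugin as it populates and then writes each endpoint to the datastore. As a rough illustration of the shape of that object (this is a hypothetical, heavily trimmed stand-in, not the real projectcalico.org/v3 types, whose fields and JSON tags differ), here is a small struct populated with values from the csi-node-driver-ggjvq dump and marshalled to JSON:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// workloadEndpoint is an illustrative subset of the fields visible in the
// v3.WorkloadEndpoint spec printed in the log; names are not authoritative.
type workloadEndpoint struct {
	Orchestrator   string   `json:"orchestrator"`
	Node           string   `json:"node"`
	Pod            string   `json:"pod"`
	Endpoint       string   `json:"endpoint"`
	ContainerID    string   `json:"containerID"`
	InterfaceName  string   `json:"interfaceName"`
	MAC            string   `json:"mac"`
	IPNetworks     []string `json:"ipNetworks"`
	Profiles       []string `json:"profiles"`
	ServiceAccount string   `json:"serviceAccountName"`
}

func main() {
	// Values copied from the csi-node-driver-ggjvq endpoint dump above.
	ep := workloadEndpoint{
		Orchestrator:   "k8s",
		Node:           "localhost",
		Pod:            "csi-node-driver-ggjvq",
		Endpoint:       "eth0",
		ContainerID:    "e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d",
		InterfaceName:  "cali6314993728b",
		MAC:            "8e:e4:1a:3d:d7:2a",
		IPNetworks:     []string{"192.168.88.133/32"},
		Profiles:       []string{"kns.calico-system", "ksa.calico-system.csi-node-driver"},
		ServiceAccount: "csi-node-driver",
	}
	out, _ := json.MarshalIndent(ep, "", "  ")
	fmt.Println(string(out))
}
```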
Mar 25 01:07:15.386182 systemd-resolved[1323]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:07:15.402535 containerd[1456]: time="2025-03-25T01:07:15.402268880Z" level=info msg="connecting to shim a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e" address="unix:///run/containerd/s/c7b739d5815a24b795cacf7e4181866468dc9c8ff8bb204b1882e4f28ebd14ba" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:07:15.424419 containerd[1456]: time="2025-03-25T01:07:15.424299248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ggjvq,Uid:0fe46c57-cacb-46aa-b01f-e805370dad30,Namespace:calico-system,Attempt:0,} returns sandbox id \"e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d\"" Mar 25 01:07:15.452419 systemd[1]: Started cri-containerd-a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e.scope - libcontainer container a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e. Mar 25 01:07:15.463744 systemd-resolved[1323]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:07:15.484971 containerd[1456]: time="2025-03-25T01:07:15.482540961Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5558c45bbd-zxfm7,Uid:8f6521f3-e126-490b-8750-3f1707506fb2,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e\"" Mar 25 01:07:16.261260 kubelet[2695]: I0325 01:07:16.260336 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-p2l2k" podStartSLOduration=32.260319296 podStartE2EDuration="32.260319296s" podCreationTimestamp="2025-03-25 01:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:07:16.259846537 +0000 UTC m=+48.274904221" watchObservedRunningTime="2025-03-25 01:07:16.260319296 +0000 UTC m=+48.275376980" Mar 25 01:07:16.295678 kubelet[2695]: I0325 01:07:16.294620 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-vl8nc" podStartSLOduration=32.294600726 podStartE2EDuration="32.294600726s" podCreationTimestamp="2025-03-25 01:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:07:16.294042087 +0000 UTC m=+48.309099771" watchObservedRunningTime="2025-03-25 01:07:16.294600726 +0000 UTC m=+48.309658410" Mar 25 01:07:16.425249 systemd-networkd[1390]: caliaef712da230: Gained IPv6LL Mar 25 01:07:16.426731 systemd-networkd[1390]: cali8d20dd4ef9a: Gained IPv6LL Mar 25 01:07:16.490045 systemd-networkd[1390]: calib0802aac4cd: Gained IPv6LL Mar 25 01:07:16.553750 systemd-networkd[1390]: calie20225c3011: Gained IPv6LL Mar 25 01:07:16.595518 containerd[1456]: time="2025-03-25T01:07:16.595474567Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:16.598150 containerd[1456]: time="2025-03-25T01:07:16.598055443Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 25 01:07:16.600334 containerd[1456]: time="2025-03-25T01:07:16.600292920Z" level=info msg="ImageCreate event 
name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:16.602736 containerd[1456]: time="2025-03-25T01:07:16.602705076Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:16.603243 containerd[1456]: time="2025-03-25T01:07:16.603223556Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 1.481389141s" Mar 25 01:07:16.603300 containerd[1456]: time="2025-03-25T01:07:16.603249716Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 25 01:07:16.604249 containerd[1456]: time="2025-03-25T01:07:16.604225954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:07:16.614313 containerd[1456]: time="2025-03-25T01:07:16.612999581Z" level=info msg="CreateContainer within sandbox \"4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:07:16.726671 kubelet[2695]: I0325 01:07:16.726632 2695 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:07:16.768405 containerd[1456]: time="2025-03-25T01:07:16.768365835Z" level=info msg="Container cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:16.785958 containerd[1456]: time="2025-03-25T01:07:16.785916609Z" level=info msg="CreateContainer within sandbox \"4b532ad188053ce206b3aabefbd6d80965a87aecffb97ace5ef657d507b61427\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0\"" Mar 25 01:07:16.786479 containerd[1456]: time="2025-03-25T01:07:16.786453488Z" level=info msg="StartContainer for \"cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0\"" Mar 25 01:07:16.787956 containerd[1456]: time="2025-03-25T01:07:16.787919166Z" level=info msg="connecting to shim cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0" address="unix:///run/containerd/s/3fd35187f6f3bb21092bab8c83ca3a7b6837209cca640b8934140e8db9b8755f" protocol=ttrpc version=3 Mar 25 01:07:16.819222 systemd[1]: Started cri-containerd-cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0.scope - libcontainer container cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0. 
Mar 25 01:07:16.820796 containerd[1456]: time="2025-03-25T01:07:16.820745758Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81\" id:\"491f41ad5d2580e905373b0deed266a31ad79491bd401a998a35bd224d2da967\" pid:4759 exited_at:{seconds:1742864836 nanos:820188479}" Mar 25 01:07:16.885853 containerd[1456]: time="2025-03-25T01:07:16.885057464Z" level=info msg="StartContainer for \"cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0\" returns successfully" Mar 25 01:07:16.909861 containerd[1456]: time="2025-03-25T01:07:16.909821588Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81\" id:\"95337fbbc99b1ec98b19d4010406f810394d1535d0762ebcdba31221631e0d08\" pid:4803 exited_at:{seconds:1742864836 nanos:909316749}" Mar 25 01:07:17.065268 systemd-networkd[1390]: cali7f7da876c6f: Gained IPv6LL Mar 25 01:07:17.129305 systemd-networkd[1390]: cali6314993728b: Gained IPv6LL Mar 25 01:07:17.271548 kubelet[2695]: I0325 01:07:17.270718 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7fcd8cd96b-w98xb" podStartSLOduration=25.787925728 podStartE2EDuration="27.270701067s" podCreationTimestamp="2025-03-25 01:06:50 +0000 UTC" firstStartedPulling="2025-03-25 01:07:15.121222336 +0000 UTC m=+47.136280020" lastFinishedPulling="2025-03-25 01:07:16.603997675 +0000 UTC m=+48.619055359" observedRunningTime="2025-03-25 01:07:17.269583868 +0000 UTC m=+49.284641552" watchObservedRunningTime="2025-03-25 01:07:17.270701067 +0000 UTC m=+49.285758751" Mar 25 01:07:17.765282 systemd[1]: Started sshd@13-10.0.0.8:22-10.0.0.1:55356.service - OpenSSH per-connection server daemon (10.0.0.1:55356). Mar 25 01:07:17.849653 sshd[4830]: Accepted publickey for core from 10.0.0.1 port 55356 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:17.851490 sshd-session[4830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:17.856172 systemd-logind[1439]: New session 14 of user core. Mar 25 01:07:17.861329 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:07:18.056208 sshd[4836]: Connection closed by 10.0.0.1 port 55356 Mar 25 01:07:18.056050 sshd-session[4830]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:18.061399 systemd[1]: sshd@13-10.0.0.8:22-10.0.0.1:55356.service: Deactivated successfully. Mar 25 01:07:18.061732 systemd-logind[1439]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:07:18.063636 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:07:18.065910 systemd-logind[1439]: Removed session 14. 
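Annotation: the "Gained IPv6LL" entries mean each cali* interface acquired its fe80::/10 IPv6 link-local address. A small stdlib sketch that enumerates link-local IPv6 addresses per interface; run on an arbitrary host it lists whatever interfaces exist there, the cali* names from the log are only expected on the node itself.

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	ifaces, err := net.Interfaces()
	if err != nil {
		panic(err)
	}
	for _, ifc := range ifaces {
		addrs, err := ifc.Addrs()
		if err != nil {
			continue
		}
		for _, a := range addrs {
			ipNet, ok := a.(*net.IPNet)
			if !ok {
				continue
			}
			ip := ipNet.IP
			// fe80::/10 addresses are what "Gained IPv6LL" refers to.
			if ip.To4() == nil && ip.IsLinkLocalUnicast() {
				fmt.Printf("%s has IPv6 link-local %s\n", ifc.Name, ip)
			}
		}
	}
}
```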
Mar 25 01:07:18.265954 containerd[1456]: time="2025-03-25T01:07:18.265844918Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:18.265954 containerd[1456]: time="2025-03-25T01:07:18.265913598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 25 01:07:18.268123 containerd[1456]: time="2025-03-25T01:07:18.268050275Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:18.269278 containerd[1456]: time="2025-03-25T01:07:18.269244953Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 1.664988399s" Mar 25 01:07:18.269278 containerd[1456]: time="2025-03-25T01:07:18.269278513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:07:18.270076 containerd[1456]: time="2025-03-25T01:07:18.269962752Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:18.273435 containerd[1456]: time="2025-03-25T01:07:18.273405427Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:07:18.275555 containerd[1456]: time="2025-03-25T01:07:18.275292025Z" level=info msg="CreateContainer within sandbox \"670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:07:18.284282 containerd[1456]: time="2025-03-25T01:07:18.283846172Z" level=info msg="Container ab31e7f9445889952eaf0440a2a23683f38e15611be832fd1aff5db2f09e9910: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:18.290990 containerd[1456]: time="2025-03-25T01:07:18.290940122Z" level=info msg="CreateContainer within sandbox \"670c5c9be460f49c9150f6dcd6e3ff293dcaf36b91eab09ec15c281ec6898221\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ab31e7f9445889952eaf0440a2a23683f38e15611be832fd1aff5db2f09e9910\"" Mar 25 01:07:18.292160 containerd[1456]: time="2025-03-25T01:07:18.291892001Z" level=info msg="StartContainer for \"ab31e7f9445889952eaf0440a2a23683f38e15611be832fd1aff5db2f09e9910\"" Mar 25 01:07:18.293295 containerd[1456]: time="2025-03-25T01:07:18.293250719Z" level=info msg="connecting to shim ab31e7f9445889952eaf0440a2a23683f38e15611be832fd1aff5db2f09e9910" address="unix:///run/containerd/s/46884131b9ea7714d3878aef35f4f434cf446811c5083e0252f6b0eb9a44424a" protocol=ttrpc version=3 Mar 25 01:07:18.303131 containerd[1456]: time="2025-03-25T01:07:18.303070865Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0\" id:\"81dd871410a02f7a1f7ee05ba42f878b6676a4618aac1929cac3d0e9c257193d\" pid:4867 exited_at:{seconds:1742864838 nanos:302334466}" Mar 25 01:07:18.319270 systemd[1]: Started 
cri-containerd-ab31e7f9445889952eaf0440a2a23683f38e15611be832fd1aff5db2f09e9910.scope - libcontainer container ab31e7f9445889952eaf0440a2a23683f38e15611be832fd1aff5db2f09e9910. Mar 25 01:07:18.357426 containerd[1456]: time="2025-03-25T01:07:18.357389828Z" level=info msg="StartContainer for \"ab31e7f9445889952eaf0440a2a23683f38e15611be832fd1aff5db2f09e9910\" returns successfully" Mar 25 01:07:19.360952 containerd[1456]: time="2025-03-25T01:07:19.360893446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:19.361925 containerd[1456]: time="2025-03-25T01:07:19.361853605Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 25 01:07:19.363839 containerd[1456]: time="2025-03-25T01:07:19.363808202Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:19.367550 containerd[1456]: time="2025-03-25T01:07:19.367161997Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:19.368018 containerd[1456]: time="2025-03-25T01:07:19.367990196Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.094551729s" Mar 25 01:07:19.368084 containerd[1456]: time="2025-03-25T01:07:19.368021636Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 25 01:07:19.369492 containerd[1456]: time="2025-03-25T01:07:19.369469434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:07:19.371974 containerd[1456]: time="2025-03-25T01:07:19.371765591Z" level=info msg="CreateContainer within sandbox \"e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:07:19.381288 containerd[1456]: time="2025-03-25T01:07:19.381227978Z" level=info msg="Container 2b06d5e88ad437a050a07eb22a957f5336e5fcf45273bf97f36aede84fcc5e78: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:19.389313 containerd[1456]: time="2025-03-25T01:07:19.389266046Z" level=info msg="CreateContainer within sandbox \"e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2b06d5e88ad437a050a07eb22a957f5336e5fcf45273bf97f36aede84fcc5e78\"" Mar 25 01:07:19.389826 containerd[1456]: time="2025-03-25T01:07:19.389802846Z" level=info msg="StartContainer for \"2b06d5e88ad437a050a07eb22a957f5336e5fcf45273bf97f36aede84fcc5e78\"" Mar 25 01:07:19.391328 containerd[1456]: time="2025-03-25T01:07:19.391302963Z" level=info msg="connecting to shim 2b06d5e88ad437a050a07eb22a957f5336e5fcf45273bf97f36aede84fcc5e78" address="unix:///run/containerd/s/a9181c83f20e4597ab27d9796ffaf61f104ac4d420559cfe29c30bf47aa0ba0a" protocol=ttrpc version=3 Mar 25 01:07:19.418297 systemd[1]: Started 
cri-containerd-2b06d5e88ad437a050a07eb22a957f5336e5fcf45273bf97f36aede84fcc5e78.scope - libcontainer container 2b06d5e88ad437a050a07eb22a957f5336e5fcf45273bf97f36aede84fcc5e78. Mar 25 01:07:19.472699 containerd[1456]: time="2025-03-25T01:07:19.472662449Z" level=info msg="StartContainer for \"2b06d5e88ad437a050a07eb22a957f5336e5fcf45273bf97f36aede84fcc5e78\" returns successfully" Mar 25 01:07:19.640329 containerd[1456]: time="2025-03-25T01:07:19.639165815Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:19.640329 containerd[1456]: time="2025-03-25T01:07:19.639398415Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 25 01:07:19.642248 containerd[1456]: time="2025-03-25T01:07:19.642180131Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 272.547977ms" Mar 25 01:07:19.642248 containerd[1456]: time="2025-03-25T01:07:19.642248251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:07:19.643793 containerd[1456]: time="2025-03-25T01:07:19.643040450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:07:19.644790 containerd[1456]: time="2025-03-25T01:07:19.644726207Z" level=info msg="CreateContainer within sandbox \"a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:07:19.654804 containerd[1456]: time="2025-03-25T01:07:19.654753113Z" level=info msg="Container 418dc12a7058525290f79ae97d906204768f02f53922504e8634aae0e414f29b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:19.665920 containerd[1456]: time="2025-03-25T01:07:19.665854137Z" level=info msg="CreateContainer within sandbox \"a399b1c996596e62cbca240337a04eb5b4989d921853bfa7a839c0da3c9ee26e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"418dc12a7058525290f79ae97d906204768f02f53922504e8634aae0e414f29b\"" Mar 25 01:07:19.667164 containerd[1456]: time="2025-03-25T01:07:19.667092896Z" level=info msg="StartContainer for \"418dc12a7058525290f79ae97d906204768f02f53922504e8634aae0e414f29b\"" Mar 25 01:07:19.668216 containerd[1456]: time="2025-03-25T01:07:19.668185734Z" level=info msg="connecting to shim 418dc12a7058525290f79ae97d906204768f02f53922504e8634aae0e414f29b" address="unix:///run/containerd/s/c7b739d5815a24b795cacf7e4181866468dc9c8ff8bb204b1882e4f28ebd14ba" protocol=ttrpc version=3 Mar 25 01:07:19.692629 systemd[1]: Started cri-containerd-418dc12a7058525290f79ae97d906204768f02f53922504e8634aae0e414f29b.scope - libcontainer container 418dc12a7058525290f79ae97d906204768f02f53922504e8634aae0e414f29b. 
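Annotation: each "Pulled image" entry above reports three identifiers for the same image: the repo tag (name:tag), the repo digest (name@sha256:…), and the local image id (the digest of the image's config object). The snippet below is a purely illustrative splitter for the tag and digest forms, using the calico/csi references from the log; real code would use containerd's/Docker's reference-parsing packages rather than raw string handling.

```go
package main

import (
	"fmt"
	"strings"
)

// splitRef separates "repo:tag" or "repo@sha256:..." into name and
// tag-or-digest. Illustrative only; not a full reference parser.
func splitRef(ref string) (name, rest string) {
	if i := strings.LastIndex(ref, "@"); i >= 0 {
		return ref[:i], ref[i+1:]
	}
	// The tag separator is the last ':' after the final '/', so a registry
	// port such as "host:5000/img" is not mistaken for a tag.
	slash := strings.LastIndex(ref, "/")
	if i := strings.LastIndex(ref, ":"); i > slash {
		return ref[:i], ref[i+1:]
	}
	return ref, ""
}

func main() {
	for _, ref := range []string{
		"ghcr.io/flatcar/calico/csi:v3.29.2",
		"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c",
	} {
		name, rest := splitRef(ref)
		fmt.Printf("name=%s  tag-or-digest=%s\n", name, rest)
	}
}
```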
Mar 25 01:07:19.736577 containerd[1456]: time="2025-03-25T01:07:19.736531998Z" level=info msg="StartContainer for \"418dc12a7058525290f79ae97d906204768f02f53922504e8634aae0e414f29b\" returns successfully" Mar 25 01:07:19.877504 kubelet[2695]: I0325 01:07:19.877440 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5558c45bbd-dzxf2" podStartSLOduration=27.782622423 podStartE2EDuration="30.87741524s" podCreationTimestamp="2025-03-25 01:06:49 +0000 UTC" firstStartedPulling="2025-03-25 01:07:15.178444371 +0000 UTC m=+47.193502015" lastFinishedPulling="2025-03-25 01:07:18.273237068 +0000 UTC m=+50.288294832" observedRunningTime="2025-03-25 01:07:19.28005348 +0000 UTC m=+51.295111204" watchObservedRunningTime="2025-03-25 01:07:19.87741524 +0000 UTC m=+51.892472884" Mar 25 01:07:20.980226 containerd[1456]: time="2025-03-25T01:07:20.980182865Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:20.981656 containerd[1456]: time="2025-03-25T01:07:20.981575783Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 25 01:07:20.982478 containerd[1456]: time="2025-03-25T01:07:20.982449542Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:20.985025 containerd[1456]: time="2025-03-25T01:07:20.984977058Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.341901048s" Mar 25 01:07:20.985025 containerd[1456]: time="2025-03-25T01:07:20.985020058Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 25 01:07:20.985305 containerd[1456]: time="2025-03-25T01:07:20.985270818Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:07:20.989279 containerd[1456]: time="2025-03-25T01:07:20.989203772Z" level=info msg="CreateContainer within sandbox \"e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:07:21.001131 containerd[1456]: time="2025-03-25T01:07:20.998226440Z" level=info msg="Container f94560d2a9d751907d4eb7efd21bd9da83aa5e7a8150f6966779e86178cfd50d: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:07:21.010684 containerd[1456]: time="2025-03-25T01:07:21.010630583Z" level=info msg="CreateContainer within sandbox \"e0c5c9341d506f2b04707f88794efb8a28315d8fb22cff2a71828ce2dcff0b5d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f94560d2a9d751907d4eb7efd21bd9da83aa5e7a8150f6966779e86178cfd50d\"" Mar 25 01:07:21.011145 containerd[1456]: time="2025-03-25T01:07:21.011102982Z" level=info msg="StartContainer for 
\"f94560d2a9d751907d4eb7efd21bd9da83aa5e7a8150f6966779e86178cfd50d\"" Mar 25 01:07:21.012703 containerd[1456]: time="2025-03-25T01:07:21.012652820Z" level=info msg="connecting to shim f94560d2a9d751907d4eb7efd21bd9da83aa5e7a8150f6966779e86178cfd50d" address="unix:///run/containerd/s/a9181c83f20e4597ab27d9796ffaf61f104ac4d420559cfe29c30bf47aa0ba0a" protocol=ttrpc version=3 Mar 25 01:07:21.043269 systemd[1]: Started cri-containerd-f94560d2a9d751907d4eb7efd21bd9da83aa5e7a8150f6966779e86178cfd50d.scope - libcontainer container f94560d2a9d751907d4eb7efd21bd9da83aa5e7a8150f6966779e86178cfd50d. Mar 25 01:07:21.089438 containerd[1456]: time="2025-03-25T01:07:21.089394074Z" level=info msg="StartContainer for \"f94560d2a9d751907d4eb7efd21bd9da83aa5e7a8150f6966779e86178cfd50d\" returns successfully" Mar 25 01:07:21.290443 kubelet[2695]: I0325 01:07:21.290382 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5558c45bbd-zxfm7" podStartSLOduration=28.133905584 podStartE2EDuration="32.290364758s" podCreationTimestamp="2025-03-25 01:06:49 +0000 UTC" firstStartedPulling="2025-03-25 01:07:15.486476836 +0000 UTC m=+47.501534480" lastFinishedPulling="2025-03-25 01:07:19.64293597 +0000 UTC m=+51.657993654" observedRunningTime="2025-03-25 01:07:20.28619519 +0000 UTC m=+52.301252874" watchObservedRunningTime="2025-03-25 01:07:21.290364758 +0000 UTC m=+53.305422402" Mar 25 01:07:21.290819 kubelet[2695]: I0325 01:07:21.290475 2695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-ggjvq" podStartSLOduration=25.732293104 podStartE2EDuration="31.290470598s" podCreationTimestamp="2025-03-25 01:06:50 +0000 UTC" firstStartedPulling="2025-03-25 01:07:15.428272682 +0000 UTC m=+47.443330366" lastFinishedPulling="2025-03-25 01:07:20.986450176 +0000 UTC m=+53.001507860" observedRunningTime="2025-03-25 01:07:21.290246318 +0000 UTC m=+53.305304002" watchObservedRunningTime="2025-03-25 01:07:21.290470598 +0000 UTC m=+53.305528282" Mar 25 01:07:22.155850 kubelet[2695]: I0325 01:07:22.155808 2695 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 01:07:22.162499 kubelet[2695]: I0325 01:07:22.162465 2695 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 01:07:23.073328 systemd[1]: Started sshd@14-10.0.0.8:22-10.0.0.1:58182.service - OpenSSH per-connection server daemon (10.0.0.1:58182). Mar 25 01:07:23.180007 sshd[5033]: Accepted publickey for core from 10.0.0.1 port 58182 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:23.181768 sshd-session[5033]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:23.188114 systemd-logind[1439]: New session 15 of user core. Mar 25 01:07:23.196330 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:07:23.370865 sshd[5035]: Connection closed by 10.0.0.1 port 58182 Mar 25 01:07:23.371366 sshd-session[5033]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:23.374974 systemd[1]: sshd@14-10.0.0.8:22-10.0.0.1:58182.service: Deactivated successfully. Mar 25 01:07:23.377520 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:07:23.378223 systemd-logind[1439]: Session 15 logged out. Waiting for processes to exit. 
Mar 25 01:07:23.379029 systemd-logind[1439]: Removed session 15. Mar 25 01:07:28.382512 systemd[1]: Started sshd@15-10.0.0.8:22-10.0.0.1:58190.service - OpenSSH per-connection server daemon (10.0.0.1:58190). Mar 25 01:07:28.438708 sshd[5058]: Accepted publickey for core from 10.0.0.1 port 58190 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:28.440254 sshd-session[5058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:28.443997 systemd-logind[1439]: New session 16 of user core. Mar 25 01:07:28.450257 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 25 01:07:28.628178 sshd[5060]: Connection closed by 10.0.0.1 port 58190 Mar 25 01:07:28.628668 sshd-session[5058]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:28.639063 systemd[1]: sshd@15-10.0.0.8:22-10.0.0.1:58190.service: Deactivated successfully. Mar 25 01:07:28.640905 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:07:28.641729 systemd-logind[1439]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:07:28.644171 systemd[1]: Started sshd@16-10.0.0.8:22-10.0.0.1:58194.service - OpenSSH per-connection server daemon (10.0.0.1:58194). Mar 25 01:07:28.645508 systemd-logind[1439]: Removed session 16. Mar 25 01:07:28.702317 sshd[5072]: Accepted publickey for core from 10.0.0.1 port 58194 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:28.704403 sshd-session[5072]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:28.709284 systemd-logind[1439]: New session 17 of user core. Mar 25 01:07:28.720256 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 25 01:07:28.922793 sshd[5075]: Connection closed by 10.0.0.1 port 58194 Mar 25 01:07:28.924061 sshd-session[5072]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:28.934347 systemd[1]: sshd@16-10.0.0.8:22-10.0.0.1:58194.service: Deactivated successfully. Mar 25 01:07:28.936583 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:07:28.937352 systemd-logind[1439]: Session 17 logged out. Waiting for processes to exit. Mar 25 01:07:28.939366 systemd[1]: Started sshd@17-10.0.0.8:22-10.0.0.1:58206.service - OpenSSH per-connection server daemon (10.0.0.1:58206). Mar 25 01:07:28.940301 systemd-logind[1439]: Removed session 17. Mar 25 01:07:29.009748 sshd[5086]: Accepted publickey for core from 10.0.0.1 port 58206 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:29.011461 sshd-session[5086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:29.015758 systemd-logind[1439]: New session 18 of user core. Mar 25 01:07:29.025284 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 25 01:07:30.528928 sshd[5089]: Connection closed by 10.0.0.1 port 58206 Mar 25 01:07:30.529876 sshd-session[5086]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:30.546377 systemd[1]: sshd@17-10.0.0.8:22-10.0.0.1:58206.service: Deactivated successfully. Mar 25 01:07:30.548728 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:07:30.550643 systemd[1]: session-18.scope: Consumed 519ms CPU time, 71.5M memory peak. Mar 25 01:07:30.552243 systemd-logind[1439]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:07:30.555734 systemd[1]: Started sshd@18-10.0.0.8:22-10.0.0.1:58214.service - OpenSSH per-connection server daemon (10.0.0.1:58214). 
Mar 25 01:07:30.560473 systemd-logind[1439]: Removed session 18. Mar 25 01:07:30.617946 sshd[5109]: Accepted publickey for core from 10.0.0.1 port 58214 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:30.619296 sshd-session[5109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:30.623991 systemd-logind[1439]: New session 19 of user core. Mar 25 01:07:30.638327 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 25 01:07:30.670024 containerd[1456]: time="2025-03-25T01:07:30.669984563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cac910be9b4024a31377df3745aae58d2350ff51c52a0de4424b40c6f60dade0\" id:\"8af7a0e25775dafd81036aa9c1a18662e1e66eb59ca3c2048bc867831fdb9593\" pid:5126 exited_at:{seconds:1742864850 nanos:669701684}" Mar 25 01:07:30.937373 sshd[5114]: Connection closed by 10.0.0.1 port 58214 Mar 25 01:07:30.937834 sshd-session[5109]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:30.951197 systemd[1]: sshd@18-10.0.0.8:22-10.0.0.1:58214.service: Deactivated successfully. Mar 25 01:07:30.952962 systemd[1]: session-19.scope: Deactivated successfully. Mar 25 01:07:30.953793 systemd-logind[1439]: Session 19 logged out. Waiting for processes to exit. Mar 25 01:07:30.959773 systemd[1]: Started sshd@19-10.0.0.8:22-10.0.0.1:58222.service - OpenSSH per-connection server daemon (10.0.0.1:58222). Mar 25 01:07:30.969513 systemd-logind[1439]: Removed session 19. Mar 25 01:07:31.015575 sshd[5146]: Accepted publickey for core from 10.0.0.1 port 58222 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:31.017091 sshd-session[5146]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:31.024666 systemd-logind[1439]: New session 20 of user core. Mar 25 01:07:31.034296 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 25 01:07:31.165780 sshd[5149]: Connection closed by 10.0.0.1 port 58222 Mar 25 01:07:31.167039 sshd-session[5146]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:31.169678 systemd[1]: sshd@19-10.0.0.8:22-10.0.0.1:58222.service: Deactivated successfully. Mar 25 01:07:31.171349 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:07:31.172715 systemd-logind[1439]: Session 20 logged out. Waiting for processes to exit. Mar 25 01:07:31.175841 systemd-logind[1439]: Removed session 20. Mar 25 01:07:36.184054 systemd[1]: Started sshd@20-10.0.0.8:22-10.0.0.1:37168.service - OpenSSH per-connection server daemon (10.0.0.1:37168). Mar 25 01:07:36.246094 sshd[5173]: Accepted publickey for core from 10.0.0.1 port 37168 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:36.247877 sshd-session[5173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:36.253369 systemd-logind[1439]: New session 21 of user core. Mar 25 01:07:36.257285 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 25 01:07:36.447366 sshd[5175]: Connection closed by 10.0.0.1 port 37168 Mar 25 01:07:36.448278 sshd-session[5173]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:36.452209 systemd[1]: session-21.scope: Deactivated successfully. Mar 25 01:07:36.455005 systemd-logind[1439]: Session 21 logged out. Waiting for processes to exit. Mar 25 01:07:36.457413 systemd[1]: sshd@20-10.0.0.8:22-10.0.0.1:37168.service: Deactivated successfully. Mar 25 01:07:36.461495 systemd-logind[1439]: Removed session 21. 
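Annotation: every "Accepted publickey for core … RSA SHA256:Ryyr…" line identifies the client key by its OpenSSH SHA256 fingerprint: base64 (unpadded) of the SHA-256 hash of the public key blob. The sketch below shows that format using golang.org/x/crypto/ssh and a throwaway generated key, since the log exposes only the fingerprint of the core user's key, never the key itself.

```go
package main

import (
	"crypto/ed25519"
	"crypto/rand"
	"fmt"

	"golang.org/x/crypto/ssh"
)

func main() {
	// Generate a throwaway key purely to demonstrate the fingerprint format.
	pub, _, err := ed25519.GenerateKey(rand.Reader)
	if err != nil {
		panic(err)
	}
	sshPub, err := ssh.NewPublicKey(pub)
	if err != nil {
		panic(err)
	}
	// Same "SHA256:<base64, no padding>" form as the sshd entries above.
	fmt.Println(ssh.FingerprintSHA256(sshPub))
}
```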
Mar 25 01:07:41.460718 systemd[1]: Started sshd@21-10.0.0.8:22-10.0.0.1:37178.service - OpenSSH per-connection server daemon (10.0.0.1:37178). Mar 25 01:07:41.512023 sshd[5188]: Accepted publickey for core from 10.0.0.1 port 37178 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:41.513296 sshd-session[5188]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:41.518693 systemd-logind[1439]: New session 22 of user core. Mar 25 01:07:41.532286 systemd[1]: Started session-22.scope - Session 22 of User core. Mar 25 01:07:41.666669 sshd[5190]: Connection closed by 10.0.0.1 port 37178 Mar 25 01:07:41.667063 sshd-session[5188]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:41.670482 systemd[1]: sshd@21-10.0.0.8:22-10.0.0.1:37178.service: Deactivated successfully. Mar 25 01:07:41.672354 systemd[1]: session-22.scope: Deactivated successfully. Mar 25 01:07:41.672976 systemd-logind[1439]: Session 22 logged out. Waiting for processes to exit. Mar 25 01:07:41.673778 systemd-logind[1439]: Removed session 22. Mar 25 01:07:46.682450 systemd[1]: Started sshd@22-10.0.0.8:22-10.0.0.1:36356.service - OpenSSH per-connection server daemon (10.0.0.1:36356). Mar 25 01:07:46.744232 sshd[5208]: Accepted publickey for core from 10.0.0.1 port 36356 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:07:46.745651 sshd-session[5208]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:07:46.756798 systemd-logind[1439]: New session 23 of user core. Mar 25 01:07:46.758729 systemd[1]: Started session-23.scope - Session 23 of User core. Mar 25 01:07:46.799223 containerd[1456]: time="2025-03-25T01:07:46.799185527Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4b212a9c005b741d0aef7bf8f4253aef2997374a667f22f8d3d61bf3c4998e81\" id:\"7ba44bd2b593f87332367a5ca964ada5225de8327bf8f124e083a9ab8b64b836\" pid:5222 exited_at:{seconds:1742864866 nanos:798887047}" Mar 25 01:07:46.898654 sshd[5228]: Connection closed by 10.0.0.1 port 36356 Mar 25 01:07:46.898998 sshd-session[5208]: pam_unix(sshd:session): session closed for user core Mar 25 01:07:46.901856 systemd[1]: sshd@22-10.0.0.8:22-10.0.0.1:36356.service: Deactivated successfully. Mar 25 01:07:46.903480 systemd[1]: session-23.scope: Deactivated successfully. Mar 25 01:07:46.904700 systemd-logind[1439]: Session 23 logged out. Waiting for processes to exit. Mar 25 01:07:46.905573 systemd-logind[1439]: Removed session 23.
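Annotation: the TaskExit entries carry exited_at as protobuf-style seconds/nanos pairs. Converting the pair from the last entry above back to wall-clock time with time.Unix lines up with the surrounding 01:07:46 log timestamp, a quick sanity check:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// exited_at:{seconds:1742864866 nanos:798887047} from the TaskExit entry above.
	t := time.Unix(1742864866, 798887047).UTC()
	fmt.Println(t) // 2025-03-25 01:07:46.798887047 +0000 UTC
}
```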