Mar 25 01:15:27.883406 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Mar 25 01:15:27.883428 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Mon Mar 24 23:39:14 -00 2025 Mar 25 01:15:27.883438 kernel: KASLR enabled Mar 25 01:15:27.883444 kernel: efi: EFI v2.7 by EDK II Mar 25 01:15:27.883449 kernel: efi: SMBIOS 3.0=0xdced0000 MEMATTR=0xdbbae018 ACPI 2.0=0xd9b43018 RNG=0xd9b43a18 MEMRESERVE=0xd9b40218 Mar 25 01:15:27.883455 kernel: random: crng init done Mar 25 01:15:27.883473 kernel: secureboot: Secure boot disabled Mar 25 01:15:27.883479 kernel: ACPI: Early table checksum verification disabled Mar 25 01:15:27.883484 kernel: ACPI: RSDP 0x00000000D9B43018 000024 (v02 BOCHS ) Mar 25 01:15:27.883493 kernel: ACPI: XSDT 0x00000000D9B43F18 000064 (v01 BOCHS BXPC 00000001 01000013) Mar 25 01:15:27.883499 kernel: ACPI: FACP 0x00000000D9B43B18 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883504 kernel: ACPI: DSDT 0x00000000D9B41018 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883510 kernel: ACPI: APIC 0x00000000D9B43C98 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883516 kernel: ACPI: PPTT 0x00000000D9B43098 00009C (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883523 kernel: ACPI: GTDT 0x00000000D9B43818 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883531 kernel: ACPI: MCFG 0x00000000D9B43A98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883538 kernel: ACPI: SPCR 0x00000000D9B43918 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883544 kernel: ACPI: DBG2 0x00000000D9B43998 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883551 kernel: ACPI: IORT 0x00000000D9B43198 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 25 01:15:27.883556 kernel: ACPI: SPCR: console: pl011,mmio,0x9000000,9600 Mar 25 01:15:27.883562 kernel: NUMA: Failed to initialise from firmware Mar 25 01:15:27.883568 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000000dcffffff] Mar 25 01:15:27.883574 kernel: NUMA: NODE_DATA [mem 0xdc958800-0xdc95dfff] Mar 25 01:15:27.883580 kernel: Zone ranges: Mar 25 01:15:27.883586 kernel: DMA [mem 0x0000000040000000-0x00000000dcffffff] Mar 25 01:15:27.883594 kernel: DMA32 empty Mar 25 01:15:27.883599 kernel: Normal empty Mar 25 01:15:27.883605 kernel: Movable zone start for each node Mar 25 01:15:27.883611 kernel: Early memory node ranges Mar 25 01:15:27.883617 kernel: node 0: [mem 0x0000000040000000-0x00000000d967ffff] Mar 25 01:15:27.883623 kernel: node 0: [mem 0x00000000d9680000-0x00000000d968ffff] Mar 25 01:15:27.883629 kernel: node 0: [mem 0x00000000d9690000-0x00000000d976ffff] Mar 25 01:15:27.883635 kernel: node 0: [mem 0x00000000d9770000-0x00000000d9b3ffff] Mar 25 01:15:27.883640 kernel: node 0: [mem 0x00000000d9b40000-0x00000000dce1ffff] Mar 25 01:15:27.883646 kernel: node 0: [mem 0x00000000dce20000-0x00000000dceaffff] Mar 25 01:15:27.883652 kernel: node 0: [mem 0x00000000dceb0000-0x00000000dcebffff] Mar 25 01:15:27.883658 kernel: node 0: [mem 0x00000000dcec0000-0x00000000dcfdffff] Mar 25 01:15:27.883665 kernel: node 0: [mem 0x00000000dcfe0000-0x00000000dcffffff] Mar 25 01:15:27.883671 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000000dcffffff] Mar 25 01:15:27.883677 kernel: On node 0, zone DMA: 12288 pages in unavailable ranges Mar 25 01:15:27.883686 kernel: psci: 
probing for conduit method from ACPI. Mar 25 01:15:27.883692 kernel: psci: PSCIv1.1 detected in firmware. Mar 25 01:15:27.883699 kernel: psci: Using standard PSCI v0.2 function IDs Mar 25 01:15:27.883706 kernel: psci: Trusted OS migration not required Mar 25 01:15:27.883712 kernel: psci: SMC Calling Convention v1.1 Mar 25 01:15:27.883719 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Mar 25 01:15:27.883725 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976 Mar 25 01:15:27.883732 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096 Mar 25 01:15:27.883739 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3 Mar 25 01:15:27.883745 kernel: Detected PIPT I-cache on CPU0 Mar 25 01:15:27.883751 kernel: CPU features: detected: GIC system register CPU interface Mar 25 01:15:27.883757 kernel: CPU features: detected: Hardware dirty bit management Mar 25 01:15:27.883764 kernel: CPU features: detected: Spectre-v4 Mar 25 01:15:27.883771 kernel: CPU features: detected: Spectre-BHB Mar 25 01:15:27.883778 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 25 01:15:27.883785 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 25 01:15:27.883791 kernel: CPU features: detected: ARM erratum 1418040 Mar 25 01:15:27.883798 kernel: CPU features: detected: SSBS not fully self-synchronizing Mar 25 01:15:27.883804 kernel: alternatives: applying boot alternatives Mar 25 01:15:27.883811 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:15:27.883818 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 25 01:15:27.883825 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 25 01:15:27.883831 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 25 01:15:27.883838 kernel: Fallback order for Node 0: 0 Mar 25 01:15:27.883845 kernel: Built 1 zonelists, mobility grouping on. Total pages: 633024 Mar 25 01:15:27.883852 kernel: Policy zone: DMA Mar 25 01:15:27.883868 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 25 01:15:27.883875 kernel: software IO TLB: area num 4. Mar 25 01:15:27.883881 kernel: software IO TLB: mapped [mem 0x00000000d2e00000-0x00000000d6e00000] (64MB) Mar 25 01:15:27.883888 kernel: Memory: 2387412K/2572288K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38464K init, 897K bss, 184876K reserved, 0K cma-reserved) Mar 25 01:15:27.883895 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Mar 25 01:15:27.883901 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 25 01:15:27.883908 kernel: rcu: RCU event tracing is enabled. Mar 25 01:15:27.883915 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Mar 25 01:15:27.883921 kernel: Trampoline variant of Tasks RCU enabled. Mar 25 01:15:27.883928 kernel: Tracing variant of Tasks RCU enabled. Mar 25 01:15:27.883936 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 25 01:15:27.883942 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Mar 25 01:15:27.883948 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 25 01:15:27.883955 kernel: GICv3: 256 SPIs implemented Mar 25 01:15:27.883961 kernel: GICv3: 0 Extended SPIs implemented Mar 25 01:15:27.883967 kernel: Root IRQ handler: gic_handle_irq Mar 25 01:15:27.883973 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Mar 25 01:15:27.883980 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Mar 25 01:15:27.883986 kernel: ITS [mem 0x08080000-0x0809ffff] Mar 25 01:15:27.883993 kernel: ITS@0x0000000008080000: allocated 8192 Devices @400c0000 (indirect, esz 8, psz 64K, shr 1) Mar 25 01:15:27.883999 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @400d0000 (flat, esz 8, psz 64K, shr 1) Mar 25 01:15:27.884007 kernel: GICv3: using LPI property table @0x00000000400f0000 Mar 25 01:15:27.884013 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000040100000 Mar 25 01:15:27.884020 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Mar 25 01:15:27.884026 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:15:27.884033 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Mar 25 01:15:27.884039 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Mar 25 01:15:27.884046 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Mar 25 01:15:27.884052 kernel: arm-pv: using stolen time PV Mar 25 01:15:27.884059 kernel: Console: colour dummy device 80x25 Mar 25 01:15:27.884065 kernel: ACPI: Core revision 20230628 Mar 25 01:15:27.884072 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Mar 25 01:15:27.884079 kernel: pid_max: default: 32768 minimum: 301 Mar 25 01:15:27.884086 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Mar 25 01:15:27.884092 kernel: landlock: Up and running. Mar 25 01:15:27.884099 kernel: SELinux: Initializing. Mar 25 01:15:27.884105 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:15:27.884112 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 25 01:15:27.884118 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 25 01:15:27.884125 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. Mar 25 01:15:27.884131 kernel: rcu: Hierarchical SRCU implementation. Mar 25 01:15:27.884139 kernel: rcu: Max phase no-delay instances is 400. Mar 25 01:15:27.884146 kernel: Platform MSI: ITS@0x8080000 domain created Mar 25 01:15:27.884153 kernel: PCI/MSI: ITS@0x8080000 domain created Mar 25 01:15:27.884159 kernel: Remapping and enabling EFI services. Mar 25 01:15:27.884166 kernel: smp: Bringing up secondary CPUs ... 
Mar 25 01:15:27.884179 kernel: Detected PIPT I-cache on CPU1 Mar 25 01:15:27.884186 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Mar 25 01:15:27.884192 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000040110000 Mar 25 01:15:27.884199 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:15:27.884206 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Mar 25 01:15:27.884213 kernel: Detected PIPT I-cache on CPU2 Mar 25 01:15:27.884225 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000 Mar 25 01:15:27.884234 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000040120000 Mar 25 01:15:27.884241 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:15:27.884248 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1] Mar 25 01:15:27.884255 kernel: Detected PIPT I-cache on CPU3 Mar 25 01:15:27.884261 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000 Mar 25 01:15:27.884269 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000040130000 Mar 25 01:15:27.884285 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Mar 25 01:15:27.884292 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1] Mar 25 01:15:27.884299 kernel: smp: Brought up 1 node, 4 CPUs Mar 25 01:15:27.884308 kernel: SMP: Total of 4 processors activated. Mar 25 01:15:27.884318 kernel: CPU features: detected: 32-bit EL0 Support Mar 25 01:15:27.884327 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Mar 25 01:15:27.884334 kernel: CPU features: detected: Common not Private translations Mar 25 01:15:27.884341 kernel: CPU features: detected: CRC32 instructions Mar 25 01:15:27.884350 kernel: CPU features: detected: Enhanced Virtualization Traps Mar 25 01:15:27.884357 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Mar 25 01:15:27.884364 kernel: CPU features: detected: LSE atomic instructions Mar 25 01:15:27.884371 kernel: CPU features: detected: Privileged Access Never Mar 25 01:15:27.884381 kernel: CPU features: detected: RAS Extension Support Mar 25 01:15:27.884388 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Mar 25 01:15:27.884395 kernel: CPU: All CPU(s) started at EL1 Mar 25 01:15:27.884403 kernel: alternatives: applying system-wide alternatives Mar 25 01:15:27.884410 kernel: devtmpfs: initialized Mar 25 01:15:27.884421 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 25 01:15:27.884431 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Mar 25 01:15:27.884441 kernel: pinctrl core: initialized pinctrl subsystem Mar 25 01:15:27.884448 kernel: SMBIOS 3.0.0 present. 
Mar 25 01:15:27.884455 kernel: DMI: QEMU KVM Virtual Machine, BIOS unknown 02/02/2022 Mar 25 01:15:27.884462 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 25 01:15:27.884469 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 25 01:15:27.884476 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 25 01:15:27.884483 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 25 01:15:27.884491 kernel: audit: initializing netlink subsys (disabled) Mar 25 01:15:27.884498 kernel: audit: type=2000 audit(0.018:1): state=initialized audit_enabled=0 res=1 Mar 25 01:15:27.884505 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 25 01:15:27.884512 kernel: cpuidle: using governor menu Mar 25 01:15:27.884519 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 25 01:15:27.884526 kernel: ASID allocator initialised with 32768 entries Mar 25 01:15:27.884532 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 25 01:15:27.884539 kernel: Serial: AMBA PL011 UART driver Mar 25 01:15:27.884546 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Mar 25 01:15:27.884553 kernel: Modules: 0 pages in range for non-PLT usage Mar 25 01:15:27.884562 kernel: Modules: 509248 pages in range for PLT usage Mar 25 01:15:27.884569 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Mar 25 01:15:27.884576 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Mar 25 01:15:27.884582 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Mar 25 01:15:27.884589 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Mar 25 01:15:27.884596 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Mar 25 01:15:27.884603 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Mar 25 01:15:27.884610 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Mar 25 01:15:27.884617 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Mar 25 01:15:27.884625 kernel: ACPI: Added _OSI(Module Device) Mar 25 01:15:27.884632 kernel: ACPI: Added _OSI(Processor Device) Mar 25 01:15:27.884639 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 25 01:15:27.884646 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 25 01:15:27.884653 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 25 01:15:27.884660 kernel: ACPI: Interpreter enabled Mar 25 01:15:27.884666 kernel: ACPI: Using GIC for interrupt routing Mar 25 01:15:27.884673 kernel: ACPI: MCFG table detected, 1 entries Mar 25 01:15:27.884680 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Mar 25 01:15:27.884689 kernel: printk: console [ttyAMA0] enabled Mar 25 01:15:27.884696 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Mar 25 01:15:27.884829 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 25 01:15:27.884926 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Mar 25 01:15:27.885004 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 25 01:15:27.885078 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Mar 25 01:15:27.885148 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Mar 25 01:15:27.885160 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Mar 25 01:15:27.885168 
kernel: PCI host bridge to bus 0000:00 Mar 25 01:15:27.885246 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Mar 25 01:15:27.885323 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 25 01:15:27.885390 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Mar 25 01:15:27.885455 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Mar 25 01:15:27.885544 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 Mar 25 01:15:27.885627 kernel: pci 0000:00:01.0: [1af4:1005] type 00 class 0x00ff00 Mar 25 01:15:27.885698 kernel: pci 0000:00:01.0: reg 0x10: [io 0x0000-0x001f] Mar 25 01:15:27.885769 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x10000000-0x10000fff] Mar 25 01:15:27.885839 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref] Mar 25 01:15:27.885923 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref] Mar 25 01:15:27.885995 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x10000000-0x10000fff] Mar 25 01:15:27.886071 kernel: pci 0000:00:01.0: BAR 0: assigned [io 0x1000-0x101f] Mar 25 01:15:27.886136 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Mar 25 01:15:27.886199 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 25 01:15:27.886261 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Mar 25 01:15:27.886270 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 25 01:15:27.886285 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 25 01:15:27.886293 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 25 01:15:27.886300 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 25 01:15:27.886310 kernel: iommu: Default domain type: Translated Mar 25 01:15:27.886318 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 25 01:15:27.886325 kernel: efivars: Registered efivars operations Mar 25 01:15:27.886331 kernel: vgaarb: loaded Mar 25 01:15:27.886338 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 25 01:15:27.886345 kernel: VFS: Disk quotas dquot_6.6.0 Mar 25 01:15:27.886352 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 25 01:15:27.886359 kernel: pnp: PnP ACPI init Mar 25 01:15:27.886444 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Mar 25 01:15:27.886456 kernel: pnp: PnP ACPI: found 1 devices Mar 25 01:15:27.886463 kernel: NET: Registered PF_INET protocol family Mar 25 01:15:27.886470 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 25 01:15:27.886478 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 25 01:15:27.886485 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 25 01:15:27.886492 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 25 01:15:27.886499 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Mar 25 01:15:27.886506 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 25 01:15:27.886515 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:15:27.886522 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 25 01:15:27.886529 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 25 01:15:27.886536 kernel: PCI: CLS 0 bytes, default 64 Mar 25 01:15:27.886543 kernel: kvm [1]: HYP mode not available 
Mar 25 01:15:27.886550 kernel: Initialise system trusted keyrings Mar 25 01:15:27.886557 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 25 01:15:27.886564 kernel: Key type asymmetric registered Mar 25 01:15:27.886571 kernel: Asymmetric key parser 'x509' registered Mar 25 01:15:27.886578 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Mar 25 01:15:27.886586 kernel: io scheduler mq-deadline registered Mar 25 01:15:27.886593 kernel: io scheduler kyber registered Mar 25 01:15:27.886600 kernel: io scheduler bfq registered Mar 25 01:15:27.886607 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 25 01:15:27.886614 kernel: ACPI: button: Power Button [PWRB] Mar 25 01:15:27.886621 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 25 01:15:27.886694 kernel: virtio-pci 0000:00:01.0: enabling device (0005 -> 0007) Mar 25 01:15:27.886703 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 01:15:27.886710 kernel: thunder_xcv, ver 1.0 Mar 25 01:15:27.886719 kernel: thunder_bgx, ver 1.0 Mar 25 01:15:27.886726 kernel: nicpf, ver 1.0 Mar 25 01:15:27.886733 kernel: nicvf, ver 1.0 Mar 25 01:15:27.886814 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 25 01:15:27.886895 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-25T01:15:27 UTC (1742865327) Mar 25 01:15:27.886905 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 25 01:15:27.886912 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Mar 25 01:15:27.886920 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 25 01:15:27.886929 kernel: watchdog: Hard watchdog permanently disabled Mar 25 01:15:27.886936 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:15:27.886943 kernel: Segment Routing with IPv6 Mar 25 01:15:27.886949 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:15:27.886956 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:15:27.886963 kernel: Key type dns_resolver registered Mar 25 01:15:27.886970 kernel: registered taskstats version 1 Mar 25 01:15:27.886977 kernel: Loading compiled-in X.509 certificates Mar 25 01:15:27.886985 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: ed4ababe871f0afac8b4236504477de11a6baf07' Mar 25 01:15:27.886993 kernel: Key type .fscrypt registered Mar 25 01:15:27.887000 kernel: Key type fscrypt-provisioning registered Mar 25 01:15:27.887007 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 25 01:15:27.887014 kernel: ima: Allocated hash algorithm: sha1 Mar 25 01:15:27.887021 kernel: ima: No architecture policies found Mar 25 01:15:27.887028 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 25 01:15:27.887035 kernel: clk: Disabling unused clocks Mar 25 01:15:27.887042 kernel: Freeing unused kernel memory: 38464K Mar 25 01:15:27.887051 kernel: Run /init as init process Mar 25 01:15:27.887058 kernel: with arguments: Mar 25 01:15:27.887065 kernel: /init Mar 25 01:15:27.887071 kernel: with environment: Mar 25 01:15:27.887078 kernel: HOME=/ Mar 25 01:15:27.887085 kernel: TERM=linux Mar 25 01:15:27.887092 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 01:15:27.887100 systemd[1]: Successfully made /usr/ read-only. 
Mar 25 01:15:27.887110 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:15:27.887119 systemd[1]: Detected virtualization kvm. Mar 25 01:15:27.887127 systemd[1]: Detected architecture arm64. Mar 25 01:15:27.887134 systemd[1]: Running in initrd. Mar 25 01:15:27.887141 systemd[1]: No hostname configured, using default hostname. Mar 25 01:15:27.887149 systemd[1]: Hostname set to . Mar 25 01:15:27.887157 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:15:27.887164 systemd[1]: Queued start job for default target initrd.target. Mar 25 01:15:27.887174 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:15:27.887182 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:15:27.887190 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 01:15:27.887198 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:15:27.887206 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 01:15:27.887214 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 01:15:27.887223 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 01:15:27.887232 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 01:15:27.887240 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:15:27.887247 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:15:27.887255 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:15:27.887263 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:15:27.887271 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:15:27.887286 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:15:27.887294 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:15:27.887301 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:15:27.887311 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 01:15:27.887319 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 01:15:27.887326 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:15:27.887334 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:15:27.887342 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:15:27.887350 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:15:27.887358 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 01:15:27.887366 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:15:27.887375 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 01:15:27.887382 systemd[1]: Starting systemd-fsck-usr.service... 
Mar 25 01:15:27.887390 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:15:27.887401 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:15:27.887414 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:15:27.887422 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 01:15:27.887430 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:15:27.887440 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 01:15:27.887448 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:15:27.887471 systemd-journald[235]: Collecting audit messages is disabled. Mar 25 01:15:27.887492 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:27.887500 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:15:27.887509 systemd-journald[235]: Journal started Mar 25 01:15:27.887527 systemd-journald[235]: Runtime Journal (/run/log/journal/25cffc67fd9740b7a234a321fcf09510) is 5.9M, max 47.3M, 41.4M free. Mar 25 01:15:27.869429 systemd-modules-load[237]: Inserted module 'overlay' Mar 25 01:15:27.890100 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 01:15:27.892883 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:15:27.893374 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:15:27.896333 kernel: Bridge firewalling registered Mar 25 01:15:27.894974 systemd-modules-load[237]: Inserted module 'br_netfilter' Mar 25 01:15:27.895871 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:15:27.899497 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:15:27.901817 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:15:27.914203 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:15:27.915478 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:15:27.920553 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 25 01:15:27.924109 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:15:27.925327 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:15:27.927299 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:15:27.931749 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:15:27.934813 dracut-cmdline[273]: dracut-dracut-053 Mar 25 01:15:27.935895 dracut-cmdline[273]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected acpi=force verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:15:27.973343 systemd-resolved[286]: Positive Trust Anchors: Mar 25 01:15:27.973360 systemd-resolved[286]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:15:27.973390 systemd-resolved[286]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:15:27.979260 systemd-resolved[286]: Defaulting to hostname 'linux'. Mar 25 01:15:27.980260 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:15:27.982754 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:15:28.002882 kernel: SCSI subsystem initialized Mar 25 01:15:28.007878 kernel: Loading iSCSI transport class v2.0-870. Mar 25 01:15:28.014879 kernel: iscsi: registered transport (tcp) Mar 25 01:15:28.029910 kernel: iscsi: registered transport (qla4xxx) Mar 25 01:15:28.029963 kernel: QLogic iSCSI HBA Driver Mar 25 01:15:28.071113 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 25 01:15:28.073417 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 01:15:28.098212 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 01:15:28.098251 kernel: device-mapper: uevent: version 1.0.3 Mar 25 01:15:28.100886 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 01:15:28.146893 kernel: raid6: neonx8 gen() 15788 MB/s Mar 25 01:15:28.163881 kernel: raid6: neonx4 gen() 15795 MB/s Mar 25 01:15:28.180892 kernel: raid6: neonx2 gen() 13319 MB/s Mar 25 01:15:28.197891 kernel: raid6: neonx1 gen() 10485 MB/s Mar 25 01:15:28.214891 kernel: raid6: int64x8 gen() 6788 MB/s Mar 25 01:15:28.231891 kernel: raid6: int64x4 gen() 7341 MB/s Mar 25 01:15:28.248880 kernel: raid6: int64x2 gen() 6108 MB/s Mar 25 01:15:28.265982 kernel: raid6: int64x1 gen() 5050 MB/s Mar 25 01:15:28.265997 kernel: raid6: using algorithm neonx4 gen() 15795 MB/s Mar 25 01:15:28.283934 kernel: raid6: .... xor() 12452 MB/s, rmw enabled Mar 25 01:15:28.283961 kernel: raid6: using neon recovery algorithm Mar 25 01:15:28.289216 kernel: xor: measuring software checksum speed Mar 25 01:15:28.289230 kernel: 8regs : 21636 MB/sec Mar 25 01:15:28.289905 kernel: 32regs : 20865 MB/sec Mar 25 01:15:28.291123 kernel: arm64_neon : 27766 MB/sec Mar 25 01:15:28.291135 kernel: xor: using function: arm64_neon (27766 MB/sec) Mar 25 01:15:28.341884 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 01:15:28.352318 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:15:28.354748 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:15:28.381306 systemd-udevd[462]: Using default interface naming scheme 'v255'. Mar 25 01:15:28.384988 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:15:28.387706 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
Mar 25 01:15:28.406731 dracut-pre-trigger[469]: rd.md=0: removing MD RAID activation Mar 25 01:15:28.430977 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:15:28.433051 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:15:28.481705 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:15:28.490007 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 01:15:28.507817 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 25 01:15:28.509329 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:15:28.510938 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:15:28.513468 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:15:28.516212 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 25 01:15:28.537232 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:15:28.544677 kernel: virtio_blk virtio1: 1/0/0 default/read/poll queues Mar 25 01:15:28.551012 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Mar 25 01:15:28.551118 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 25 01:15:28.551134 kernel: GPT:9289727 != 19775487 Mar 25 01:15:28.551144 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 25 01:15:28.551154 kernel: GPT:9289727 != 19775487 Mar 25 01:15:28.551162 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 25 01:15:28.551171 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:15:28.545697 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:15:28.545791 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:15:28.549897 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:15:28.550903 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:15:28.551025 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:28.552987 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:15:28.554681 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:15:28.573937 kernel: BTRFS: device fsid bf348154-9cb1-474d-801c-0e035a5758cf devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (509) Mar 25 01:15:28.576938 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by (udev-worker) (512) Mar 25 01:15:28.579889 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:28.591975 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Mar 25 01:15:28.599324 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Mar 25 01:15:28.605508 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. Mar 25 01:15:28.606689 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. Mar 25 01:15:28.615205 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:15:28.617038 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Mar 25 01:15:28.618938 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:15:28.634373 disk-uuid[553]: Primary Header is updated. Mar 25 01:15:28.634373 disk-uuid[553]: Secondary Entries is updated. Mar 25 01:15:28.634373 disk-uuid[553]: Secondary Header is updated. Mar 25 01:15:28.641874 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:15:28.647747 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:15:29.649892 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 25 01:15:29.650179 disk-uuid[556]: The operation has completed successfully. Mar 25 01:15:29.674136 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 25 01:15:29.674249 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 25 01:15:29.698610 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 25 01:15:29.710691 sh[573]: Success Mar 25 01:15:29.726880 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 25 01:15:29.755113 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 25 01:15:29.757812 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 25 01:15:29.771808 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Mar 25 01:15:29.779161 kernel: BTRFS info (device dm-0): first mount of filesystem bf348154-9cb1-474d-801c-0e035a5758cf Mar 25 01:15:29.779193 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:29.779204 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 25 01:15:29.780948 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 25 01:15:29.780985 kernel: BTRFS info (device dm-0): using free space tree Mar 25 01:15:29.784844 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 25 01:15:29.786177 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 25 01:15:29.786893 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 25 01:15:29.789610 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 25 01:15:29.815905 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:29.815939 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:29.815949 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:15:29.818890 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:15:29.822878 kernel: BTRFS info (device vda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:29.825882 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 25 01:15:29.827975 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 25 01:15:29.884200 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:15:29.888625 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:15:29.933981 systemd-networkd[757]: lo: Link UP Mar 25 01:15:29.933994 systemd-networkd[757]: lo: Gained carrier Mar 25 01:15:29.934876 systemd-networkd[757]: Enumeration completed Mar 25 01:15:29.934981 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Mar 25 01:15:29.935537 systemd-networkd[757]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:15:29.937770 ignition[666]: Ignition 2.20.0 Mar 25 01:15:29.935540 systemd-networkd[757]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:15:29.937777 ignition[666]: Stage: fetch-offline Mar 25 01:15:29.936219 systemd-networkd[757]: eth0: Link UP Mar 25 01:15:29.937805 ignition[666]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:29.936222 systemd-networkd[757]: eth0: Gained carrier Mar 25 01:15:29.937813 ignition[666]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:15:29.936227 systemd-networkd[757]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:15:29.937972 ignition[666]: parsed url from cmdline: "" Mar 25 01:15:29.936918 systemd[1]: Reached target network.target - Network. Mar 25 01:15:29.937975 ignition[666]: no config URL provided Mar 25 01:15:29.937982 ignition[666]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:15:29.937989 ignition[666]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:15:29.938011 ignition[666]: op(1): [started] loading QEMU firmware config module Mar 25 01:15:29.938015 ignition[666]: op(1): executing: "modprobe" "qemu_fw_cfg" Mar 25 01:15:29.945252 ignition[666]: op(1): [finished] loading QEMU firmware config module Mar 25 01:15:29.957918 systemd-networkd[757]: eth0: DHCPv4 address 10.0.0.53/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 25 01:15:29.990346 ignition[666]: parsing config with SHA512: b82ad6c2c4882593539949aefa3d6f4a73dedb0ae2fb52761e5d336e057126896b48b777d85184b1da387cd6692d684468a4854b18824f4b153c73895eb44ff1 Mar 25 01:15:29.996155 unknown[666]: fetched base config from "system" Mar 25 01:15:29.996165 unknown[666]: fetched user config from "qemu" Mar 25 01:15:29.996593 ignition[666]: fetch-offline: fetch-offline passed Mar 25 01:15:29.996751 systemd-resolved[286]: Detected conflict on linux IN A 10.0.0.53 Mar 25 01:15:29.996664 ignition[666]: Ignition finished successfully Mar 25 01:15:29.996758 systemd-resolved[286]: Hostname conflict, changing published hostname from 'linux' to 'linux9'. Mar 25 01:15:29.998502 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:15:29.999889 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 25 01:15:30.000595 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 25 01:15:30.025974 ignition[771]: Ignition 2.20.0 Mar 25 01:15:30.025984 ignition[771]: Stage: kargs Mar 25 01:15:30.026133 ignition[771]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:30.026142 ignition[771]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:15:30.026981 ignition[771]: kargs: kargs passed Mar 25 01:15:30.029579 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 25 01:15:30.027022 ignition[771]: Ignition finished successfully Mar 25 01:15:30.031476 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 25 01:15:30.051690 ignition[780]: Ignition 2.20.0 Mar 25 01:15:30.051699 ignition[780]: Stage: disks Mar 25 01:15:30.051874 ignition[780]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:30.054196 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Mar 25 01:15:30.051884 ignition[780]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:15:30.055749 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Mar 25 01:15:30.052688 ignition[780]: disks: disks passed Mar 25 01:15:30.057428 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 25 01:15:30.052729 ignition[780]: Ignition finished successfully Mar 25 01:15:30.059461 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:15:30.061265 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:15:30.062698 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:15:30.065321 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 25 01:15:30.093098 systemd-fsck[790]: ROOT: clean, 14/553520 files, 52654/553472 blocks Mar 25 01:15:30.096728 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 25 01:15:30.099356 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 25 01:15:30.146873 kernel: EXT4-fs (vda9): mounted filesystem a7a89271-ee7d-4bda-a834-705261d6cda9 r/w with ordered data mode. Quota mode: none. Mar 25 01:15:30.147251 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 25 01:15:30.148487 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 25 01:15:30.150738 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:15:30.152325 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 25 01:15:30.153337 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Mar 25 01:15:30.153373 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 25 01:15:30.153395 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:15:30.166116 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 25 01:15:30.168490 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Mar 25 01:15:30.171610 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (798) Mar 25 01:15:30.173972 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:30.174005 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:30.174022 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:15:30.174032 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:15:30.176468 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:15:30.212411 initrd-setup-root[822]: cut: /sysroot/etc/passwd: No such file or directory Mar 25 01:15:30.216434 initrd-setup-root[829]: cut: /sysroot/etc/group: No such file or directory Mar 25 01:15:30.220589 initrd-setup-root[836]: cut: /sysroot/etc/shadow: No such file or directory Mar 25 01:15:30.223326 initrd-setup-root[843]: cut: /sysroot/etc/gshadow: No such file or directory Mar 25 01:15:30.287534 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 25 01:15:30.289874 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 25 01:15:30.291403 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Mar 25 01:15:30.305896 kernel: BTRFS info (device vda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:30.317085 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 25 01:15:30.326378 ignition[914]: INFO : Ignition 2.20.0 Mar 25 01:15:30.326378 ignition[914]: INFO : Stage: mount Mar 25 01:15:30.327916 ignition[914]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:30.327916 ignition[914]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:15:30.327916 ignition[914]: INFO : mount: mount passed Mar 25 01:15:30.327916 ignition[914]: INFO : Ignition finished successfully Mar 25 01:15:30.328967 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 25 01:15:30.332061 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 25 01:15:30.904583 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 25 01:15:30.906033 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:15:30.921486 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/vda6 scanned by mount (927) Mar 25 01:15:30.921520 kernel: BTRFS info (device vda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:15:30.921531 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:15:30.923034 kernel: BTRFS info (device vda6): using free space tree Mar 25 01:15:30.925886 kernel: BTRFS info (device vda6): auto enabling async discard Mar 25 01:15:30.926270 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:15:30.956602 ignition[944]: INFO : Ignition 2.20.0 Mar 25 01:15:30.956602 ignition[944]: INFO : Stage: files Mar 25 01:15:30.958220 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:30.958220 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:15:30.958220 ignition[944]: DEBUG : files: compiled without relabeling support, skipping Mar 25 01:15:30.961406 ignition[944]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 25 01:15:30.961406 ignition[944]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 25 01:15:30.964917 ignition[944]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 25 01:15:30.966205 ignition[944]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 25 01:15:30.966205 ignition[944]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 25 01:15:30.965418 unknown[944]: wrote ssh authorized keys file for user: core Mar 25 01:15:30.969706 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 25 01:15:30.969706 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Mar 25 01:15:31.019433 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 25 01:15:31.280975 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 25 01:15:31.280975 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 25 
01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:15:31.284797 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-arm64.raw: attempt #1 Mar 25 01:15:31.436982 systemd-networkd[757]: eth0: Gained IPv6LL Mar 25 01:15:31.628599 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 25 01:15:32.349063 ignition[944]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:15:32.349063 ignition[944]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 25 01:15:32.352819 ignition[944]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:15:32.352819 ignition[944]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:15:32.352819 ignition[944]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 25 01:15:32.352819 ignition[944]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Mar 25 01:15:32.352819 ignition[944]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 25 01:15:32.352819 ignition[944]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 25 01:15:32.352819 ignition[944]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Mar 25 
01:15:32.352819 ignition[944]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" Mar 25 01:15:32.367251 ignition[944]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" Mar 25 01:15:32.370569 ignition[944]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" Mar 25 01:15:32.373564 ignition[944]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" Mar 25 01:15:32.373564 ignition[944]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" Mar 25 01:15:32.373564 ignition[944]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" Mar 25 01:15:32.373564 ignition[944]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:15:32.373564 ignition[944]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:15:32.373564 ignition[944]: INFO : files: files passed Mar 25 01:15:32.373564 ignition[944]: INFO : Ignition finished successfully Mar 25 01:15:32.373503 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 25 01:15:32.375369 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 25 01:15:32.377491 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 25 01:15:32.387679 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 25 01:15:32.387757 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 25 01:15:32.392263 initrd-setup-root-after-ignition[974]: grep: /sysroot/oem/oem-release: No such file or directory Mar 25 01:15:32.393632 initrd-setup-root-after-ignition[976]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:15:32.393632 initrd-setup-root-after-ignition[976]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:15:32.398324 initrd-setup-root-after-ignition[980]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:15:32.394444 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:15:32.396876 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 25 01:15:32.400127 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 25 01:15:32.445809 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 25 01:15:32.445941 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 25 01:15:32.448175 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Mar 25 01:15:32.449828 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 01:15:32.451630 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 01:15:32.452431 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 01:15:32.474174 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:15:32.476594 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:15:32.500039 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. 
Mar 25 01:15:32.501371 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:15:32.503385 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:15:32.505015 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:15:32.505143 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:15:32.507536 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:15:32.509520 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:15:32.511130 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:15:32.512741 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:15:32.514665 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:15:32.516580 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:15:32.518391 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:15:32.520292 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:15:32.522176 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:15:32.523870 systemd[1]: Stopped target swap.target - Swaps. Mar 25 01:15:32.525419 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:15:32.525544 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:15:32.527754 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:15:32.529753 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:15:32.531674 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:15:32.535938 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:15:32.537220 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:15:32.537362 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:15:32.540125 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:15:32.540252 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:15:32.542185 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:15:32.543749 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:15:32.547921 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:15:32.549196 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:15:32.551262 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:15:32.552780 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:15:32.552882 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:15:32.554487 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:15:32.554567 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:15:32.556079 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:15:32.556193 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:15:32.557982 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:15:32.558085 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:15:32.560407 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... 
Mar 25 01:15:32.562776 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:15:32.563947 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:15:32.564072 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:15:32.565850 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:15:32.565963 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:15:32.576128 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:15:32.576234 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:15:32.586999 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:15:32.588321 ignition[1000]: INFO : Ignition 2.20.0 Mar 25 01:15:32.588321 ignition[1000]: INFO : Stage: umount Mar 25 01:15:32.590022 ignition[1000]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:15:32.590022 ignition[1000]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 25 01:15:32.590022 ignition[1000]: INFO : umount: umount passed Mar 25 01:15:32.590022 ignition[1000]: INFO : Ignition finished successfully Mar 25 01:15:32.591232 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:15:32.591354 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:15:32.593095 systemd[1]: Stopped target network.target - Network. Mar 25 01:15:32.594291 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:15:32.594351 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:15:32.595842 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:15:32.595906 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:15:32.599025 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:15:32.599075 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:15:32.600752 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:15:32.600793 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:15:32.602517 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:15:32.604193 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:15:32.608175 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:15:32.608297 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:15:32.611399 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:15:32.611627 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:15:32.611662 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:15:32.614972 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:15:32.615190 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:15:32.615289 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:15:32.617813 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:15:32.618319 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:15:32.618371 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:15:32.621186 systemd[1]: Stopping network-cleanup.service - Network Cleanup... 
Mar 25 01:15:32.622026 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:15:32.622081 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:15:32.624186 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:15:32.624231 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:15:32.626970 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:15:32.627013 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:15:32.629002 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:15:32.632331 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:15:32.644273 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:15:32.644405 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:15:32.646372 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:15:32.646496 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:15:32.648325 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:15:32.649883 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:15:32.651580 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:15:32.651627 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:15:32.653510 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:15:32.653540 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:15:32.655214 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:15:32.655259 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:15:32.657643 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:15:32.657686 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:15:32.659576 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:15:32.659623 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:15:32.662308 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:15:32.662358 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:15:32.664723 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:15:32.665773 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 01:15:32.665830 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:15:32.668843 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 25 01:15:32.668900 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:15:32.670788 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:15:32.670832 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:15:32.672697 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:15:32.672740 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:32.679781 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. 
Mar 25 01:15:32.679904 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:15:32.681398 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:15:32.683821 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:15:32.701358 systemd[1]: Switching root. Mar 25 01:15:32.724946 systemd-journald[235]: Journal stopped Mar 25 01:15:33.440956 systemd-journald[235]: Received SIGTERM from PID 1 (systemd). Mar 25 01:15:33.441011 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:15:33.441026 kernel: SELinux: policy capability open_perms=1 Mar 25 01:15:33.441035 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:15:33.441044 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:15:33.441053 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:15:33.441063 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:15:33.441072 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:15:33.441081 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:15:33.441094 kernel: audit: type=1403 audit(1742865332.851:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:15:33.441104 systemd[1]: Successfully loaded SELinux policy in 30.590ms. Mar 25 01:15:33.441122 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.245ms. Mar 25 01:15:33.441133 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:15:33.441144 systemd[1]: Detected virtualization kvm. Mar 25 01:15:33.441154 systemd[1]: Detected architecture arm64. Mar 25 01:15:33.441164 systemd[1]: Detected first boot. Mar 25 01:15:33.441174 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:15:33.441186 zram_generator::config[1048]: No configuration found. Mar 25 01:15:33.441201 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:15:33.441212 systemd[1]: Populated /etc with preset unit settings. Mar 25 01:15:33.441222 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:15:33.441233 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:15:33.441243 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:15:33.441253 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:15:33.441263 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:15:33.441273 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:15:33.441291 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:15:33.441303 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:15:33.441316 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:15:33.441326 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 01:15:33.441336 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:15:33.441347 systemd[1]: Created slice user.slice - User and Session Slice. 
Mar 25 01:15:33.441361 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:15:33.441372 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:15:33.441382 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:15:33.441392 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:15:33.441402 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:15:33.441414 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:15:33.441425 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 25 01:15:33.441436 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:15:33.441446 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:15:33.441456 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:15:33.441467 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:15:33.441476 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:15:33.441488 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:15:33.441498 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:15:33.441508 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:15:33.441518 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:15:33.441528 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:15:33.441539 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:15:33.441549 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:15:33.441558 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:15:33.441569 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:15:33.441579 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:15:33.441590 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:15:33.441600 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:15:33.441611 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:15:33.441620 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:15:33.441630 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:15:33.441641 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:15:33.441651 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:15:33.441661 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:15:33.441673 systemd[1]: Reached target machines.target - Containers. Mar 25 01:15:33.441683 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 01:15:33.441694 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Mar 25 01:15:33.441705 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:15:33.441715 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:15:33.441725 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:15:33.441735 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:15:33.441745 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:15:33.441755 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 01:15:33.441767 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:15:33.441777 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:15:33.441788 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:15:33.441799 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:15:33.441808 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Mar 25 01:15:33.441818 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:15:33.441829 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:15:33.441839 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:15:33.441851 kernel: loop: module loaded Mar 25 01:15:33.441870 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:15:33.441880 kernel: fuse: init (API version 7.39) Mar 25 01:15:33.441890 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:15:33.441900 kernel: ACPI: bus type drm_connector registered Mar 25 01:15:33.441910 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:15:33.441920 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:15:33.441930 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:15:33.441957 systemd-journald[1124]: Collecting audit messages is disabled. Mar 25 01:15:33.441981 systemd[1]: verity-setup.service: Deactivated successfully. Mar 25 01:15:33.441991 systemd[1]: Stopped verity-setup.service. Mar 25 01:15:33.442003 systemd-journald[1124]: Journal started Mar 25 01:15:33.442024 systemd-journald[1124]: Runtime Journal (/run/log/journal/25cffc67fd9740b7a234a321fcf09510) is 5.9M, max 47.3M, 41.4M free. Mar 25 01:15:33.237630 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:15:33.249740 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. Mar 25 01:15:33.250115 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:15:33.444467 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:15:33.445081 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:15:33.446194 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:15:33.447389 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:15:33.448499 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Mar 25 01:15:33.449705 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:15:33.450980 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:15:33.452213 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:15:33.454878 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:15:33.456345 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 25 01:15:33.456515 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:15:33.459171 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:15:33.459359 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:15:33.460728 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:15:33.460934 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:15:33.462264 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:15:33.462450 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:15:33.464032 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:15:33.464231 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:15:33.465518 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:15:33.465669 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:15:33.467212 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:15:33.468603 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:15:33.470143 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:15:33.471609 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:15:33.483983 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:15:33.486358 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 01:15:33.488355 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Mar 25 01:15:33.489569 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:15:33.489596 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:15:33.491549 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 01:15:33.496715 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:15:33.498846 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 25 01:15:33.500061 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:15:33.501172 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:15:33.503212 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:15:33.504528 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:15:33.505549 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... 
Mar 25 01:15:33.506666 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:15:33.507553 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:15:33.512979 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:15:33.518004 systemd-journald[1124]: Time spent on flushing to /var/log/journal/25cffc67fd9740b7a234a321fcf09510 is 14.056ms for 869 entries. Mar 25 01:15:33.518004 systemd-journald[1124]: System Journal (/var/log/journal/25cffc67fd9740b7a234a321fcf09510) is 8M, max 195.6M, 187.6M free. Mar 25 01:15:33.537576 systemd-journald[1124]: Received client request to flush runtime journal. Mar 25 01:15:33.537613 kernel: loop0: detected capacity change from 0 to 126448 Mar 25 01:15:33.515113 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:15:33.517982 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:15:33.521709 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:15:33.523238 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:15:33.524644 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:15:33.526270 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 01:15:33.530699 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:15:33.533480 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:15:33.537091 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:15:33.541141 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:15:33.543212 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:15:33.553805 systemd-tmpfiles[1167]: ACLs are not supported, ignoring. Mar 25 01:15:33.553818 systemd-tmpfiles[1167]: ACLs are not supported, ignoring. Mar 25 01:15:33.554887 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:15:33.558415 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:15:33.562098 systemd[1]: Starting systemd-sysusers.service - Create System Users... Mar 25 01:15:33.568064 udevadm[1177]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 25 01:15:33.578127 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:15:33.583879 kernel: loop1: detected capacity change from 0 to 189592 Mar 25 01:15:33.600075 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:15:33.602520 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:15:33.609894 kernel: loop2: detected capacity change from 0 to 103832 Mar 25 01:15:33.631790 systemd-tmpfiles[1190]: ACLs are not supported, ignoring. Mar 25 01:15:33.631809 systemd-tmpfiles[1190]: ACLs are not supported, ignoring. Mar 25 01:15:33.636051 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Mar 25 01:15:33.637022 kernel: loop3: detected capacity change from 0 to 126448 Mar 25 01:15:33.642906 kernel: loop4: detected capacity change from 0 to 189592 Mar 25 01:15:33.649429 kernel: loop5: detected capacity change from 0 to 103832 Mar 25 01:15:33.652265 (sd-merge)[1193]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'. Mar 25 01:15:33.652638 (sd-merge)[1193]: Merged extensions into '/usr'. Mar 25 01:15:33.655800 systemd[1]: Reload requested from client PID 1166 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:15:33.655820 systemd[1]: Reloading... Mar 25 01:15:33.709764 zram_generator::config[1221]: No configuration found. Mar 25 01:15:33.764848 ldconfig[1161]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 01:15:33.807207 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:15:33.855900 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:15:33.855987 systemd[1]: Reloading finished in 199 ms. Mar 25 01:15:33.876465 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 01:15:33.877972 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:15:33.891197 systemd[1]: Starting ensure-sysext.service... Mar 25 01:15:33.892983 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:15:33.907772 systemd[1]: Reload requested from client PID 1257 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:15:33.907787 systemd[1]: Reloading... Mar 25 01:15:33.910478 systemd-tmpfiles[1258]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:15:33.911006 systemd-tmpfiles[1258]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:15:33.911829 systemd-tmpfiles[1258]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:15:33.912203 systemd-tmpfiles[1258]: ACLs are not supported, ignoring. Mar 25 01:15:33.912351 systemd-tmpfiles[1258]: ACLs are not supported, ignoring. Mar 25 01:15:33.915065 systemd-tmpfiles[1258]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:15:33.915161 systemd-tmpfiles[1258]: Skipping /boot Mar 25 01:15:33.924012 systemd-tmpfiles[1258]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:15:33.924115 systemd-tmpfiles[1258]: Skipping /boot Mar 25 01:15:33.955884 zram_generator::config[1290]: No configuration found. Mar 25 01:15:34.034297 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:15:34.084081 systemd[1]: Reloading finished in 176 ms. Mar 25 01:15:34.094436 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:15:34.096913 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:15:34.114550 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:15:34.118393 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... 
Mar 25 01:15:34.122512 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 25 01:15:34.135071 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:15:34.140948 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:15:34.146990 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 25 01:15:34.149358 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 25 01:15:34.155562 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 25 01:15:34.161139 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 25 01:15:34.171101 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:15:34.172385 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:15:34.174753 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:15:34.179385 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:15:34.180554 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:15:34.180783 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:15:34.182439 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 25 01:15:34.184629 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:15:34.184785 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:15:34.186453 systemd-udevd[1333]: Using default interface naming scheme 'v255'. Mar 25 01:15:34.188161 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 25 01:15:34.190345 augenrules[1356]: No rules Mar 25 01:15:34.191196 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:15:34.191388 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:15:34.194936 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 25 01:15:34.204038 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:15:34.204232 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:15:34.205820 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:15:34.206013 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:15:34.213215 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:15:34.217114 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:15:34.219831 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:15:34.232470 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:15:34.234631 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:15:34.236934 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Mar 25 01:15:34.237969 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:15:34.238091 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:15:34.238202 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:15:34.239174 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 25 01:15:34.244328 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:15:34.246052 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:15:34.246206 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:15:34.251140 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:15:34.251299 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:15:34.254774 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:15:34.254977 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:15:34.260872 systemd[1]: Finished ensure-sysext.service. Mar 25 01:15:34.272850 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:15:34.273306 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:15:34.276779 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 25 01:15:34.277426 augenrules[1372]: /sbin/augenrules: No change Mar 25 01:15:34.282151 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:15:34.283136 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:15:34.283204 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:15:34.284782 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 25 01:15:34.291189 augenrules[1424]: No rules Mar 25 01:15:34.295340 systemd-resolved[1332]: Positive Trust Anchors: Mar 25 01:15:34.295351 systemd-resolved[1332]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:15:34.295382 systemd-resolved[1332]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:15:34.296421 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:15:34.296632 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:15:34.307140 systemd-resolved[1332]: Defaulting to hostname 'linux'. Mar 25 01:15:34.310804 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
Mar 25 01:15:34.312145 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:15:34.331930 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1387) Mar 25 01:15:34.357656 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Mar 25 01:15:34.363293 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 25 01:15:34.375256 systemd-networkd[1422]: lo: Link UP Mar 25 01:15:34.375267 systemd-networkd[1422]: lo: Gained carrier Mar 25 01:15:34.376278 systemd-networkd[1422]: Enumeration completed Mar 25 01:15:34.377457 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:15:34.378640 systemd[1]: Reached target network.target - Network. Mar 25 01:15:34.380675 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 25 01:15:34.382420 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:15:34.382423 systemd-networkd[1422]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:15:34.382917 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 25 01:15:34.383122 systemd-networkd[1422]: eth0: Link UP Mar 25 01:15:34.383129 systemd-networkd[1422]: eth0: Gained carrier Mar 25 01:15:34.383143 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:15:34.384473 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 25 01:15:34.386415 systemd[1]: Reached target time-set.target - System Time Set. Mar 25 01:15:34.393160 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 25 01:15:34.394945 systemd-networkd[1422]: eth0: DHCPv4 address 10.0.0.53/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 25 01:15:34.395451 systemd-timesyncd[1423]: Network configuration changed, trying to establish connection. Mar 25 01:15:34.398668 systemd-timesyncd[1423]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 25 01:15:34.398781 systemd-timesyncd[1423]: Initial clock synchronization to Tue 2025-03-25 01:15:34.516834 UTC. Mar 25 01:15:34.407033 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Mar 25 01:15:34.418385 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:15:34.442115 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 25 01:15:34.445098 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 25 01:15:34.473190 lvm[1445]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:15:34.472036 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:15:34.500241 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 25 01:15:34.501700 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:15:34.504014 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:15:34.505103 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. 
Mar 25 01:15:34.506316 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 25 01:15:34.507672 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 25 01:15:34.508836 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 25 01:15:34.510046 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 25 01:15:34.511228 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 25 01:15:34.511266 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:15:34.512137 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:15:34.513551 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 25 01:15:34.515812 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 25 01:15:34.518807 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 25 01:15:34.520209 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 25 01:15:34.521421 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 25 01:15:34.526682 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 25 01:15:34.528277 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Mar 25 01:15:34.530402 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 25 01:15:34.532056 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 25 01:15:34.533170 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:15:34.534093 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:15:34.535041 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:15:34.535074 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:15:34.535917 systemd[1]: Starting containerd.service - containerd container runtime... Mar 25 01:15:34.539894 lvm[1453]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:15:34.537759 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 25 01:15:34.540242 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 25 01:15:34.543134 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 25 01:15:34.544296 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 25 01:15:34.546161 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 25 01:15:34.550101 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 25 01:15:34.552260 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 25 01:15:34.553766 jq[1456]: false Mar 25 01:15:34.556191 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Mar 25 01:15:34.560324 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 25 01:15:34.562190 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Mar 25 01:15:34.562639 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 25 01:15:34.563843 systemd[1]: Starting update-engine.service - Update Engine... Mar 25 01:15:34.567221 dbus-daemon[1455]: [system] SELinux support is enabled Mar 25 01:15:34.567613 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 25 01:15:34.569575 systemd[1]: Started dbus.service - D-Bus System Message Bus. Mar 25 01:15:34.574758 extend-filesystems[1457]: Found loop3 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found loop4 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found loop5 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda1 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda2 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda3 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found usr Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda4 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda6 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda7 Mar 25 01:15:34.579707 extend-filesystems[1457]: Found vda9 Mar 25 01:15:34.579707 extend-filesystems[1457]: Checking size of /dev/vda9 Mar 25 01:15:34.577293 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Mar 25 01:15:34.582292 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 25 01:15:34.583015 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Mar 25 01:15:34.593610 jq[1472]: true Mar 25 01:15:34.583373 systemd[1]: motdgen.service: Deactivated successfully. Mar 25 01:15:34.583542 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Mar 25 01:15:34.589273 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 25 01:15:34.589452 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Mar 25 01:15:34.603324 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1384) Mar 25 01:15:34.603383 extend-filesystems[1457]: Resized partition /dev/vda9 Mar 25 01:15:34.604498 extend-filesystems[1487]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:15:34.620933 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 25 01:15:34.613997 (ntainerd)[1481]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:15:34.625271 update_engine[1470]: I20250325 01:15:34.625030 1470 main.cc:92] Flatcar Update Engine starting Mar 25 01:15:34.628506 jq[1480]: true Mar 25 01:15:34.629911 tar[1477]: linux-arm64/helm Mar 25 01:15:34.634895 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 25 01:15:34.647630 update_engine[1470]: I20250325 01:15:34.639493 1470 update_check_scheduler.cc:74] Next update check in 10m15s Mar 25 01:15:34.647686 extend-filesystems[1487]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 25 01:15:34.647686 extend-filesystems[1487]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 25 01:15:34.647686 extend-filesystems[1487]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 25 01:15:34.645483 systemd[1]: Started update-engine.service - Update Engine. 
Mar 25 01:15:34.659250 extend-filesystems[1457]: Resized filesystem in /dev/vda9 Mar 25 01:15:34.647079 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 25 01:15:34.647105 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Mar 25 01:15:34.648405 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:15:34.648421 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 25 01:15:34.652194 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:15:34.658437 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:15:34.658627 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Mar 25 01:15:34.687442 bash[1512]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:15:34.689603 systemd-logind[1468]: Watching system buttons on /dev/input/event0 (Power Button) Mar 25 01:15:34.689912 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:15:34.691404 systemd-logind[1468]: New seat seat0. Mar 25 01:15:34.692739 systemd[1]: Started systemd-logind.service - User Login Management. Mar 25 01:15:34.695577 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Mar 25 01:15:34.752559 locksmithd[1497]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:15:34.851429 containerd[1481]: time="2025-03-25T01:15:34Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:15:34.854243 containerd[1481]: time="2025-03-25T01:15:34.852475840Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:15:34.863585 containerd[1481]: time="2025-03-25T01:15:34.863550280Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6µs" Mar 25 01:15:34.863585 containerd[1481]: time="2025-03-25T01:15:34.863581880Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:15:34.863672 containerd[1481]: time="2025-03-25T01:15:34.863602000Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:15:34.863755 containerd[1481]: time="2025-03-25T01:15:34.863735120Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:15:34.863783 containerd[1481]: time="2025-03-25T01:15:34.863758280Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:15:34.863801 containerd[1481]: time="2025-03-25T01:15:34.863782640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:15:34.863851 containerd[1481]: time="2025-03-25T01:15:34.863834800Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:15:34.863897 
containerd[1481]: time="2025-03-25T01:15:34.863850040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864274 containerd[1481]: time="2025-03-25T01:15:34.864132840Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864274 containerd[1481]: time="2025-03-25T01:15:34.864156120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864274 containerd[1481]: time="2025-03-25T01:15:34.864167520Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864274 containerd[1481]: time="2025-03-25T01:15:34.864176280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864274 containerd[1481]: time="2025-03-25T01:15:34.864247520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864475 containerd[1481]: time="2025-03-25T01:15:34.864451760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864503 containerd[1481]: time="2025-03-25T01:15:34.864488000Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:15:34.864503 containerd[1481]: time="2025-03-25T01:15:34.864498240Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:15:34.864617 containerd[1481]: time="2025-03-25T01:15:34.864522360Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:15:34.864758 containerd[1481]: time="2025-03-25T01:15:34.864735720Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:15:34.864849 containerd[1481]: time="2025-03-25T01:15:34.864795560Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:15:34.868444 containerd[1481]: time="2025-03-25T01:15:34.868407080Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:15:34.868501 containerd[1481]: time="2025-03-25T01:15:34.868454040Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:15:34.868501 containerd[1481]: time="2025-03-25T01:15:34.868468320Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:15:34.868501 containerd[1481]: time="2025-03-25T01:15:34.868480200Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:15:34.868501 containerd[1481]: time="2025-03-25T01:15:34.868492480Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:15:34.868564 containerd[1481]: time="2025-03-25T01:15:34.868504560Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:15:34.868564 
containerd[1481]: time="2025-03-25T01:15:34.868516920Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:15:34.868564 containerd[1481]: time="2025-03-25T01:15:34.868529600Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:15:34.868564 containerd[1481]: time="2025-03-25T01:15:34.868547520Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:15:34.868564 containerd[1481]: time="2025-03-25T01:15:34.868559480Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868568520Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868581080Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868685400Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868705960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868718600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868728840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868739200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:15:34.868756 containerd[1481]: time="2025-03-25T01:15:34.868750400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:15:34.868920 containerd[1481]: time="2025-03-25T01:15:34.868761960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:15:34.868920 containerd[1481]: time="2025-03-25T01:15:34.868773520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:15:34.868920 containerd[1481]: time="2025-03-25T01:15:34.868785400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 25 01:15:34.868920 containerd[1481]: time="2025-03-25T01:15:34.868795920Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:15:34.868920 containerd[1481]: time="2025-03-25T01:15:34.868806760Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:15:34.869099 containerd[1481]: time="2025-03-25T01:15:34.869076840Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:15:34.869180 containerd[1481]: time="2025-03-25T01:15:34.869109360Z" level=info msg="Start snapshots syncer" Mar 25 01:15:34.869180 containerd[1481]: time="2025-03-25T01:15:34.869136520Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:15:34.869390 containerd[1481]: 
time="2025-03-25T01:15:34.869352480Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:15:34.869482 containerd[1481]: time="2025-03-25T01:15:34.869405720Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:15:34.869510 containerd[1481]: time="2025-03-25T01:15:34.869485320Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:15:34.869604 containerd[1481]: time="2025-03-25T01:15:34.869577280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:15:34.869639 containerd[1481]: time="2025-03-25T01:15:34.869606040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:15:34.869639 containerd[1481]: time="2025-03-25T01:15:34.869619080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:15:34.869639 containerd[1481]: time="2025-03-25T01:15:34.869634600Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:15:34.869688 containerd[1481]: time="2025-03-25T01:15:34.869647160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:15:34.869688 containerd[1481]: time="2025-03-25T01:15:34.869658240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:15:34.869688 containerd[1481]: time="2025-03-25T01:15:34.869671640Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:15:34.869748 containerd[1481]: time="2025-03-25T01:15:34.869693440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer 
type=io.containerd.grpc.v1 Mar 25 01:15:34.869748 containerd[1481]: time="2025-03-25T01:15:34.869705960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:15:34.869748 containerd[1481]: time="2025-03-25T01:15:34.869716560Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:15:34.869797 containerd[1481]: time="2025-03-25T01:15:34.869748840Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:15:34.869797 containerd[1481]: time="2025-03-25T01:15:34.869762880Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:15:34.869797 containerd[1481]: time="2025-03-25T01:15:34.869772000Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:15:34.869797 containerd[1481]: time="2025-03-25T01:15:34.869781280Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:15:34.869797 containerd[1481]: time="2025-03-25T01:15:34.869789720Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:15:34.869894 containerd[1481]: time="2025-03-25T01:15:34.869799320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:15:34.869894 containerd[1481]: time="2025-03-25T01:15:34.869810040Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:15:34.869927 containerd[1481]: time="2025-03-25T01:15:34.869900880Z" level=info msg="runtime interface created" Mar 25 01:15:34.869927 containerd[1481]: time="2025-03-25T01:15:34.869908600Z" level=info msg="created NRI interface" Mar 25 01:15:34.869927 containerd[1481]: time="2025-03-25T01:15:34.869921160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:15:34.869982 containerd[1481]: time="2025-03-25T01:15:34.869933440Z" level=info msg="Connect containerd service" Mar 25 01:15:34.869982 containerd[1481]: time="2025-03-25T01:15:34.869960080Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:15:34.872445 containerd[1481]: time="2025-03-25T01:15:34.872204360Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:15:34.976938 containerd[1481]: time="2025-03-25T01:15:34.976877200Z" level=info msg="Start subscribing containerd event" Mar 25 01:15:34.977035 containerd[1481]: time="2025-03-25T01:15:34.976949200Z" level=info msg="Start recovering state" Mar 25 01:15:34.977079 containerd[1481]: time="2025-03-25T01:15:34.977035160Z" level=info msg="Start event monitor" Mar 25 01:15:34.977079 containerd[1481]: time="2025-03-25T01:15:34.977050480Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:15:34.977079 containerd[1481]: time="2025-03-25T01:15:34.977058160Z" level=info msg="Start streaming server" Mar 25 01:15:34.977079 containerd[1481]: time="2025-03-25T01:15:34.977067200Z" level=info msg="Registered namespace \"k8s.io\" with NRI" 
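The "failed to load cni during init" error above is expected on a fresh node: the CRI config dumped earlier sets cni.confDir to /etc/cni/net.d, nothing has written a network config there yet, and the "cni network conf syncer" that just started will pick one up as soon as a CNI plugin or addon installs it. A minimal Go sketch of the same directory check, standard library only; the path comes from the config above, everything else (the extensions treated as configs, the messages) is illustrative:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory the CRI plugin watches, per the "cni":{"confDir":"/etc/cni/net.d"} field above.
	confDir := "/etc/cni/net.d"
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read", confDir, "-", err)
		return
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			found++
			fmt.Println("network config:", filepath.Join(confDir, e.Name()))
		}
	}
	if found == 0 {
		// The empty-directory case is what containerd reports above as
		// "no network config found in /etc/cni/net.d".
		fmt.Println("no network config found; pod networking stays uninitialized")
	}
}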
Mar 25 01:15:34.977079 containerd[1481]: time="2025-03-25T01:15:34.977074440Z" level=info msg="runtime interface starting up..." Mar 25 01:15:34.977079 containerd[1481]: time="2025-03-25T01:15:34.977080000Z" level=info msg="starting plugins..." Mar 25 01:15:34.977215 containerd[1481]: time="2025-03-25T01:15:34.977093560Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:15:34.977811 containerd[1481]: time="2025-03-25T01:15:34.977783560Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:15:34.977900 containerd[1481]: time="2025-03-25T01:15:34.977880520Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:15:34.978107 systemd[1]: Started containerd.service - containerd container runtime. Mar 25 01:15:34.980980 containerd[1481]: time="2025-03-25T01:15:34.979350800Z" level=info msg="containerd successfully booted in 0.128265s" Mar 25 01:15:34.997297 tar[1477]: linux-arm64/LICENSE Mar 25 01:15:34.997385 tar[1477]: linux-arm64/README.md Mar 25 01:15:35.017116 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:15:35.277297 sshd_keygen[1473]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:15:35.295798 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:15:35.298580 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 25 01:15:35.324142 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:15:35.324350 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:15:35.326910 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 25 01:15:35.351657 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:15:35.354369 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 01:15:35.356402 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Mar 25 01:15:35.357812 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 01:15:36.173732 systemd-networkd[1422]: eth0: Gained IPv6LL Mar 25 01:15:36.176164 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:15:36.177926 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:15:36.180361 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... Mar 25 01:15:36.182655 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:15:36.192676 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 25 01:15:36.210179 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Mar 25 01:15:36.211832 systemd[1]: coreos-metadata.service: Deactivated successfully. Mar 25 01:15:36.212025 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. Mar 25 01:15:36.214107 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 25 01:15:36.683066 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:15:36.684699 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 25 01:15:36.685893 systemd[1]: Startup finished in 542ms (kernel) + 5.160s (initrd) + 3.866s (userspace) = 9.569s. 
Mar 25 01:15:36.686408 (kubelet)[1582]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:15:37.104287 kubelet[1582]: E0325 01:15:37.104131 1582 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:15:37.106570 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:15:37.106719 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:15:37.107038 systemd[1]: kubelet.service: Consumed 763ms CPU time, 235M memory peak. Mar 25 01:15:40.642284 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:15:40.643369 systemd[1]: Started sshd@0-10.0.0.53:22-10.0.0.1:57532.service - OpenSSH per-connection server daemon (10.0.0.1:57532). Mar 25 01:15:40.717033 sshd[1595]: Accepted publickey for core from 10.0.0.1 port 57532 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:15:40.718731 sshd-session[1595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:15:40.733728 systemd-logind[1468]: New session 1 of user core. Mar 25 01:15:40.734653 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:15:40.735603 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:15:40.756211 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:15:40.758279 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:15:40.772727 (systemd)[1599]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:15:40.774655 systemd-logind[1468]: New session c1 of user core. Mar 25 01:15:40.887236 systemd[1599]: Queued start job for default target default.target. Mar 25 01:15:40.898730 systemd[1599]: Created slice app.slice - User Application Slice. Mar 25 01:15:40.898758 systemd[1599]: Reached target paths.target - Paths. Mar 25 01:15:40.898792 systemd[1599]: Reached target timers.target - Timers. Mar 25 01:15:40.899963 systemd[1599]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:15:40.908409 systemd[1599]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:15:40.908469 systemd[1599]: Reached target sockets.target - Sockets. Mar 25 01:15:40.908506 systemd[1599]: Reached target basic.target - Basic System. Mar 25 01:15:40.908534 systemd[1599]: Reached target default.target - Main User Target. Mar 25 01:15:40.908557 systemd[1599]: Startup finished in 129ms. Mar 25 01:15:40.908738 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:15:40.922079 systemd[1]: Started session-1.scope - Session 1 of User core. Mar 25 01:15:40.982508 systemd[1]: Started sshd@1-10.0.0.53:22-10.0.0.1:57540.service - OpenSSH per-connection server daemon (10.0.0.1:57540). Mar 25 01:15:41.046186 sshd[1610]: Accepted publickey for core from 10.0.0.1 port 57540 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:15:41.047297 sshd-session[1610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:15:41.050879 systemd-logind[1468]: New session 2 of user core. 
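The kubelet failure above is the usual first-boot state: /var/lib/kubelet/config.yaml does not exist until the node is provisioned (on kubeadm-managed nodes that file is written during kubeadm init or join), so the unit exits with status 1 and systemd restarts it later. A small standard-library sketch that waits for the file the way an external readiness check might; the poll interval and timeout are arbitrary choices for the sketch:

package main

import (
	"fmt"
	"os"
	"time"
)

func main() {
	// The path the kubelet error above complains about.
	const cfg = "/var/lib/kubelet/config.yaml"
	deadline := time.Now().Add(2 * time.Minute) // arbitrary timeout for the sketch
	for time.Now().Before(deadline) {
		if _, err := os.Stat(cfg); err == nil {
			fmt.Println("kubelet config present:", cfg)
			return
		}
		time.Sleep(5 * time.Second) // provisioning (e.g. kubeadm) normally creates the file
	}
	fmt.Println("still missing after timeout:", cfg)
	os.Exit(1)
}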
Mar 25 01:15:41.067091 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 25 01:15:41.118518 sshd[1612]: Connection closed by 10.0.0.1 port 57540 Mar 25 01:15:41.118399 sshd-session[1610]: pam_unix(sshd:session): session closed for user core Mar 25 01:15:41.135430 systemd[1]: sshd@1-10.0.0.53:22-10.0.0.1:57540.service: Deactivated successfully. Mar 25 01:15:41.136903 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:15:41.137518 systemd-logind[1468]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:15:41.139155 systemd[1]: Started sshd@2-10.0.0.53:22-10.0.0.1:57546.service - OpenSSH per-connection server daemon (10.0.0.1:57546). Mar 25 01:15:41.142195 systemd-logind[1468]: Removed session 2. Mar 25 01:15:41.189884 sshd[1617]: Accepted publickey for core from 10.0.0.1 port 57546 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:15:41.191012 sshd-session[1617]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:15:41.194730 systemd-logind[1468]: New session 3 of user core. Mar 25 01:15:41.204025 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:15:41.250986 sshd[1620]: Connection closed by 10.0.0.1 port 57546 Mar 25 01:15:41.251454 sshd-session[1617]: pam_unix(sshd:session): session closed for user core Mar 25 01:15:41.260969 systemd[1]: sshd@2-10.0.0.53:22-10.0.0.1:57546.service: Deactivated successfully. Mar 25 01:15:41.262387 systemd[1]: session-3.scope: Deactivated successfully. Mar 25 01:15:41.265027 systemd-logind[1468]: Session 3 logged out. Waiting for processes to exit. Mar 25 01:15:41.265731 systemd[1]: Started sshd@3-10.0.0.53:22-10.0.0.1:57554.service - OpenSSH per-connection server daemon (10.0.0.1:57554). Mar 25 01:15:41.266488 systemd-logind[1468]: Removed session 3. Mar 25 01:15:41.310521 sshd[1625]: Accepted publickey for core from 10.0.0.1 port 57554 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:15:41.311670 sshd-session[1625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:15:41.315575 systemd-logind[1468]: New session 4 of user core. Mar 25 01:15:41.328004 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:15:41.379062 sshd[1628]: Connection closed by 10.0.0.1 port 57554 Mar 25 01:15:41.379374 sshd-session[1625]: pam_unix(sshd:session): session closed for user core Mar 25 01:15:41.388905 systemd[1]: sshd@3-10.0.0.53:22-10.0.0.1:57554.service: Deactivated successfully. Mar 25 01:15:41.390395 systemd[1]: session-4.scope: Deactivated successfully. Mar 25 01:15:41.391666 systemd-logind[1468]: Session 4 logged out. Waiting for processes to exit. Mar 25 01:15:41.392729 systemd[1]: Started sshd@4-10.0.0.53:22-10.0.0.1:57570.service - OpenSSH per-connection server daemon (10.0.0.1:57570). Mar 25 01:15:41.393395 systemd-logind[1468]: Removed session 4. Mar 25 01:15:41.445017 sshd[1633]: Accepted publickey for core from 10.0.0.1 port 57570 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:15:41.446325 sshd-session[1633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:15:41.450628 systemd-logind[1468]: New session 5 of user core. Mar 25 01:15:41.459095 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 25 01:15:41.523918 sudo[1637]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:15:41.524184 sudo[1637]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:15:41.538679 sudo[1637]: pam_unix(sudo:session): session closed for user root Mar 25 01:15:41.539952 sshd[1636]: Connection closed by 10.0.0.1 port 57570 Mar 25 01:15:41.540447 sshd-session[1633]: pam_unix(sshd:session): session closed for user core Mar 25 01:15:41.559519 systemd[1]: sshd@4-10.0.0.53:22-10.0.0.1:57570.service: Deactivated successfully. Mar 25 01:15:41.560991 systemd[1]: session-5.scope: Deactivated successfully. Mar 25 01:15:41.561656 systemd-logind[1468]: Session 5 logged out. Waiting for processes to exit. Mar 25 01:15:41.563377 systemd[1]: Started sshd@5-10.0.0.53:22-10.0.0.1:57584.service - OpenSSH per-connection server daemon (10.0.0.1:57584). Mar 25 01:15:41.564064 systemd-logind[1468]: Removed session 5. Mar 25 01:15:41.619067 sshd[1642]: Accepted publickey for core from 10.0.0.1 port 57584 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:15:41.620337 sshd-session[1642]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:15:41.624077 systemd-logind[1468]: New session 6 of user core. Mar 25 01:15:41.641015 systemd[1]: Started session-6.scope - Session 6 of User core. Mar 25 01:15:41.690995 sudo[1647]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:15:41.691265 sudo[1647]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:15:41.694175 sudo[1647]: pam_unix(sudo:session): session closed for user root Mar 25 01:15:41.698619 sudo[1646]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:15:41.698902 sudo[1646]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:15:41.706734 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:15:41.747271 augenrules[1669]: No rules Mar 25 01:15:41.748556 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:15:41.749903 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:15:41.750768 sudo[1646]: pam_unix(sudo:session): session closed for user root Mar 25 01:15:41.751996 sshd[1645]: Connection closed by 10.0.0.1 port 57584 Mar 25 01:15:41.752406 sshd-session[1642]: pam_unix(sshd:session): session closed for user core Mar 25 01:15:41.762833 systemd[1]: sshd@5-10.0.0.53:22-10.0.0.1:57584.service: Deactivated successfully. Mar 25 01:15:41.765355 systemd[1]: session-6.scope: Deactivated successfully. Mar 25 01:15:41.766690 systemd-logind[1468]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:15:41.768222 systemd[1]: Started sshd@6-10.0.0.53:22-10.0.0.1:57592.service - OpenSSH per-connection server daemon (10.0.0.1:57592). Mar 25 01:15:41.769004 systemd-logind[1468]: Removed session 6. Mar 25 01:15:41.830952 sshd[1677]: Accepted publickey for core from 10.0.0.1 port 57592 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:15:41.832090 sshd-session[1677]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:15:41.836212 systemd-logind[1468]: New session 7 of user core. Mar 25 01:15:41.842012 systemd[1]: Started session-7.scope - Session 7 of User core. 
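Every login in this stretch is accepted for the same public key, shown as "RSA SHA256:RyyrKoKH...". That fingerprint format is the unpadded base64 of the SHA-256 digest of the raw key blob from authorized_keys, so it can be recomputed offline. A standard-library sketch; the key line below is a placeholder to substitute, not the key from this host:

package main

import (
	"crypto/sha256"
	"encoding/base64"
	"fmt"
	"strings"
)

func main() {
	// Placeholder authorized_keys entry; replace with a real "ssh-rsa AAAA... comment" line.
	line := "ssh-rsa AAAAB3NzaC1yc2E(placeholder) core@host"
	fields := strings.Fields(line)
	if len(fields) < 2 {
		fmt.Println("not an authorized_keys line")
		return
	}
	blob, err := base64.StdEncoding.DecodeString(fields[1])
	if err != nil {
		fmt.Println("key blob is not valid base64:", err)
		return
	}
	sum := sha256.Sum256(blob)
	// sshd prints the digest base64-encoded without padding, prefixed with "SHA256:".
	fmt.Println("SHA256:" + base64.RawStdEncoding.EncodeToString(sum[:]))
}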
Mar 25 01:15:41.891615 sudo[1681]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:15:41.892201 sudo[1681]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:15:42.221970 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:15:42.235223 (dockerd)[1701]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:15:42.487086 dockerd[1701]: time="2025-03-25T01:15:42.486959997Z" level=info msg="Starting up" Mar 25 01:15:42.489699 dockerd[1701]: time="2025-03-25T01:15:42.489660097Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:15:42.586218 dockerd[1701]: time="2025-03-25T01:15:42.586173880Z" level=info msg="Loading containers: start." Mar 25 01:15:42.713904 kernel: Initializing XFRM netlink socket Mar 25 01:15:42.776271 systemd-networkd[1422]: docker0: Link UP Mar 25 01:15:42.838316 dockerd[1701]: time="2025-03-25T01:15:42.838224379Z" level=info msg="Loading containers: done." Mar 25 01:15:42.853943 dockerd[1701]: time="2025-03-25T01:15:42.853837992Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:15:42.854092 dockerd[1701]: time="2025-03-25T01:15:42.853979811Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:15:42.854178 dockerd[1701]: time="2025-03-25T01:15:42.854149027Z" level=info msg="Daemon has completed initialization" Mar 25 01:15:42.882504 dockerd[1701]: time="2025-03-25T01:15:42.882415812Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:15:42.882596 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:15:43.830846 containerd[1481]: time="2025-03-25T01:15:43.830807652Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\"" Mar 25 01:15:44.530729 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2938599483.mount: Deactivated successfully. 
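After dockerd logs "API listen on /run/docker.sock" above, the Engine API is reachable over that unix socket without any client library. A short standard-library probe of the documented /_ping endpoint; the socket path is the one from the log, the rest is illustrative:

package main

import (
	"context"
	"fmt"
	"io"
	"net"
	"net/http"
)

func main() {
	// HTTP client that routes every request through the Docker unix socket.
	client := &http.Client{
		Transport: &http.Transport{
			DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
				return (&net.Dialer{}).DialContext(ctx, "unix", "/run/docker.sock")
			},
		},
	}
	// The host part of the URL is ignored when dialing a unix socket; "unix" is just a placeholder.
	resp, err := client.Get("http://unix/_ping")
	if err != nil {
		fmt.Println("docker API not reachable:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Printf("GET /_ping -> %s %q\n", resp.Status, body) // a healthy daemon answers 200 with body "OK"
}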
Mar 25 01:15:46.135592 containerd[1481]: time="2025-03-25T01:15:46.135524808Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:46.136253 containerd[1481]: time="2025-03-25T01:15:46.136142262Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=25552768" Mar 25 01:15:46.137281 containerd[1481]: time="2025-03-25T01:15:46.137218912Z" level=info msg="ImageCreate event name:\"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:46.140273 containerd[1481]: time="2025-03-25T01:15:46.140237386Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:46.141283 containerd[1481]: time="2025-03-25T01:15:46.141257244Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"25549566\" in 2.310408238s" Mar 25 01:15:46.141335 containerd[1481]: time="2025-03-25T01:15:46.141289995Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\"" Mar 25 01:15:46.141914 containerd[1481]: time="2025-03-25T01:15:46.141891354Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\"" Mar 25 01:15:47.357457 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 25 01:15:47.358897 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:15:47.469559 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:15:47.472671 (kubelet)[1972]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:15:47.518490 kubelet[1972]: E0325 01:15:47.518398 1972 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:15:47.521487 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:15:47.521647 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:15:47.522000 systemd[1]: kubelet.service: Consumed 131ms CPU time, 96.6M memory peak. 
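The "in 2.310408238s" figure containerd reports for the kube-apiserver pull is simply the elapsed wall time between the PullImage request and the Pulled event, and it can be recomputed from the RFC 3339 timestamps in the two log entries. A standard-library sketch using the values copied from the log above:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps of the "PullImage" and "Pulled image" entries for kube-apiserver:v1.31.7 above.
	start, err := time.Parse(time.RFC3339Nano, "2025-03-25T01:15:43.830807652Z")
	if err != nil {
		panic(err)
	}
	end, err := time.Parse(time.RFC3339Nano, "2025-03-25T01:15:46.141257244Z")
	if err != nil {
		panic(err)
	}
	// Prints roughly 2.31s; containerd's own figure is measured just before the final log write,
	// so it is a few tens of microseconds shorter than this difference.
	fmt.Println("pull took", end.Sub(start))
}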
Mar 25 01:15:48.065721 containerd[1481]: time="2025-03-25T01:15:48.065674572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:48.066148 containerd[1481]: time="2025-03-25T01:15:48.066077178Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=22458980" Mar 25 01:15:48.067100 containerd[1481]: time="2025-03-25T01:15:48.067073567Z" level=info msg="ImageCreate event name:\"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:48.069541 containerd[1481]: time="2025-03-25T01:15:48.069510778Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:48.071312 containerd[1481]: time="2025-03-25T01:15:48.071276686Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"23899774\" in 1.929351142s" Mar 25 01:15:48.071362 containerd[1481]: time="2025-03-25T01:15:48.071313381Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\"" Mar 25 01:15:48.071823 containerd[1481]: time="2025-03-25T01:15:48.071713300Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\"" Mar 25 01:15:49.496362 containerd[1481]: time="2025-03-25T01:15:49.496311034Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:49.496801 containerd[1481]: time="2025-03-25T01:15:49.496748469Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=17125831" Mar 25 01:15:49.497581 containerd[1481]: time="2025-03-25T01:15:49.497553459Z" level=info msg="ImageCreate event name:\"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:49.500126 containerd[1481]: time="2025-03-25T01:15:49.500074231Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:49.501181 containerd[1481]: time="2025-03-25T01:15:49.501147913Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"18566643\" in 1.42940161s" Mar 25 01:15:49.501181 containerd[1481]: time="2025-03-25T01:15:49.501181709Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\"" Mar 25 01:15:49.501616 
containerd[1481]: time="2025-03-25T01:15:49.501594488Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\"" Mar 25 01:15:50.784681 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4160332870.mount: Deactivated successfully. Mar 25 01:15:51.144852 containerd[1481]: time="2025-03-25T01:15:51.144727633Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:51.145376 containerd[1481]: time="2025-03-25T01:15:51.145328159Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=26871917" Mar 25 01:15:51.146375 containerd[1481]: time="2025-03-25T01:15:51.146344930Z" level=info msg="ImageCreate event name:\"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:51.148079 containerd[1481]: time="2025-03-25T01:15:51.148044571Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:51.148677 containerd[1481]: time="2025-03-25T01:15:51.148526771Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"26870934\" in 1.646899814s" Mar 25 01:15:51.148677 containerd[1481]: time="2025-03-25T01:15:51.148555421Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\"" Mar 25 01:15:51.149051 containerd[1481]: time="2025-03-25T01:15:51.149027243Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 25 01:15:51.756929 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3596482611.mount: Deactivated successfully. 
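Unit names like var-lib-containerd-tmpmounts-containerd\x2dmount4160332870.mount encode the mounted path with systemd's unit-name escaping: "/" becomes "-", and a literal "-" in the path is written as "\x2d". A small decoder sketch, standard library only; it handles just the escapes that appear in these names, not the full systemd-escape rules:

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// unescapeMountUnit turns a systemd mount unit name back into the path it mounts.
func unescapeMountUnit(unit string) string {
	name := strings.TrimSuffix(unit, ".mount")
	var b strings.Builder
	b.WriteByte('/') // the leading slash of the path is implicit in the unit name
	for i := 0; i < len(name); i++ {
		switch {
		case name[i] == '-':
			b.WriteByte('/') // "-" stands for a path separator
		case name[i] == '\\' && i+3 < len(name) && name[i+1] == 'x':
			if v, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
				b.WriteByte(byte(v)) // e.g. "\x2d" is an escaped literal "-"
				i += 3
				continue
			}
			b.WriteByte(name[i])
		default:
			b.WriteByte(name[i])
		}
	}
	return b.String()
}

func main() {
	fmt.Println(unescapeMountUnit(`var-lib-containerd-tmpmounts-containerd\x2dmount4160332870.mount`))
	// -> /var/lib/containerd/tmpmounts/containerd-mount4160332870
}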
Mar 25 01:15:52.763389 containerd[1481]: time="2025-03-25T01:15:52.763344346Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:52.764530 containerd[1481]: time="2025-03-25T01:15:52.763777446Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485383" Mar 25 01:15:52.764901 containerd[1481]: time="2025-03-25T01:15:52.764870993Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:52.767868 containerd[1481]: time="2025-03-25T01:15:52.767817044Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:52.769017 containerd[1481]: time="2025-03-25T01:15:52.768926976Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.619863113s" Mar 25 01:15:52.769017 containerd[1481]: time="2025-03-25T01:15:52.768962591Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" Mar 25 01:15:52.769753 containerd[1481]: time="2025-03-25T01:15:52.769706004Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 25 01:15:53.166980 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2733944805.mount: Deactivated successfully. 
Mar 25 01:15:53.171268 containerd[1481]: time="2025-03-25T01:15:53.171221213Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:15:53.171695 containerd[1481]: time="2025-03-25T01:15:53.171630559Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268705" Mar 25 01:15:53.172459 containerd[1481]: time="2025-03-25T01:15:53.172423537Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:15:53.174706 containerd[1481]: time="2025-03-25T01:15:53.174666810Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:15:53.175520 containerd[1481]: time="2025-03-25T01:15:53.175475049Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 405.721494ms" Mar 25 01:15:53.175559 containerd[1481]: time="2025-03-25T01:15:53.175520950Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Mar 25 01:15:53.178414 containerd[1481]: time="2025-03-25T01:15:53.178369991Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Mar 25 01:15:53.791423 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1342761215.mount: Deactivated successfully. 
Mar 25 01:15:57.332258 containerd[1481]: time="2025-03-25T01:15:57.332042839Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:57.333172 containerd[1481]: time="2025-03-25T01:15:57.332864642Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406427" Mar 25 01:15:57.333916 containerd[1481]: time="2025-03-25T01:15:57.333864505Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:57.336549 containerd[1481]: time="2025-03-25T01:15:57.336510975Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:15:57.337774 containerd[1481]: time="2025-03-25T01:15:57.337723805Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 4.15931729s" Mar 25 01:15:57.337774 containerd[1481]: time="2025-03-25T01:15:57.337757071Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Mar 25 01:15:57.772026 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 25 01:15:57.773553 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:15:57.875756 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:15:57.879150 (kubelet)[2116]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:15:57.915106 kubelet[2116]: E0325 01:15:57.914971 2116 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:15:57.916810 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:15:57.916961 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:15:57.917229 systemd[1]: kubelet.service: Consumed 126ms CPU time, 94.9M memory peak. Mar 25 01:16:03.866569 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:03.866719 systemd[1]: kubelet.service: Consumed 126ms CPU time, 94.9M memory peak. Mar 25 01:16:03.868644 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:03.889615 systemd[1]: Reload requested from client PID 2145 ('systemctl') (unit session-7.scope)... Mar 25 01:16:03.889632 systemd[1]: Reloading... Mar 25 01:16:03.965883 zram_generator::config[2192]: No configuration found. Mar 25 01:16:04.074904 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:16:04.146139 systemd[1]: Reloading finished in 256 ms. 
Mar 25 01:16:04.185205 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:04.188144 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:04.189120 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:16:04.189315 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:04.189356 systemd[1]: kubelet.service: Consumed 84ms CPU time, 82.5M memory peak. Mar 25 01:16:04.190764 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:04.300874 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:04.304344 (kubelet)[2236]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:16:04.339038 kubelet[2236]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:16:04.339038 kubelet[2236]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:16:04.339038 kubelet[2236]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:16:04.339421 kubelet[2236]: I0325 01:16:04.339185 2236 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:16:04.822923 kubelet[2236]: I0325 01:16:04.822883 2236 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Mar 25 01:16:04.822923 kubelet[2236]: I0325 01:16:04.822910 2236 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:16:04.823174 kubelet[2236]: I0325 01:16:04.823148 2236 server.go:929] "Client rotation is on, will bootstrap in background" Mar 25 01:16:04.860774 kubelet[2236]: E0325 01:16:04.860736 2236 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.0.53:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.53:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:16:04.864539 kubelet[2236]: I0325 01:16:04.864437 2236 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:16:04.873789 kubelet[2236]: I0325 01:16:04.873693 2236 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:16:04.877076 kubelet[2236]: I0325 01:16:04.877050 2236 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:16:04.877834 kubelet[2236]: I0325 01:16:04.877808 2236 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 25 01:16:04.878004 kubelet[2236]: I0325 01:16:04.877975 2236 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:16:04.878161 kubelet[2236]: I0325 01:16:04.877998 2236 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:16:04.878297 kubelet[2236]: I0325 01:16:04.878278 2236 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:16:04.878297 kubelet[2236]: I0325 01:16:04.878289 2236 container_manager_linux.go:300] "Creating device plugin manager" Mar 25 01:16:04.878470 kubelet[2236]: I0325 01:16:04.878452 2236 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:16:04.880140 kubelet[2236]: I0325 01:16:04.880085 2236 kubelet.go:408] "Attempting to sync node with API server" Mar 25 01:16:04.880140 kubelet[2236]: I0325 01:16:04.880110 2236 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:16:04.880218 kubelet[2236]: I0325 01:16:04.880197 2236 kubelet.go:314] "Adding apiserver pod source" Mar 25 01:16:04.880218 kubelet[2236]: I0325 01:16:04.880207 2236 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:16:04.883210 kubelet[2236]: I0325 01:16:04.883052 2236 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:16:04.883210 kubelet[2236]: W0325 01:16:04.883188 2236 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.53:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.53:6443: connect: connection refused Mar 25 01:16:04.883317 kubelet[2236]: W0325 01:16:04.883201 2236 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.53:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.53:6443: connect: connection refused Mar 25 01:16:04.883317 kubelet[2236]: E0325 01:16:04.883287 2236 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.53:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.53:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:16:04.883555 kubelet[2236]: E0325 01:16:04.883237 2236 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.0.53:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.53:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:16:04.885587 kubelet[2236]: I0325 01:16:04.885506 2236 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:16:04.888686 kubelet[2236]: W0325 01:16:04.888657 2236 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 01:16:04.891098 kubelet[2236]: I0325 01:16:04.890804 2236 server.go:1269] "Started kubelet" Mar 25 01:16:04.891225 kubelet[2236]: I0325 01:16:04.891172 2236 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:16:04.891810 kubelet[2236]: I0325 01:16:04.891738 2236 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:16:04.892339 kubelet[2236]: I0325 01:16:04.892165 2236 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:16:04.892846 kubelet[2236]: I0325 01:16:04.892820 2236 server.go:460] "Adding debug handlers to kubelet server" Mar 25 01:16:04.893999 kubelet[2236]: I0325 01:16:04.893967 2236 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:16:04.894657 kubelet[2236]: I0325 01:16:04.894484 2236 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:16:04.894657 kubelet[2236]: I0325 01:16:04.894537 2236 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 25 01:16:04.895751 kubelet[2236]: E0325 01:16:04.895687 2236 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:16:04.895751 kubelet[2236]: I0325 01:16:04.895701 2236 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 25 01:16:04.896497 kubelet[2236]: I0325 01:16:04.896044 2236 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:16:04.896497 kubelet[2236]: E0325 01:16:04.894090 2236 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.53:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.53:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182fe6d0b5a58bc8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-25 01:16:04.890782664 +0000 UTC m=+0.583429645,LastTimestamp:2025-03-25 01:16:04.890782664 +0000 UTC m=+0.583429645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 25 01:16:04.896497 kubelet[2236]: E0325 01:16:04.896315 2236 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.53:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.53:6443: connect: connection refused" interval="200ms" Mar 25 01:16:04.897335 kubelet[2236]: W0325 01:16:04.896686 2236 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.53:6443: connect: connection refused Mar 25 01:16:04.897335 kubelet[2236]: E0325 01:16:04.896733 2236 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.53:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:16:04.897335 kubelet[2236]: I0325 01:16:04.896936 2236 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:16:04.897335 kubelet[2236]: I0325 01:16:04.897022 2236 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:16:04.898148 kubelet[2236]: E0325 01:16:04.898117 2236 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:16:04.899102 kubelet[2236]: I0325 01:16:04.899080 2236 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:16:04.908000 kubelet[2236]: I0325 01:16:04.907980 2236 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:16:04.909790 kubelet[2236]: I0325 01:16:04.909760 2236 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:16:04.909790 kubelet[2236]: I0325 01:16:04.909786 2236 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:16:04.911136 kubelet[2236]: I0325 01:16:04.911101 2236 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:16:04.912093 kubelet[2236]: I0325 01:16:04.911701 2236 policy_none.go:49] "None policy: Start" Mar 25 01:16:04.912326 kubelet[2236]: I0325 01:16:04.912289 2236 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 25 01:16:04.912326 kubelet[2236]: I0325 01:16:04.912324 2236 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:16:04.912529 kubelet[2236]: I0325 01:16:04.912339 2236 kubelet.go:2321] "Starting kubelet main sync loop" Mar 25 01:16:04.912529 kubelet[2236]: E0325 01:16:04.912376 2236 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:16:04.913364 kubelet[2236]: I0325 01:16:04.913335 2236 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:16:04.913364 kubelet[2236]: I0325 01:16:04.913361 2236 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:16:04.913433 kubelet[2236]: W0325 01:16:04.913360 2236 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.53:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.53:6443: connect: connection refused Mar 25 01:16:04.913433 kubelet[2236]: E0325 01:16:04.913405 2236 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.0.53:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.53:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:16:04.919640 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 25 01:16:04.931628 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 25 01:16:04.934543 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 01:16:04.943022 kubelet[2236]: I0325 01:16:04.942577 2236 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:16:04.943022 kubelet[2236]: I0325 01:16:04.942786 2236 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:16:04.943022 kubelet[2236]: I0325 01:16:04.942798 2236 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:16:04.943165 kubelet[2236]: I0325 01:16:04.943040 2236 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:16:04.944344 kubelet[2236]: E0325 01:16:04.944320 2236 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 25 01:16:05.020542 systemd[1]: Created slice kubepods-burstable-pod0a954a86ffd87ccda2e357ea22436588.slice - libcontainer container kubepods-burstable-pod0a954a86ffd87ccda2e357ea22436588.slice. Mar 25 01:16:05.042931 systemd[1]: Created slice kubepods-burstable-pod60762308083b5ef6c837b1be48ec53d6.slice - libcontainer container kubepods-burstable-pod60762308083b5ef6c837b1be48ec53d6.slice. 
Mar 25 01:16:05.043692 kubelet[2236]: I0325 01:16:05.043655 2236 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Mar 25 01:16:05.044245 kubelet[2236]: E0325 01:16:05.044052 2236 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.53:6443/api/v1/nodes\": dial tcp 10.0.0.53:6443: connect: connection refused" node="localhost" Mar 25 01:16:05.046613 systemd[1]: Created slice kubepods-burstable-pod6f32907a07e55aea05abdc5cd284a8d5.slice - libcontainer container kubepods-burstable-pod6f32907a07e55aea05abdc5cd284a8d5.slice. Mar 25 01:16:05.097411 kubelet[2236]: I0325 01:16:05.097071 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:05.097411 kubelet[2236]: I0325 01:16:05.097107 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:05.097411 kubelet[2236]: I0325 01:16:05.097122 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:05.097411 kubelet[2236]: I0325 01:16:05.097136 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:05.097411 kubelet[2236]: I0325 01:16:05.097153 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0a954a86ffd87ccda2e357ea22436588-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0a954a86ffd87ccda2e357ea22436588\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:16:05.097601 kubelet[2236]: I0325 01:16:05.097169 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0a954a86ffd87ccda2e357ea22436588-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"0a954a86ffd87ccda2e357ea22436588\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:16:05.097601 kubelet[2236]: I0325 01:16:05.097186 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:05.097601 kubelet[2236]: I0325 01:16:05.097200 2236 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f32907a07e55aea05abdc5cd284a8d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6f32907a07e55aea05abdc5cd284a8d5\") " pod="kube-system/kube-scheduler-localhost" Mar 25 01:16:05.097601 kubelet[2236]: I0325 01:16:05.097213 2236 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0a954a86ffd87ccda2e357ea22436588-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0a954a86ffd87ccda2e357ea22436588\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:16:05.097601 kubelet[2236]: E0325 01:16:05.097288 2236 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.53:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.53:6443: connect: connection refused" interval="400ms" Mar 25 01:16:05.246826 kubelet[2236]: I0325 01:16:05.246344 2236 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Mar 25 01:16:05.246826 kubelet[2236]: E0325 01:16:05.246719 2236 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.53:6443/api/v1/nodes\": dial tcp 10.0.0.53:6443: connect: connection refused" node="localhost" Mar 25 01:16:05.341276 containerd[1481]: time="2025-03-25T01:16:05.341208132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0a954a86ffd87ccda2e357ea22436588,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:05.346404 containerd[1481]: time="2025-03-25T01:16:05.346148101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:60762308083b5ef6c837b1be48ec53d6,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:05.349144 containerd[1481]: time="2025-03-25T01:16:05.348980863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6f32907a07e55aea05abdc5cd284a8d5,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:05.367387 containerd[1481]: time="2025-03-25T01:16:05.367313595Z" level=info msg="connecting to shim d9def6639c35ff2e975cbc0c5212229cc260c7f8ef4315e1183d483fe50a432e" address="unix:///run/containerd/s/1f50d0fa7d758bf812fa16df90b0348a1ff3294c8294059e24318f8f0afd25a5" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:05.376890 containerd[1481]: time="2025-03-25T01:16:05.376705842Z" level=info msg="connecting to shim bbb696f098f3c97098dde667394c93105809185a0c29f075a348e7263ecbb67a" address="unix:///run/containerd/s/bc66b11543681bcd6e6bb364209fd01e0900a53e8bf20452dd7927887b74d158" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:05.400539 containerd[1481]: time="2025-03-25T01:16:05.400493401Z" level=info msg="connecting to shim 5aff4ec7be3a10da16bb0f329d79acae4ca3d29d79f15d8e06b4348f5c9831d2" address="unix:///run/containerd/s/df63da8eb9f687b1d23ff17daf196199713ea1ba0a776ae85b7ca3d75b0b0087" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:05.407234 systemd[1]: Started cri-containerd-d9def6639c35ff2e975cbc0c5212229cc260c7f8ef4315e1183d483fe50a432e.scope - libcontainer container d9def6639c35ff2e975cbc0c5212229cc260c7f8ef4315e1183d483fe50a432e. Mar 25 01:16:05.416486 systemd[1]: Started cri-containerd-bbb696f098f3c97098dde667394c93105809185a0c29f075a348e7263ecbb67a.scope - libcontainer container bbb696f098f3c97098dde667394c93105809185a0c29f075a348e7263ecbb67a. 
Mar 25 01:16:05.425895 systemd[1]: Started cri-containerd-5aff4ec7be3a10da16bb0f329d79acae4ca3d29d79f15d8e06b4348f5c9831d2.scope - libcontainer container 5aff4ec7be3a10da16bb0f329d79acae4ca3d29d79f15d8e06b4348f5c9831d2. Mar 25 01:16:05.444757 containerd[1481]: time="2025-03-25T01:16:05.444704614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:0a954a86ffd87ccda2e357ea22436588,Namespace:kube-system,Attempt:0,} returns sandbox id \"d9def6639c35ff2e975cbc0c5212229cc260c7f8ef4315e1183d483fe50a432e\"" Mar 25 01:16:05.448795 containerd[1481]: time="2025-03-25T01:16:05.448665120Z" level=info msg="CreateContainer within sandbox \"d9def6639c35ff2e975cbc0c5212229cc260c7f8ef4315e1183d483fe50a432e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:16:05.460276 containerd[1481]: time="2025-03-25T01:16:05.460245075Z" level=info msg="Container 0536ebd72587444d20274ffa7b575807e47eb602bd88526df1a9b8422e0dedc4: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:05.462342 containerd[1481]: time="2025-03-25T01:16:05.462215605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:60762308083b5ef6c837b1be48ec53d6,Namespace:kube-system,Attempt:0,} returns sandbox id \"bbb696f098f3c97098dde667394c93105809185a0c29f075a348e7263ecbb67a\"" Mar 25 01:16:05.464582 containerd[1481]: time="2025-03-25T01:16:05.464550673Z" level=info msg="CreateContainer within sandbox \"bbb696f098f3c97098dde667394c93105809185a0c29f075a348e7263ecbb67a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:16:05.468496 containerd[1481]: time="2025-03-25T01:16:05.468373581Z" level=info msg="CreateContainer within sandbox \"d9def6639c35ff2e975cbc0c5212229cc260c7f8ef4315e1183d483fe50a432e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"0536ebd72587444d20274ffa7b575807e47eb602bd88526df1a9b8422e0dedc4\"" Mar 25 01:16:05.468954 containerd[1481]: time="2025-03-25T01:16:05.468919688Z" level=info msg="StartContainer for \"0536ebd72587444d20274ffa7b575807e47eb602bd88526df1a9b8422e0dedc4\"" Mar 25 01:16:05.469771 containerd[1481]: time="2025-03-25T01:16:05.469680133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6f32907a07e55aea05abdc5cd284a8d5,Namespace:kube-system,Attempt:0,} returns sandbox id \"5aff4ec7be3a10da16bb0f329d79acae4ca3d29d79f15d8e06b4348f5c9831d2\"" Mar 25 01:16:05.470123 containerd[1481]: time="2025-03-25T01:16:05.470072478Z" level=info msg="connecting to shim 0536ebd72587444d20274ffa7b575807e47eb602bd88526df1a9b8422e0dedc4" address="unix:///run/containerd/s/1f50d0fa7d758bf812fa16df90b0348a1ff3294c8294059e24318f8f0afd25a5" protocol=ttrpc version=3 Mar 25 01:16:05.472972 containerd[1481]: time="2025-03-25T01:16:05.472623085Z" level=info msg="CreateContainer within sandbox \"5aff4ec7be3a10da16bb0f329d79acae4ca3d29d79f15d8e06b4348f5c9831d2\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:16:05.476838 containerd[1481]: time="2025-03-25T01:16:05.476805450Z" level=info msg="Container 02a6b5c443d5a8ba5178bd6c131175b9cf25d06b300b0da56be433ca063c79d9: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:05.480873 containerd[1481]: time="2025-03-25T01:16:05.480833173Z" level=info msg="Container 70f445f86cabdc5db49062e42cc18debda8677f29ba9736a7b045e6ac73d7d4a: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:05.487728 containerd[1481]: time="2025-03-25T01:16:05.487691058Z" level=info 
msg="CreateContainer within sandbox \"bbb696f098f3c97098dde667394c93105809185a0c29f075a348e7263ecbb67a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"02a6b5c443d5a8ba5178bd6c131175b9cf25d06b300b0da56be433ca063c79d9\"" Mar 25 01:16:05.488485 containerd[1481]: time="2025-03-25T01:16:05.488280977Z" level=info msg="StartContainer for \"02a6b5c443d5a8ba5178bd6c131175b9cf25d06b300b0da56be433ca063c79d9\"" Mar 25 01:16:05.489494 containerd[1481]: time="2025-03-25T01:16:05.489464775Z" level=info msg="connecting to shim 02a6b5c443d5a8ba5178bd6c131175b9cf25d06b300b0da56be433ca063c79d9" address="unix:///run/containerd/s/bc66b11543681bcd6e6bb364209fd01e0900a53e8bf20452dd7927887b74d158" protocol=ttrpc version=3 Mar 25 01:16:05.490890 containerd[1481]: time="2025-03-25T01:16:05.490841226Z" level=info msg="CreateContainer within sandbox \"5aff4ec7be3a10da16bb0f329d79acae4ca3d29d79f15d8e06b4348f5c9831d2\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"70f445f86cabdc5db49062e42cc18debda8677f29ba9736a7b045e6ac73d7d4a\"" Mar 25 01:16:05.491198 containerd[1481]: time="2025-03-25T01:16:05.491174035Z" level=info msg="StartContainer for \"70f445f86cabdc5db49062e42cc18debda8677f29ba9736a7b045e6ac73d7d4a\"" Mar 25 01:16:05.492136 containerd[1481]: time="2025-03-25T01:16:05.492109727Z" level=info msg="connecting to shim 70f445f86cabdc5db49062e42cc18debda8677f29ba9736a7b045e6ac73d7d4a" address="unix:///run/containerd/s/df63da8eb9f687b1d23ff17daf196199713ea1ba0a776ae85b7ca3d75b0b0087" protocol=ttrpc version=3 Mar 25 01:16:05.494046 systemd[1]: Started cri-containerd-0536ebd72587444d20274ffa7b575807e47eb602bd88526df1a9b8422e0dedc4.scope - libcontainer container 0536ebd72587444d20274ffa7b575807e47eb602bd88526df1a9b8422e0dedc4. Mar 25 01:16:05.498479 kubelet[2236]: E0325 01:16:05.498417 2236 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.53:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.53:6443: connect: connection refused" interval="800ms" Mar 25 01:16:05.518018 systemd[1]: Started cri-containerd-02a6b5c443d5a8ba5178bd6c131175b9cf25d06b300b0da56be433ca063c79d9.scope - libcontainer container 02a6b5c443d5a8ba5178bd6c131175b9cf25d06b300b0da56be433ca063c79d9. Mar 25 01:16:05.519448 systemd[1]: Started cri-containerd-70f445f86cabdc5db49062e42cc18debda8677f29ba9736a7b045e6ac73d7d4a.scope - libcontainer container 70f445f86cabdc5db49062e42cc18debda8677f29ba9736a7b045e6ac73d7d4a. 
Mar 25 01:16:05.553235 containerd[1481]: time="2025-03-25T01:16:05.553144306Z" level=info msg="StartContainer for \"0536ebd72587444d20274ffa7b575807e47eb602bd88526df1a9b8422e0dedc4\" returns successfully" Mar 25 01:16:05.582540 containerd[1481]: time="2025-03-25T01:16:05.582498282Z" level=info msg="StartContainer for \"02a6b5c443d5a8ba5178bd6c131175b9cf25d06b300b0da56be433ca063c79d9\" returns successfully" Mar 25 01:16:05.583340 containerd[1481]: time="2025-03-25T01:16:05.583259927Z" level=info msg="StartContainer for \"70f445f86cabdc5db49062e42cc18debda8677f29ba9736a7b045e6ac73d7d4a\" returns successfully" Mar 25 01:16:05.650951 kubelet[2236]: I0325 01:16:05.650451 2236 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Mar 25 01:16:05.651030 kubelet[2236]: E0325 01:16:05.650992 2236 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.0.0.53:6443/api/v1/nodes\": dial tcp 10.0.0.53:6443: connect: connection refused" node="localhost" Mar 25 01:16:05.707277 kubelet[2236]: W0325 01:16:05.707175 2236 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.53:6443: connect: connection refused Mar 25 01:16:05.707277 kubelet[2236]: E0325 01:16:05.707248 2236 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.0.53:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.53:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:16:05.722255 kubelet[2236]: W0325 01:16:05.722153 2236 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.53:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.0.53:6443: connect: connection refused Mar 25 01:16:05.722255 kubelet[2236]: E0325 01:16:05.722216 2236 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.0.0.53:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.53:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:16:06.452277 kubelet[2236]: I0325 01:16:06.452235 2236 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Mar 25 01:16:07.102635 kubelet[2236]: E0325 01:16:07.102522 2236 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 25 01:16:07.196906 kubelet[2236]: I0325 01:16:07.196865 2236 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Mar 25 01:16:07.884870 kubelet[2236]: I0325 01:16:07.884808 2236 apiserver.go:52] "Watching apiserver" Mar 25 01:16:07.896156 kubelet[2236]: I0325 01:16:07.896117 2236 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 25 01:16:09.276610 systemd[1]: Reload requested from client PID 2505 ('systemctl') (unit session-7.scope)... Mar 25 01:16:09.276625 systemd[1]: Reloading... Mar 25 01:16:09.343887 zram_generator::config[2552]: No configuration found. 
Mar 25 01:16:09.475719 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:16:09.557831 systemd[1]: Reloading finished in 280 ms. Mar 25 01:16:09.579208 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:09.588921 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:16:09.589160 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:09.589215 systemd[1]: kubelet.service: Consumed 942ms CPU time, 117.7M memory peak. Mar 25 01:16:09.590899 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:16:09.701747 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:16:09.706551 (kubelet)[2592]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:16:09.743339 kubelet[2592]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:16:09.743339 kubelet[2592]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:16:09.743339 kubelet[2592]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:16:09.743675 kubelet[2592]: I0325 01:16:09.743362 2592 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:16:09.748287 kubelet[2592]: I0325 01:16:09.748256 2592 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Mar 25 01:16:09.749842 kubelet[2592]: I0325 01:16:09.748382 2592 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:16:09.749842 kubelet[2592]: I0325 01:16:09.748597 2592 server.go:929] "Client rotation is on, will bootstrap in background" Mar 25 01:16:09.749944 kubelet[2592]: I0325 01:16:09.749873 2592 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:16:09.751786 kubelet[2592]: I0325 01:16:09.751749 2592 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:16:09.756545 kubelet[2592]: I0325 01:16:09.756528 2592 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:16:09.759031 kubelet[2592]: I0325 01:16:09.759005 2592 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:16:09.759140 kubelet[2592]: I0325 01:16:09.759127 2592 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 25 01:16:09.759250 kubelet[2592]: I0325 01:16:09.759227 2592 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:16:09.759410 kubelet[2592]: I0325 01:16:09.759250 2592 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:16:09.759410 kubelet[2592]: I0325 01:16:09.759406 2592 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:16:09.759502 kubelet[2592]: I0325 01:16:09.759414 2592 container_manager_linux.go:300] "Creating device plugin manager" Mar 25 01:16:09.759502 kubelet[2592]: I0325 01:16:09.759444 2592 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:16:09.759552 kubelet[2592]: I0325 01:16:09.759532 2592 kubelet.go:408] "Attempting to sync node with API server" Mar 25 01:16:09.759552 kubelet[2592]: I0325 01:16:09.759543 2592 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:16:09.759602 kubelet[2592]: I0325 01:16:09.759562 2592 kubelet.go:314] "Adding apiserver pod source" Mar 25 01:16:09.759602 kubelet[2592]: I0325 01:16:09.759571 2592 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:16:09.760380 kubelet[2592]: I0325 01:16:09.760094 2592 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:16:09.760687 kubelet[2592]: I0325 01:16:09.760527 2592 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:16:09.763865 kubelet[2592]: I0325 01:16:09.760908 2592 server.go:1269] "Started kubelet" Mar 25 01:16:09.763865 kubelet[2592]: I0325 01:16:09.762554 2592 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 
01:16:09.763865 kubelet[2592]: I0325 01:16:09.762778 2592 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:16:09.763865 kubelet[2592]: I0325 01:16:09.762822 2592 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:16:09.763865 kubelet[2592]: I0325 01:16:09.763640 2592 server.go:460] "Adding debug handlers to kubelet server" Mar 25 01:16:09.764907 kubelet[2592]: I0325 01:16:09.764423 2592 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:16:09.764907 kubelet[2592]: I0325 01:16:09.764567 2592 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:16:09.765538 kubelet[2592]: I0325 01:16:09.765508 2592 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 25 01:16:09.766642 kubelet[2592]: I0325 01:16:09.765602 2592 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 25 01:16:09.766642 kubelet[2592]: I0325 01:16:09.765718 2592 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:16:09.770747 kubelet[2592]: E0325 01:16:09.769907 2592 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 25 01:16:09.774895 kubelet[2592]: I0325 01:16:09.774383 2592 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:16:09.774895 kubelet[2592]: I0325 01:16:09.774502 2592 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:16:09.788034 kubelet[2592]: I0325 01:16:09.788006 2592 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:16:09.788543 kubelet[2592]: E0325 01:16:09.788521 2592 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:16:09.792644 kubelet[2592]: I0325 01:16:09.792594 2592 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:16:09.793721 kubelet[2592]: I0325 01:16:09.793438 2592 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 25 01:16:09.793721 kubelet[2592]: I0325 01:16:09.793461 2592 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:16:09.793721 kubelet[2592]: I0325 01:16:09.793531 2592 kubelet.go:2321] "Starting kubelet main sync loop" Mar 25 01:16:09.793721 kubelet[2592]: E0325 01:16:09.793572 2592 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:16:09.818071 kubelet[2592]: I0325 01:16:09.817983 2592 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:16:09.818181 kubelet[2592]: I0325 01:16:09.818165 2592 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:16:09.818246 kubelet[2592]: I0325 01:16:09.818238 2592 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:16:09.818434 kubelet[2592]: I0325 01:16:09.818416 2592 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:16:09.818537 kubelet[2592]: I0325 01:16:09.818478 2592 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:16:09.818537 kubelet[2592]: I0325 01:16:09.818502 2592 policy_none.go:49] "None policy: Start" Mar 25 01:16:09.819529 kubelet[2592]: I0325 01:16:09.819484 2592 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:16:09.819529 kubelet[2592]: I0325 01:16:09.819506 2592 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:16:09.819654 kubelet[2592]: I0325 01:16:09.819632 2592 state_mem.go:75] "Updated machine memory state" Mar 25 01:16:09.823658 kubelet[2592]: I0325 01:16:09.823633 2592 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:16:09.824093 kubelet[2592]: I0325 01:16:09.823785 2592 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:16:09.824093 kubelet[2592]: I0325 01:16:09.823811 2592 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:16:09.824093 kubelet[2592]: I0325 01:16:09.824007 2592 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:16:09.899665 kubelet[2592]: E0325 01:16:09.899620 2592 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:09.926054 kubelet[2592]: I0325 01:16:09.926031 2592 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Mar 25 01:16:09.932903 kubelet[2592]: I0325 01:16:09.932601 2592 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Mar 25 01:16:09.932903 kubelet[2592]: I0325 01:16:09.932695 2592 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Mar 25 01:16:09.966735 kubelet[2592]: I0325 01:16:09.966705 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6f32907a07e55aea05abdc5cd284a8d5-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6f32907a07e55aea05abdc5cd284a8d5\") " pod="kube-system/kube-scheduler-localhost" Mar 25 01:16:09.966735 kubelet[2592]: I0325 01:16:09.966741 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0a954a86ffd87ccda2e357ea22436588-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: 
\"0a954a86ffd87ccda2e357ea22436588\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:16:09.966735 kubelet[2592]: I0325 01:16:09.966761 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:09.967021 kubelet[2592]: I0325 01:16:09.966778 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:09.967021 kubelet[2592]: I0325 01:16:09.966794 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:09.967021 kubelet[2592]: I0325 01:16:09.966810 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:09.967021 kubelet[2592]: I0325 01:16:09.966822 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0a954a86ffd87ccda2e357ea22436588-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"0a954a86ffd87ccda2e357ea22436588\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:16:09.967021 kubelet[2592]: I0325 01:16:09.966850 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0a954a86ffd87ccda2e357ea22436588-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"0a954a86ffd87ccda2e357ea22436588\") " pod="kube-system/kube-apiserver-localhost" Mar 25 01:16:09.967131 kubelet[2592]: I0325 01:16:09.966887 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/60762308083b5ef6c837b1be48ec53d6-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"60762308083b5ef6c837b1be48ec53d6\") " pod="kube-system/kube-controller-manager-localhost" Mar 25 01:16:10.760332 kubelet[2592]: I0325 01:16:10.760290 2592 apiserver.go:52] "Watching apiserver" Mar 25 01:16:10.766032 kubelet[2592]: I0325 01:16:10.765996 2592 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 25 01:16:10.812317 kubelet[2592]: E0325 01:16:10.812276 2592 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 25 01:16:10.822753 kubelet[2592]: I0325 01:16:10.822693 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" 
podStartSLOduration=1.822678431 podStartE2EDuration="1.822678431s" podCreationTimestamp="2025-03-25 01:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:10.822565775 +0000 UTC m=+1.112508963" watchObservedRunningTime="2025-03-25 01:16:10.822678431 +0000 UTC m=+1.112621619" Mar 25 01:16:10.835791 kubelet[2592]: I0325 01:16:10.835628 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=2.835613651 podStartE2EDuration="2.835613651s" podCreationTimestamp="2025-03-25 01:16:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:10.829387556 +0000 UTC m=+1.119330744" watchObservedRunningTime="2025-03-25 01:16:10.835613651 +0000 UTC m=+1.125556799" Mar 25 01:16:10.842684 kubelet[2592]: I0325 01:16:10.842535 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.842522964 podStartE2EDuration="1.842522964s" podCreationTimestamp="2025-03-25 01:16:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:10.835807879 +0000 UTC m=+1.125751067" watchObservedRunningTime="2025-03-25 01:16:10.842522964 +0000 UTC m=+1.132466152" Mar 25 01:16:14.557443 sudo[1681]: pam_unix(sudo:session): session closed for user root Mar 25 01:16:14.558480 sshd[1680]: Connection closed by 10.0.0.1 port 57592 Mar 25 01:16:14.558964 sshd-session[1677]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:14.562551 systemd[1]: sshd@6-10.0.0.53:22-10.0.0.1:57592.service: Deactivated successfully. Mar 25 01:16:14.566583 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:16:14.566801 systemd[1]: session-7.scope: Consumed 8.298s CPU time, 226.8M memory peak. Mar 25 01:16:14.567723 systemd-logind[1468]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:16:14.568669 systemd-logind[1468]: Removed session 7. Mar 25 01:16:15.790740 kubelet[2592]: I0325 01:16:15.790704 2592 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:16:15.792872 containerd[1481]: time="2025-03-25T01:16:15.791234293Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 25 01:16:15.793405 kubelet[2592]: I0325 01:16:15.793371 2592 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:16:16.807586 systemd[1]: Created slice kubepods-besteffort-podc89cae75_fb23_403c_b013_7914f882b69a.slice - libcontainer container kubepods-besteffort-podc89cae75_fb23_403c_b013_7914f882b69a.slice. 
Mar 25 01:16:16.827826 kubelet[2592]: I0325 01:16:16.827786 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c89cae75-fb23-403c-b013-7914f882b69a-xtables-lock\") pod \"kube-proxy-t97jn\" (UID: \"c89cae75-fb23-403c-b013-7914f882b69a\") " pod="kube-system/kube-proxy-t97jn" Mar 25 01:16:16.828112 kubelet[2592]: I0325 01:16:16.827829 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q447c\" (UniqueName: \"kubernetes.io/projected/c89cae75-fb23-403c-b013-7914f882b69a-kube-api-access-q447c\") pod \"kube-proxy-t97jn\" (UID: \"c89cae75-fb23-403c-b013-7914f882b69a\") " pod="kube-system/kube-proxy-t97jn" Mar 25 01:16:16.828112 kubelet[2592]: I0325 01:16:16.827862 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c89cae75-fb23-403c-b013-7914f882b69a-kube-proxy\") pod \"kube-proxy-t97jn\" (UID: \"c89cae75-fb23-403c-b013-7914f882b69a\") " pod="kube-system/kube-proxy-t97jn" Mar 25 01:16:16.828112 kubelet[2592]: I0325 01:16:16.827880 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c89cae75-fb23-403c-b013-7914f882b69a-lib-modules\") pod \"kube-proxy-t97jn\" (UID: \"c89cae75-fb23-403c-b013-7914f882b69a\") " pod="kube-system/kube-proxy-t97jn" Mar 25 01:16:16.921564 systemd[1]: Created slice kubepods-besteffort-pod2660e537_2f4e_464f_837c_22620914c0a4.slice - libcontainer container kubepods-besteffort-pod2660e537_2f4e_464f_837c_22620914c0a4.slice. Mar 25 01:16:17.029313 kubelet[2592]: I0325 01:16:17.029273 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2660e537-2f4e-464f-837c-22620914c0a4-var-lib-calico\") pod \"tigera-operator-64ff5465b7-2wfl8\" (UID: \"2660e537-2f4e-464f-837c-22620914c0a4\") " pod="tigera-operator/tigera-operator-64ff5465b7-2wfl8" Mar 25 01:16:17.029313 kubelet[2592]: I0325 01:16:17.029319 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkzb\" (UniqueName: \"kubernetes.io/projected/2660e537-2f4e-464f-837c-22620914c0a4-kube-api-access-5zkzb\") pod \"tigera-operator-64ff5465b7-2wfl8\" (UID: \"2660e537-2f4e-464f-837c-22620914c0a4\") " pod="tigera-operator/tigera-operator-64ff5465b7-2wfl8" Mar 25 01:16:17.122589 containerd[1481]: time="2025-03-25T01:16:17.122468883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t97jn,Uid:c89cae75-fb23-403c-b013-7914f882b69a,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:17.138693 containerd[1481]: time="2025-03-25T01:16:17.138319003Z" level=info msg="connecting to shim d478ad81ba046901471da052047e4dc6952a15c83bb0c156c9cb4e901cfc6c5c" address="unix:///run/containerd/s/3d417734d18d689ef47a7380f0846b6759372d37f70ea62f697d4c2544c59156" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:17.169039 systemd[1]: Started cri-containerd-d478ad81ba046901471da052047e4dc6952a15c83bb0c156c9cb4e901cfc6c5c.scope - libcontainer container d478ad81ba046901471da052047e4dc6952a15c83bb0c156c9cb4e901cfc6c5c. 
Mar 25 01:16:17.191444 containerd[1481]: time="2025-03-25T01:16:17.191406587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t97jn,Uid:c89cae75-fb23-403c-b013-7914f882b69a,Namespace:kube-system,Attempt:0,} returns sandbox id \"d478ad81ba046901471da052047e4dc6952a15c83bb0c156c9cb4e901cfc6c5c\"" Mar 25 01:16:17.195329 containerd[1481]: time="2025-03-25T01:16:17.195298730Z" level=info msg="CreateContainer within sandbox \"d478ad81ba046901471da052047e4dc6952a15c83bb0c156c9cb4e901cfc6c5c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:16:17.219710 containerd[1481]: time="2025-03-25T01:16:17.219655686Z" level=info msg="Container a2c3d91acfbd6d32adde48ea4604f231343111eec0c6e2819be0e481712ca26d: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:17.224674 containerd[1481]: time="2025-03-25T01:16:17.224634976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-2wfl8,Uid:2660e537-2f4e-464f-837c-22620914c0a4,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:16:17.233287 containerd[1481]: time="2025-03-25T01:16:17.233212820Z" level=info msg="CreateContainer within sandbox \"d478ad81ba046901471da052047e4dc6952a15c83bb0c156c9cb4e901cfc6c5c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a2c3d91acfbd6d32adde48ea4604f231343111eec0c6e2819be0e481712ca26d\"" Mar 25 01:16:17.233991 containerd[1481]: time="2025-03-25T01:16:17.233910449Z" level=info msg="StartContainer for \"a2c3d91acfbd6d32adde48ea4604f231343111eec0c6e2819be0e481712ca26d\"" Mar 25 01:16:17.235475 containerd[1481]: time="2025-03-25T01:16:17.235448881Z" level=info msg="connecting to shim a2c3d91acfbd6d32adde48ea4604f231343111eec0c6e2819be0e481712ca26d" address="unix:///run/containerd/s/3d417734d18d689ef47a7380f0846b6759372d37f70ea62f697d4c2544c59156" protocol=ttrpc version=3 Mar 25 01:16:17.242115 containerd[1481]: time="2025-03-25T01:16:17.241779703Z" level=info msg="connecting to shim 17f710f3a607d6a79bec3d1a201ea33a1cd81ff0c9d9386cae8a344e59f643b2" address="unix:///run/containerd/s/5b738aaae706044bae196765394e5877e4e1af057ed266ef186e4cb0ebd877a8" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:17.255031 systemd[1]: Started cri-containerd-a2c3d91acfbd6d32adde48ea4604f231343111eec0c6e2819be0e481712ca26d.scope - libcontainer container a2c3d91acfbd6d32adde48ea4604f231343111eec0c6e2819be0e481712ca26d. Mar 25 01:16:17.270030 systemd[1]: Started cri-containerd-17f710f3a607d6a79bec3d1a201ea33a1cd81ff0c9d9386cae8a344e59f643b2.scope - libcontainer container 17f710f3a607d6a79bec3d1a201ea33a1cd81ff0c9d9386cae8a344e59f643b2. Mar 25 01:16:17.294850 containerd[1481]: time="2025-03-25T01:16:17.294804441Z" level=info msg="StartContainer for \"a2c3d91acfbd6d32adde48ea4604f231343111eec0c6e2819be0e481712ca26d\" returns successfully" Mar 25 01:16:17.312036 containerd[1481]: time="2025-03-25T01:16:17.310689204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-2wfl8,Uid:2660e537-2f4e-464f-837c-22620914c0a4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"17f710f3a607d6a79bec3d1a201ea33a1cd81ff0c9d9386cae8a344e59f643b2\"" Mar 25 01:16:17.314112 containerd[1481]: time="2025-03-25T01:16:17.312415094Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:16:17.946278 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount730004402.mount: Deactivated successfully. 
Mar 25 01:16:19.070554 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount599833375.mount: Deactivated successfully. Mar 25 01:16:20.117750 update_engine[1470]: I20250325 01:16:20.117666 1470 update_attempter.cc:509] Updating boot flags... Mar 25 01:16:20.148040 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2952) Mar 25 01:16:20.205939 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2956) Mar 25 01:16:20.470167 containerd[1481]: time="2025-03-25T01:16:20.470048025Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:20.471254 containerd[1481]: time="2025-03-25T01:16:20.471020948Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 25 01:16:20.472270 containerd[1481]: time="2025-03-25T01:16:20.472239970Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:20.475008 containerd[1481]: time="2025-03-25T01:16:20.474977522Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:20.475681 containerd[1481]: time="2025-03-25T01:16:20.475650538Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 3.163204442s" Mar 25 01:16:20.475744 containerd[1481]: time="2025-03-25T01:16:20.475681381Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 25 01:16:20.498036 containerd[1481]: time="2025-03-25T01:16:20.497983984Z" level=info msg="CreateContainer within sandbox \"17f710f3a607d6a79bec3d1a201ea33a1cd81ff0c9d9386cae8a344e59f643b2\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:16:20.506097 containerd[1481]: time="2025-03-25T01:16:20.504240272Z" level=info msg="Container 4e17cdd8716a8327b3d3c38789767cf0b394aca9ff099adaec69fec4c5bc6908: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:20.513753 containerd[1481]: time="2025-03-25T01:16:20.513714872Z" level=info msg="CreateContainer within sandbox \"17f710f3a607d6a79bec3d1a201ea33a1cd81ff0c9d9386cae8a344e59f643b2\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"4e17cdd8716a8327b3d3c38789767cf0b394aca9ff099adaec69fec4c5bc6908\"" Mar 25 01:16:20.514345 containerd[1481]: time="2025-03-25T01:16:20.514309602Z" level=info msg="StartContainer for \"4e17cdd8716a8327b3d3c38789767cf0b394aca9ff099adaec69fec4c5bc6908\"" Mar 25 01:16:20.515503 containerd[1481]: time="2025-03-25T01:16:20.515475380Z" level=info msg="connecting to shim 4e17cdd8716a8327b3d3c38789767cf0b394aca9ff099adaec69fec4c5bc6908" address="unix:///run/containerd/s/5b738aaae706044bae196765394e5877e4e1af057ed266ef186e4cb0ebd877a8" protocol=ttrpc version=3 Mar 25 01:16:20.559399 systemd[1]: Started cri-containerd-4e17cdd8716a8327b3d3c38789767cf0b394aca9ff099adaec69fec4c5bc6908.scope - 
libcontainer container 4e17cdd8716a8327b3d3c38789767cf0b394aca9ff099adaec69fec4c5bc6908. Mar 25 01:16:20.616291 containerd[1481]: time="2025-03-25T01:16:20.616255408Z" level=info msg="StartContainer for \"4e17cdd8716a8327b3d3c38789767cf0b394aca9ff099adaec69fec4c5bc6908\" returns successfully" Mar 25 01:16:20.846073 kubelet[2592]: I0325 01:16:20.845544 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-t97jn" podStartSLOduration=4.845525402 podStartE2EDuration="4.845525402s" podCreationTimestamp="2025-03-25 01:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:17.830069193 +0000 UTC m=+8.120012381" watchObservedRunningTime="2025-03-25 01:16:20.845525402 +0000 UTC m=+11.135468590" Mar 25 01:16:20.846073 kubelet[2592]: I0325 01:16:20.845665 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-2wfl8" podStartSLOduration=1.6695425529999999 podStartE2EDuration="4.845659733s" podCreationTimestamp="2025-03-25 01:16:16 +0000 UTC" firstStartedPulling="2025-03-25 01:16:17.311827956 +0000 UTC m=+7.601771144" lastFinishedPulling="2025-03-25 01:16:20.487945136 +0000 UTC m=+10.777888324" observedRunningTime="2025-03-25 01:16:20.845238418 +0000 UTC m=+11.135181606" watchObservedRunningTime="2025-03-25 01:16:20.845659733 +0000 UTC m=+11.135602921" Mar 25 01:16:24.230355 systemd[1]: Created slice kubepods-besteffort-pod661aabfc_9fc0_4ee6_b1ed_4238661dd884.slice - libcontainer container kubepods-besteffort-pod661aabfc_9fc0_4ee6_b1ed_4238661dd884.slice. Mar 25 01:16:24.274059 kubelet[2592]: I0325 01:16:24.274002 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/661aabfc-9fc0-4ee6-b1ed-4238661dd884-tigera-ca-bundle\") pod \"calico-typha-5659f4fdf5-rlm7f\" (UID: \"661aabfc-9fc0-4ee6-b1ed-4238661dd884\") " pod="calico-system/calico-typha-5659f4fdf5-rlm7f" Mar 25 01:16:24.275518 kubelet[2592]: I0325 01:16:24.275476 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-286vw\" (UniqueName: \"kubernetes.io/projected/661aabfc-9fc0-4ee6-b1ed-4238661dd884-kube-api-access-286vw\") pod \"calico-typha-5659f4fdf5-rlm7f\" (UID: \"661aabfc-9fc0-4ee6-b1ed-4238661dd884\") " pod="calico-system/calico-typha-5659f4fdf5-rlm7f" Mar 25 01:16:24.275518 kubelet[2592]: I0325 01:16:24.275522 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/661aabfc-9fc0-4ee6-b1ed-4238661dd884-typha-certs\") pod \"calico-typha-5659f4fdf5-rlm7f\" (UID: \"661aabfc-9fc0-4ee6-b1ed-4238661dd884\") " pod="calico-system/calico-typha-5659f4fdf5-rlm7f" Mar 25 01:16:24.313183 systemd[1]: Created slice kubepods-besteffort-podd02f58d6_3aa3_4553_b387_50adcac84b14.slice - libcontainer container kubepods-besteffort-podd02f58d6_3aa3_4553_b387_50adcac84b14.slice. 
Mar 25 01:16:24.376129 kubelet[2592]: I0325 01:16:24.376082 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d02f58d6-3aa3-4553-b387-50adcac84b14-tigera-ca-bundle\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376257 kubelet[2592]: I0325 01:16:24.376152 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d02f58d6-3aa3-4553-b387-50adcac84b14-node-certs\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376257 kubelet[2592]: I0325 01:16:24.376170 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-flexvol-driver-host\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376257 kubelet[2592]: I0325 01:16:24.376190 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-lib-modules\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376257 kubelet[2592]: I0325 01:16:24.376206 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mspcm\" (UniqueName: \"kubernetes.io/projected/d02f58d6-3aa3-4553-b387-50adcac84b14-kube-api-access-mspcm\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376257 kubelet[2592]: I0325 01:16:24.376222 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-var-run-calico\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376398 kubelet[2592]: I0325 01:16:24.376237 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-cni-log-dir\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376398 kubelet[2592]: I0325 01:16:24.376252 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-xtables-lock\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376398 kubelet[2592]: I0325 01:16:24.376265 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-cni-net-dir\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376398 kubelet[2592]: I0325 01:16:24.376296 2592 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-policysync\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376398 kubelet[2592]: I0325 01:16:24.376313 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-cni-bin-dir\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.376988 kubelet[2592]: I0325 01:16:24.376327 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d02f58d6-3aa3-4553-b387-50adcac84b14-var-lib-calico\") pod \"calico-node-564vh\" (UID: \"d02f58d6-3aa3-4553-b387-50adcac84b14\") " pod="calico-system/calico-node-564vh" Mar 25 01:16:24.428814 kubelet[2592]: E0325 01:16:24.428750 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dzt9d" podUID="c4b5976c-7a98-4d08-b430-ee7389a6d994" Mar 25 01:16:24.477356 kubelet[2592]: I0325 01:16:24.477305 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/c4b5976c-7a98-4d08-b430-ee7389a6d994-varrun\") pod \"csi-node-driver-dzt9d\" (UID: \"c4b5976c-7a98-4d08-b430-ee7389a6d994\") " pod="calico-system/csi-node-driver-dzt9d" Mar 25 01:16:24.477356 kubelet[2592]: I0325 01:16:24.477354 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c4b5976c-7a98-4d08-b430-ee7389a6d994-kubelet-dir\") pod \"csi-node-driver-dzt9d\" (UID: \"c4b5976c-7a98-4d08-b430-ee7389a6d994\") " pod="calico-system/csi-node-driver-dzt9d" Mar 25 01:16:24.477512 kubelet[2592]: I0325 01:16:24.477391 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c4b5976c-7a98-4d08-b430-ee7389a6d994-registration-dir\") pod \"csi-node-driver-dzt9d\" (UID: \"c4b5976c-7a98-4d08-b430-ee7389a6d994\") " pod="calico-system/csi-node-driver-dzt9d" Mar 25 01:16:24.478245 kubelet[2592]: I0325 01:16:24.477430 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwst2\" (UniqueName: \"kubernetes.io/projected/c4b5976c-7a98-4d08-b430-ee7389a6d994-kube-api-access-pwst2\") pod \"csi-node-driver-dzt9d\" (UID: \"c4b5976c-7a98-4d08-b430-ee7389a6d994\") " pod="calico-system/csi-node-driver-dzt9d" Mar 25 01:16:24.483397 kubelet[2592]: E0325 01:16:24.483322 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.483397 kubelet[2592]: W0325 01:16:24.483349 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.483485 kubelet[2592]: E0325 01:16:24.483445 2592 
plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.484050 kubelet[2592]: E0325 01:16:24.484016 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.484050 kubelet[2592]: W0325 01:16:24.484037 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.484144 kubelet[2592]: E0325 01:16:24.484081 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.484319 kubelet[2592]: E0325 01:16:24.484301 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.484319 kubelet[2592]: W0325 01:16:24.484315 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.484396 kubelet[2592]: E0325 01:16:24.484374 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.484522 kubelet[2592]: E0325 01:16:24.484511 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.484543 kubelet[2592]: W0325 01:16:24.484522 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.484562 kubelet[2592]: E0325 01:16:24.484543 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.484686 kubelet[2592]: E0325 01:16:24.484675 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.484710 kubelet[2592]: W0325 01:16:24.484686 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.484710 kubelet[2592]: E0325 01:16:24.484700 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.484948 kubelet[2592]: E0325 01:16:24.484935 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.484981 kubelet[2592]: W0325 01:16:24.484949 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.484981 kubelet[2592]: E0325 01:16:24.484963 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.485111 kubelet[2592]: E0325 01:16:24.485101 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.485146 kubelet[2592]: W0325 01:16:24.485110 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.485146 kubelet[2592]: E0325 01:16:24.485131 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.485345 kubelet[2592]: E0325 01:16:24.485328 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.485345 kubelet[2592]: W0325 01:16:24.485337 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.485413 kubelet[2592]: E0325 01:16:24.485396 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.485515 kubelet[2592]: E0325 01:16:24.485503 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.485515 kubelet[2592]: W0325 01:16:24.485513 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.485561 kubelet[2592]: E0325 01:16:24.485534 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.485561 kubelet[2592]: I0325 01:16:24.485557 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c4b5976c-7a98-4d08-b430-ee7389a6d994-socket-dir\") pod \"csi-node-driver-dzt9d\" (UID: \"c4b5976c-7a98-4d08-b430-ee7389a6d994\") " pod="calico-system/csi-node-driver-dzt9d" Mar 25 01:16:24.485657 kubelet[2592]: E0325 01:16:24.485648 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.485684 kubelet[2592]: W0325 01:16:24.485657 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.485684 kubelet[2592]: E0325 01:16:24.485670 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.485832 kubelet[2592]: E0325 01:16:24.485822 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.485863 kubelet[2592]: W0325 01:16:24.485831 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.485863 kubelet[2592]: E0325 01:16:24.485843 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.485992 kubelet[2592]: E0325 01:16:24.485981 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.485992 kubelet[2592]: W0325 01:16:24.485991 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.486039 kubelet[2592]: E0325 01:16:24.486001 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.486176 kubelet[2592]: E0325 01:16:24.486165 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.486176 kubelet[2592]: W0325 01:16:24.486175 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.486234 kubelet[2592]: E0325 01:16:24.486184 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.486912 kubelet[2592]: E0325 01:16:24.486802 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.486912 kubelet[2592]: W0325 01:16:24.486818 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.486912 kubelet[2592]: E0325 01:16:24.486838 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.487108 kubelet[2592]: E0325 01:16:24.487089 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.487149 kubelet[2592]: W0325 01:16:24.487107 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.487149 kubelet[2592]: E0325 01:16:24.487135 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.487312 kubelet[2592]: E0325 01:16:24.487302 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.487312 kubelet[2592]: W0325 01:16:24.487312 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.487373 kubelet[2592]: E0325 01:16:24.487322 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.487499 kubelet[2592]: E0325 01:16:24.487469 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.487499 kubelet[2592]: W0325 01:16:24.487479 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.487499 kubelet[2592]: E0325 01:16:24.487492 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.487659 kubelet[2592]: E0325 01:16:24.487645 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.487659 kubelet[2592]: W0325 01:16:24.487656 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.487716 kubelet[2592]: E0325 01:16:24.487668 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.487895 kubelet[2592]: E0325 01:16:24.487880 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.487895 kubelet[2592]: W0325 01:16:24.487894 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.488021 kubelet[2592]: E0325 01:16:24.487908 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.488334 kubelet[2592]: E0325 01:16:24.488317 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.488334 kubelet[2592]: W0325 01:16:24.488333 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.488417 kubelet[2592]: E0325 01:16:24.488396 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.488554 kubelet[2592]: E0325 01:16:24.488527 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.488554 kubelet[2592]: W0325 01:16:24.488538 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.488707 kubelet[2592]: E0325 01:16:24.488561 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.488749 kubelet[2592]: E0325 01:16:24.488741 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.488780 kubelet[2592]: W0325 01:16:24.488751 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.488780 kubelet[2592]: E0325 01:16:24.488766 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.488970 kubelet[2592]: E0325 01:16:24.488953 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.488970 kubelet[2592]: W0325 01:16:24.488966 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.489028 kubelet[2592]: E0325 01:16:24.489010 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.489128 kubelet[2592]: E0325 01:16:24.489105 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.489128 kubelet[2592]: W0325 01:16:24.489124 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.489181 kubelet[2592]: E0325 01:16:24.489168 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.489287 kubelet[2592]: E0325 01:16:24.489268 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.489287 kubelet[2592]: W0325 01:16:24.489278 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.489336 kubelet[2592]: E0325 01:16:24.489291 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.489438 kubelet[2592]: E0325 01:16:24.489424 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.489438 kubelet[2592]: W0325 01:16:24.489435 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.489527 kubelet[2592]: E0325 01:16:24.489443 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.489610 kubelet[2592]: E0325 01:16:24.489596 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.489610 kubelet[2592]: W0325 01:16:24.489607 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.489665 kubelet[2592]: E0325 01:16:24.489619 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.489763 kubelet[2592]: E0325 01:16:24.489747 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.489763 kubelet[2592]: W0325 01:16:24.489757 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.489812 kubelet[2592]: E0325 01:16:24.489768 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.489930 kubelet[2592]: E0325 01:16:24.489910 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.489930 kubelet[2592]: W0325 01:16:24.489921 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.489989 kubelet[2592]: E0325 01:16:24.489936 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.490166 kubelet[2592]: E0325 01:16:24.490149 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.490166 kubelet[2592]: W0325 01:16:24.490161 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.490224 kubelet[2592]: E0325 01:16:24.490169 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.490317 kubelet[2592]: E0325 01:16:24.490304 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.490317 kubelet[2592]: W0325 01:16:24.490315 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.490368 kubelet[2592]: E0325 01:16:24.490323 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.490490 kubelet[2592]: E0325 01:16:24.490480 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.490523 kubelet[2592]: W0325 01:16:24.490493 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.490552 kubelet[2592]: E0325 01:16:24.490531 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.490651 kubelet[2592]: E0325 01:16:24.490636 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.490651 kubelet[2592]: W0325 01:16:24.490646 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.490698 kubelet[2592]: E0325 01:16:24.490654 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.490799 kubelet[2592]: E0325 01:16:24.490788 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.490799 kubelet[2592]: W0325 01:16:24.490796 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.490799 kubelet[2592]: E0325 01:16:24.490804 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.491000 kubelet[2592]: E0325 01:16:24.490989 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.491000 kubelet[2592]: W0325 01:16:24.490999 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.491049 kubelet[2592]: E0325 01:16:24.491007 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.496891 kubelet[2592]: E0325 01:16:24.495908 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.496891 kubelet[2592]: W0325 01:16:24.495926 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.496891 kubelet[2592]: E0325 01:16:24.495938 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.534414 containerd[1481]: time="2025-03-25T01:16:24.534260947Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5659f4fdf5-rlm7f,Uid:661aabfc-9fc0-4ee6-b1ed-4238661dd884,Namespace:calico-system,Attempt:0,}" Mar 25 01:16:24.577714 containerd[1481]: time="2025-03-25T01:16:24.577672205Z" level=info msg="connecting to shim fbdcf2797530412d93ec9cc45817dd078d343165984b83190a706fced06ec4ca" address="unix:///run/containerd/s/ef174d3317f60eea0606249154b0ae5607968f52acf61210b0cca55095095245" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:24.590040 kubelet[2592]: E0325 01:16:24.590014 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.590040 kubelet[2592]: W0325 01:16:24.590037 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.590356 kubelet[2592]: E0325 01:16:24.590057 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.590866 kubelet[2592]: E0325 01:16:24.590697 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.591072 kubelet[2592]: W0325 01:16:24.590997 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.591072 kubelet[2592]: E0325 01:16:24.591029 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.591947 kubelet[2592]: E0325 01:16:24.591923 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.592055 kubelet[2592]: W0325 01:16:24.592038 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.592097 kubelet[2592]: E0325 01:16:24.592061 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.592579 kubelet[2592]: E0325 01:16:24.592538 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.592579 kubelet[2592]: W0325 01:16:24.592555 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.592660 kubelet[2592]: E0325 01:16:24.592603 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.592817 kubelet[2592]: E0325 01:16:24.592797 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.592817 kubelet[2592]: W0325 01:16:24.592811 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.593117 kubelet[2592]: E0325 01:16:24.592826 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.593117 kubelet[2592]: E0325 01:16:24.593073 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.593117 kubelet[2592]: W0325 01:16:24.593086 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.593117 kubelet[2592]: E0325 01:16:24.593105 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.593652 kubelet[2592]: E0325 01:16:24.593351 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.593652 kubelet[2592]: W0325 01:16:24.593461 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.593652 kubelet[2592]: E0325 01:16:24.593481 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.596025 kubelet[2592]: E0325 01:16:24.595956 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.596025 kubelet[2592]: W0325 01:16:24.595973 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.596025 kubelet[2592]: E0325 01:16:24.596020 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.599057 kubelet[2592]: E0325 01:16:24.596232 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599057 kubelet[2592]: W0325 01:16:24.596250 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599057 kubelet[2592]: E0325 01:16:24.596302 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.599057 kubelet[2592]: E0325 01:16:24.596950 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599057 kubelet[2592]: W0325 01:16:24.596965 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599057 kubelet[2592]: E0325 01:16:24.597061 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.599057 kubelet[2592]: E0325 01:16:24.597191 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599057 kubelet[2592]: W0325 01:16:24.597204 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599057 kubelet[2592]: E0325 01:16:24.597429 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.599057 kubelet[2592]: E0325 01:16:24.597634 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599406 kubelet[2592]: W0325 01:16:24.597646 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599406 kubelet[2592]: E0325 01:16:24.597666 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.599406 kubelet[2592]: E0325 01:16:24.597852 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599406 kubelet[2592]: W0325 01:16:24.597874 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599406 kubelet[2592]: E0325 01:16:24.597967 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.599406 kubelet[2592]: E0325 01:16:24.598392 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599406 kubelet[2592]: W0325 01:16:24.598424 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599406 kubelet[2592]: E0325 01:16:24.598498 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.599406 kubelet[2592]: E0325 01:16:24.598716 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599406 kubelet[2592]: W0325 01:16:24.598729 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599598 kubelet[2592]: E0325 01:16:24.598767 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.599598 kubelet[2592]: E0325 01:16:24.599096 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.599598 kubelet[2592]: W0325 01:16:24.599109 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.599598 kubelet[2592]: E0325 01:16:24.599164 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.600991 kubelet[2592]: E0325 01:16:24.599849 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.600991 kubelet[2592]: W0325 01:16:24.599895 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.600991 kubelet[2592]: E0325 01:16:24.599930 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.600991 kubelet[2592]: E0325 01:16:24.600299 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.600991 kubelet[2592]: W0325 01:16:24.600312 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.600991 kubelet[2592]: E0325 01:16:24.600346 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.600991 kubelet[2592]: E0325 01:16:24.600820 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.600991 kubelet[2592]: W0325 01:16:24.600834 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.600991 kubelet[2592]: E0325 01:16:24.600900 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.601404 kubelet[2592]: E0325 01:16:24.601371 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.601404 kubelet[2592]: W0325 01:16:24.601394 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.601641 kubelet[2592]: E0325 01:16:24.601613 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.601641 kubelet[2592]: W0325 01:16:24.601627 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.601806 kubelet[2592]: E0325 01:16:24.601781 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.601806 kubelet[2592]: W0325 01:16:24.601793 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.602295 kubelet[2592]: E0325 01:16:24.601944 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.602295 kubelet[2592]: W0325 01:16:24.601957 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.602295 kubelet[2592]: E0325 01:16:24.601969 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.602295 kubelet[2592]: E0325 01:16:24.601993 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.603316 kubelet[2592]: E0325 01:16:24.602801 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.603316 kubelet[2592]: W0325 01:16:24.602823 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.603316 kubelet[2592]: E0325 01:16:24.602821 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.603316 kubelet[2592]: E0325 01:16:24.602936 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.603316 kubelet[2592]: E0325 01:16:24.603042 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.603316 kubelet[2592]: E0325 01:16:24.603244 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.603316 kubelet[2592]: W0325 01:16:24.603257 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.603316 kubelet[2592]: E0325 01:16:24.603269 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.605028 systemd[1]: Started cri-containerd-fbdcf2797530412d93ec9cc45817dd078d343165984b83190a706fced06ec4ca.scope - libcontainer container fbdcf2797530412d93ec9cc45817dd078d343165984b83190a706fced06ec4ca. Mar 25 01:16:24.610471 kubelet[2592]: E0325 01:16:24.610446 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.610471 kubelet[2592]: W0325 01:16:24.610465 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.610570 kubelet[2592]: E0325 01:16:24.610482 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.617487 containerd[1481]: time="2025-03-25T01:16:24.617451251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-564vh,Uid:d02f58d6-3aa3-4553-b387-50adcac84b14,Namespace:calico-system,Attempt:0,}" Mar 25 01:16:24.643519 containerd[1481]: time="2025-03-25T01:16:24.643475460Z" level=info msg="connecting to shim 485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc" address="unix:///run/containerd/s/fa0f193fd5ad0b5759ebe17c80c1b62bbdaad6cce4ac550a68c5bb941945f15b" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:24.651607 kubelet[2592]: E0325 01:16:24.651563 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.651607 kubelet[2592]: W0325 01:16:24.651600 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.651993 kubelet[2592]: E0325 01:16:24.651621 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.651993 kubelet[2592]: E0325 01:16:24.651825 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.651993 kubelet[2592]: W0325 01:16:24.651835 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.651993 kubelet[2592]: E0325 01:16:24.651845 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.652278 kubelet[2592]: E0325 01:16:24.652074 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.652278 kubelet[2592]: W0325 01:16:24.652084 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.652278 kubelet[2592]: E0325 01:16:24.652094 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.652387 kubelet[2592]: E0325 01:16:24.652354 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.652387 kubelet[2592]: W0325 01:16:24.652364 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.652387 kubelet[2592]: E0325 01:16:24.652374 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:24.652649 kubelet[2592]: E0325 01:16:24.652591 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:24.652649 kubelet[2592]: W0325 01:16:24.652601 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:24.652649 kubelet[2592]: E0325 01:16:24.652610 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:24.654505 containerd[1481]: time="2025-03-25T01:16:24.654447263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5659f4fdf5-rlm7f,Uid:661aabfc-9fc0-4ee6-b1ed-4238661dd884,Namespace:calico-system,Attempt:0,} returns sandbox id \"fbdcf2797530412d93ec9cc45817dd078d343165984b83190a706fced06ec4ca\"" Mar 25 01:16:24.656878 containerd[1481]: time="2025-03-25T01:16:24.656043774Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 01:16:24.670027 systemd[1]: Started cri-containerd-485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc.scope - libcontainer container 485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc. Mar 25 01:16:24.702278 containerd[1481]: time="2025-03-25T01:16:24.702238785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-564vh,Uid:d02f58d6-3aa3-4553-b387-50adcac84b14,Namespace:calico-system,Attempt:0,} returns sandbox id \"485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc\"" Mar 25 01:16:25.794355 kubelet[2592]: E0325 01:16:25.794307 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dzt9d" podUID="c4b5976c-7a98-4d08-b430-ee7389a6d994" Mar 25 01:16:25.925876 containerd[1481]: time="2025-03-25T01:16:25.925815764Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:25.926698 containerd[1481]: time="2025-03-25T01:16:25.926558294Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 25 01:16:25.927391 containerd[1481]: time="2025-03-25T01:16:25.927331225Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:25.929328 containerd[1481]: time="2025-03-25T01:16:25.929261873Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:25.929922 containerd[1481]: time="2025-03-25T01:16:25.929741025Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 1.273665409s" Mar 25 01:16:25.929922 containerd[1481]: 
time="2025-03-25T01:16:25.929772787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 25 01:16:25.931045 containerd[1481]: time="2025-03-25T01:16:25.930839018Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 01:16:25.937641 containerd[1481]: time="2025-03-25T01:16:25.937608787Z" level=info msg="CreateContainer within sandbox \"fbdcf2797530412d93ec9cc45817dd078d343165984b83190a706fced06ec4ca\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:16:25.944218 containerd[1481]: time="2025-03-25T01:16:25.944170822Z" level=info msg="Container d458750907c2dd68d4395e5a87970ef3f11d1586c625da7db7edaead5edc6d42: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:25.955774 containerd[1481]: time="2025-03-25T01:16:25.955676466Z" level=info msg="CreateContainer within sandbox \"fbdcf2797530412d93ec9cc45817dd078d343165984b83190a706fced06ec4ca\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"d458750907c2dd68d4395e5a87970ef3f11d1586c625da7db7edaead5edc6d42\"" Mar 25 01:16:25.956879 containerd[1481]: time="2025-03-25T01:16:25.956365872Z" level=info msg="StartContainer for \"d458750907c2dd68d4395e5a87970ef3f11d1586c625da7db7edaead5edc6d42\"" Mar 25 01:16:25.957612 containerd[1481]: time="2025-03-25T01:16:25.957576792Z" level=info msg="connecting to shim d458750907c2dd68d4395e5a87970ef3f11d1586c625da7db7edaead5edc6d42" address="unix:///run/containerd/s/ef174d3317f60eea0606249154b0ae5607968f52acf61210b0cca55095095245" protocol=ttrpc version=3 Mar 25 01:16:25.984051 systemd[1]: Started cri-containerd-d458750907c2dd68d4395e5a87970ef3f11d1586c625da7db7edaead5edc6d42.scope - libcontainer container d458750907c2dd68d4395e5a87970ef3f11d1586c625da7db7edaead5edc6d42. Mar 25 01:16:26.055709 containerd[1481]: time="2025-03-25T01:16:26.055594337Z" level=info msg="StartContainer for \"d458750907c2dd68d4395e5a87970ef3f11d1586c625da7db7edaead5edc6d42\" returns successfully" Mar 25 01:16:26.859959 kubelet[2592]: I0325 01:16:26.859899 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5659f4fdf5-rlm7f" podStartSLOduration=1.5851724059999999 podStartE2EDuration="2.859880805s" podCreationTimestamp="2025-03-25 01:16:24 +0000 UTC" firstStartedPulling="2025-03-25 01:16:24.655735112 +0000 UTC m=+14.945678300" lastFinishedPulling="2025-03-25 01:16:25.930443551 +0000 UTC m=+16.220386699" observedRunningTime="2025-03-25 01:16:26.859240405 +0000 UTC m=+17.149183593" watchObservedRunningTime="2025-03-25 01:16:26.859880805 +0000 UTC m=+17.149823993" Mar 25 01:16:26.868280 kubelet[2592]: E0325 01:16:26.868249 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.868280 kubelet[2592]: W0325 01:16:26.868276 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.868425 kubelet[2592]: E0325 01:16:26.868297 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:26.868505 kubelet[2592]: E0325 01:16:26.868494 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.868505 kubelet[2592]: W0325 01:16:26.868505 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.868572 kubelet[2592]: E0325 01:16:26.868515 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.868721 kubelet[2592]: E0325 01:16:26.868710 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.868721 kubelet[2592]: W0325 01:16:26.868721 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.868776 kubelet[2592]: E0325 01:16:26.868730 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.868949 kubelet[2592]: E0325 01:16:26.868914 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.868949 kubelet[2592]: W0325 01:16:26.868948 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.869017 kubelet[2592]: E0325 01:16:26.868959 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.869136 kubelet[2592]: E0325 01:16:26.869125 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.869136 kubelet[2592]: W0325 01:16:26.869135 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.869191 kubelet[2592]: E0325 01:16:26.869144 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.869310 kubelet[2592]: E0325 01:16:26.869299 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.869350 kubelet[2592]: W0325 01:16:26.869310 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.869350 kubelet[2592]: E0325 01:16:26.869319 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:26.869533 kubelet[2592]: E0325 01:16:26.869490 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.869533 kubelet[2592]: W0325 01:16:26.869533 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.869592 kubelet[2592]: E0325 01:16:26.869542 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.869738 kubelet[2592]: E0325 01:16:26.869725 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.869770 kubelet[2592]: W0325 01:16:26.869738 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.869770 kubelet[2592]: E0325 01:16:26.869748 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.869939 kubelet[2592]: E0325 01:16:26.869929 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.869939 kubelet[2592]: W0325 01:16:26.869939 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.870008 kubelet[2592]: E0325 01:16:26.869948 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.870120 kubelet[2592]: E0325 01:16:26.870107 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.870120 kubelet[2592]: W0325 01:16:26.870118 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.870186 kubelet[2592]: E0325 01:16:26.870127 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.870319 kubelet[2592]: E0325 01:16:26.870305 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.870319 kubelet[2592]: W0325 01:16:26.870317 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.870389 kubelet[2592]: E0325 01:16:26.870325 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:26.870509 kubelet[2592]: E0325 01:16:26.870499 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.870509 kubelet[2592]: W0325 01:16:26.870509 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.870569 kubelet[2592]: E0325 01:16:26.870517 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.870673 kubelet[2592]: E0325 01:16:26.870660 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.870673 kubelet[2592]: W0325 01:16:26.870671 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.870727 kubelet[2592]: E0325 01:16:26.870678 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.870873 kubelet[2592]: E0325 01:16:26.870847 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.870873 kubelet[2592]: W0325 01:16:26.870872 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.870932 kubelet[2592]: E0325 01:16:26.870880 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.871078 kubelet[2592]: E0325 01:16:26.871066 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.871078 kubelet[2592]: W0325 01:16:26.871077 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.871140 kubelet[2592]: E0325 01:16:26.871086 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.914652 kubelet[2592]: E0325 01:16:26.914621 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.914652 kubelet[2592]: W0325 01:16:26.914644 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.914652 kubelet[2592]: E0325 01:16:26.914664 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:26.914937 kubelet[2592]: E0325 01:16:26.914922 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.914937 kubelet[2592]: W0325 01:16:26.914934 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.914988 kubelet[2592]: E0325 01:16:26.914946 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.915150 kubelet[2592]: E0325 01:16:26.915136 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.915150 kubelet[2592]: W0325 01:16:26.915148 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.915211 kubelet[2592]: E0325 01:16:26.915160 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.915378 kubelet[2592]: E0325 01:16:26.915366 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.915378 kubelet[2592]: W0325 01:16:26.915377 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.915445 kubelet[2592]: E0325 01:16:26.915389 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.915546 kubelet[2592]: E0325 01:16:26.915534 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.915546 kubelet[2592]: W0325 01:16:26.915544 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.915595 kubelet[2592]: E0325 01:16:26.915553 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.915723 kubelet[2592]: E0325 01:16:26.915711 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.915744 kubelet[2592]: W0325 01:16:26.915723 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.915744 kubelet[2592]: E0325 01:16:26.915735 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:26.915938 kubelet[2592]: E0325 01:16:26.915909 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.915969 kubelet[2592]: W0325 01:16:26.915940 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.915994 kubelet[2592]: E0325 01:16:26.915968 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.916180 kubelet[2592]: E0325 01:16:26.916165 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.916180 kubelet[2592]: W0325 01:16:26.916177 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.916243 kubelet[2592]: E0325 01:16:26.916208 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.916385 kubelet[2592]: E0325 01:16:26.916372 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.916385 kubelet[2592]: W0325 01:16:26.916383 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.916441 kubelet[2592]: E0325 01:16:26.916410 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.916599 kubelet[2592]: E0325 01:16:26.916586 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.916599 kubelet[2592]: W0325 01:16:26.916597 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.916649 kubelet[2592]: E0325 01:16:26.916611 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.916895 kubelet[2592]: E0325 01:16:26.916880 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.916895 kubelet[2592]: W0325 01:16:26.916895 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.916955 kubelet[2592]: E0325 01:16:26.916909 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:26.917081 kubelet[2592]: E0325 01:16:26.917068 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.917081 kubelet[2592]: W0325 01:16:26.917079 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.917129 kubelet[2592]: E0325 01:16:26.917096 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.917296 kubelet[2592]: E0325 01:16:26.917274 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.917296 kubelet[2592]: W0325 01:16:26.917293 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.917346 kubelet[2592]: E0325 01:16:26.917306 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.917698 kubelet[2592]: E0325 01:16:26.917667 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.917698 kubelet[2592]: W0325 01:16:26.917681 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.917698 kubelet[2592]: E0325 01:16:26.917695 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.917902 kubelet[2592]: E0325 01:16:26.917889 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.917925 kubelet[2592]: W0325 01:16:26.917902 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.917925 kubelet[2592]: E0325 01:16:26.917914 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.918125 kubelet[2592]: E0325 01:16:26.918113 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.918148 kubelet[2592]: W0325 01:16:26.918124 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.918148 kubelet[2592]: E0325 01:16:26.918136 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:16:26.918670 kubelet[2592]: E0325 01:16:26.918646 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.918670 kubelet[2592]: W0325 01:16:26.918662 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.918722 kubelet[2592]: E0325 01:16:26.918673 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.925370 kubelet[2592]: E0325 01:16:26.925342 2592 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:16:26.925370 kubelet[2592]: W0325 01:16:26.925362 2592 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:16:26.925370 kubelet[2592]: E0325 01:16:26.925376 2592 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:16:26.997161 containerd[1481]: time="2025-03-25T01:16:26.997109705Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:26.997659 containerd[1481]: time="2025-03-25T01:16:26.997607096Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 25 01:16:26.998677 containerd[1481]: time="2025-03-25T01:16:26.998624161Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:27.000768 containerd[1481]: time="2025-03-25T01:16:27.000606527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:27.001196 containerd[1481]: time="2025-03-25T01:16:27.001168162Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.070280862s" Mar 25 01:16:27.001246 containerd[1481]: time="2025-03-25T01:16:27.001201684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 25 01:16:27.007540 containerd[1481]: time="2025-03-25T01:16:27.007266694Z" level=info msg="CreateContainer within sandbox \"485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:16:27.017622 containerd[1481]: time="2025-03-25T01:16:27.017011084Z" level=info msg="Container 28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1: 
CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:27.040823 containerd[1481]: time="2025-03-25T01:16:27.040762444Z" level=info msg="CreateContainer within sandbox \"485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1\"" Mar 25 01:16:27.041898 containerd[1481]: time="2025-03-25T01:16:27.041647218Z" level=info msg="StartContainer for \"28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1\"" Mar 25 01:16:27.043277 containerd[1481]: time="2025-03-25T01:16:27.043236194Z" level=info msg="connecting to shim 28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1" address="unix:///run/containerd/s/fa0f193fd5ad0b5759ebe17c80c1b62bbdaad6cce4ac550a68c5bb941945f15b" protocol=ttrpc version=3 Mar 25 01:16:27.068023 systemd[1]: Started cri-containerd-28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1.scope - libcontainer container 28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1. Mar 25 01:16:27.101910 containerd[1481]: time="2025-03-25T01:16:27.101847347Z" level=info msg="StartContainer for \"28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1\" returns successfully" Mar 25 01:16:27.126056 systemd[1]: cri-containerd-28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1.scope: Deactivated successfully. Mar 25 01:16:27.158555 containerd[1481]: time="2025-03-25T01:16:27.158379014Z" level=info msg="received exit event container_id:\"28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1\" id:\"28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1\" pid:3257 exited_at:{seconds:1742865387 nanos:144240837}" Mar 25 01:16:27.158555 containerd[1481]: time="2025-03-25T01:16:27.158492061Z" level=info msg="TaskExit event in podsandbox handler container_id:\"28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1\" id:\"28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1\" pid:3257 exited_at:{seconds:1742865387 nanos:144240837}" Mar 25 01:16:27.197352 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-28ca580dcc310934ddd966591a53cdf8bd01f90d38d12f01b1b19fec9f557fe1-rootfs.mount: Deactivated successfully. 
Mar 25 01:16:27.794690 kubelet[2592]: E0325 01:16:27.794642 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dzt9d" podUID="c4b5976c-7a98-4d08-b430-ee7389a6d994" Mar 25 01:16:27.855193 kubelet[2592]: I0325 01:16:27.855100 2592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:16:27.856219 containerd[1481]: time="2025-03-25T01:16:27.856017783Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:16:29.795019 kubelet[2592]: E0325 01:16:29.794950 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dzt9d" podUID="c4b5976c-7a98-4d08-b430-ee7389a6d994" Mar 25 01:16:30.256689 containerd[1481]: time="2025-03-25T01:16:30.256645654Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:30.257561 containerd[1481]: time="2025-03-25T01:16:30.257456377Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 25 01:16:30.258999 containerd[1481]: time="2025-03-25T01:16:30.258404668Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:30.262917 containerd[1481]: time="2025-03-25T01:16:30.262862946Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:30.263515 containerd[1481]: time="2025-03-25T01:16:30.263485499Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 2.407429634s" Mar 25 01:16:30.263569 containerd[1481]: time="2025-03-25T01:16:30.263514380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 25 01:16:30.265328 containerd[1481]: time="2025-03-25T01:16:30.265298915Z" level=info msg="CreateContainer within sandbox \"485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:16:30.273213 containerd[1481]: time="2025-03-25T01:16:30.271480045Z" level=info msg="Container db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:30.280255 containerd[1481]: time="2025-03-25T01:16:30.279721484Z" level=info msg="CreateContainer within sandbox \"485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b\"" Mar 25 01:16:30.280406 containerd[1481]: 
time="2025-03-25T01:16:30.280352797Z" level=info msg="StartContainer for \"db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b\"" Mar 25 01:16:30.282065 containerd[1481]: time="2025-03-25T01:16:30.281946922Z" level=info msg="connecting to shim db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b" address="unix:///run/containerd/s/fa0f193fd5ad0b5759ebe17c80c1b62bbdaad6cce4ac550a68c5bb941945f15b" protocol=ttrpc version=3 Mar 25 01:16:30.302008 systemd[1]: Started cri-containerd-db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b.scope - libcontainer container db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b. Mar 25 01:16:30.359759 containerd[1481]: time="2025-03-25T01:16:30.359713866Z" level=info msg="StartContainer for \"db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b\" returns successfully" Mar 25 01:16:30.887058 containerd[1481]: time="2025-03-25T01:16:30.886981679Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:16:30.889533 systemd[1]: cri-containerd-db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b.scope: Deactivated successfully. Mar 25 01:16:30.890501 systemd[1]: cri-containerd-db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b.scope: Consumed 416ms CPU time, 160.4M memory peak, 4K read from disk, 150.3M written to disk. Mar 25 01:16:30.896423 containerd[1481]: time="2025-03-25T01:16:30.896333698Z" level=info msg="received exit event container_id:\"db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b\" id:\"db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b\" pid:3317 exited_at:{seconds:1742865390 nanos:896102485}" Mar 25 01:16:30.896501 containerd[1481]: time="2025-03-25T01:16:30.896442144Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b\" id:\"db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b\" pid:3317 exited_at:{seconds:1742865390 nanos:896102485}" Mar 25 01:16:30.910147 kubelet[2592]: I0325 01:16:30.909845 2592 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Mar 25 01:16:30.939382 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-db336cb09ccc6653b1e93e57362e0a28b930827514fc135f91a54730bd85520b-rootfs.mount: Deactivated successfully. Mar 25 01:16:31.006869 systemd[1]: Created slice kubepods-besteffort-pod557c52a7_2de5_4a05_ba20_15e253ad24e8.slice - libcontainer container kubepods-besteffort-pod557c52a7_2de5_4a05_ba20_15e253ad24e8.slice. Mar 25 01:16:31.014296 systemd[1]: Created slice kubepods-besteffort-pod685bedc2_f03f_4bb4_896f_cbc789cd2327.slice - libcontainer container kubepods-besteffort-pod685bedc2_f03f_4bb4_896f_cbc789cd2327.slice. Mar 25 01:16:31.021429 systemd[1]: Created slice kubepods-burstable-pod2cb4ba0e_3c48_4c29_9cd6_3c852ad5ee46.slice - libcontainer container kubepods-burstable-pod2cb4ba0e_3c48_4c29_9cd6_3c852ad5ee46.slice. Mar 25 01:16:31.027307 systemd[1]: Created slice kubepods-burstable-pod801dfadb_ac93_47ed_94b9_073599ad9abd.slice - libcontainer container kubepods-burstable-pod801dfadb_ac93_47ed_94b9_073599ad9abd.slice. 
Mar 25 01:16:31.032925 systemd[1]: Created slice kubepods-besteffort-pode10f7734_b525_4202_aa84_b2555758598b.slice - libcontainer container kubepods-besteffort-pode10f7734_b525_4202_aa84_b2555758598b.slice. Mar 25 01:16:31.041551 kubelet[2592]: I0325 01:16:31.041523 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/685bedc2-f03f-4bb4-896f-cbc789cd2327-tigera-ca-bundle\") pod \"calico-kube-controllers-68dbc864b6-dsj69\" (UID: \"685bedc2-f03f-4bb4-896f-cbc789cd2327\") " pod="calico-system/calico-kube-controllers-68dbc864b6-dsj69" Mar 25 01:16:31.041551 kubelet[2592]: I0325 01:16:31.041560 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/801dfadb-ac93-47ed-94b9-073599ad9abd-config-volume\") pod \"coredns-6f6b679f8f-9fkq7\" (UID: \"801dfadb-ac93-47ed-94b9-073599ad9abd\") " pod="kube-system/coredns-6f6b679f8f-9fkq7" Mar 25 01:16:31.041697 kubelet[2592]: I0325 01:16:31.041581 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxckf\" (UniqueName: \"kubernetes.io/projected/557c52a7-2de5-4a05-ba20-15e253ad24e8-kube-api-access-xxckf\") pod \"calico-apiserver-8c9cf7c59-zl2f7\" (UID: \"557c52a7-2de5-4a05-ba20-15e253ad24e8\") " pod="calico-apiserver/calico-apiserver-8c9cf7c59-zl2f7" Mar 25 01:16:31.041697 kubelet[2592]: I0325 01:16:31.041600 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sz2m\" (UniqueName: \"kubernetes.io/projected/685bedc2-f03f-4bb4-896f-cbc789cd2327-kube-api-access-5sz2m\") pod \"calico-kube-controllers-68dbc864b6-dsj69\" (UID: \"685bedc2-f03f-4bb4-896f-cbc789cd2327\") " pod="calico-system/calico-kube-controllers-68dbc864b6-dsj69" Mar 25 01:16:31.041697 kubelet[2592]: I0325 01:16:31.041618 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/557c52a7-2de5-4a05-ba20-15e253ad24e8-calico-apiserver-certs\") pod \"calico-apiserver-8c9cf7c59-zl2f7\" (UID: \"557c52a7-2de5-4a05-ba20-15e253ad24e8\") " pod="calico-apiserver/calico-apiserver-8c9cf7c59-zl2f7" Mar 25 01:16:31.041697 kubelet[2592]: I0325 01:16:31.041634 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27dj5\" (UniqueName: \"kubernetes.io/projected/801dfadb-ac93-47ed-94b9-073599ad9abd-kube-api-access-27dj5\") pod \"coredns-6f6b679f8f-9fkq7\" (UID: \"801dfadb-ac93-47ed-94b9-073599ad9abd\") " pod="kube-system/coredns-6f6b679f8f-9fkq7" Mar 25 01:16:31.041697 kubelet[2592]: I0325 01:16:31.041650 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkcbl\" (UniqueName: \"kubernetes.io/projected/e10f7734-b525-4202-aa84-b2555758598b-kube-api-access-lkcbl\") pod \"calico-apiserver-8c9cf7c59-hf464\" (UID: \"e10f7734-b525-4202-aa84-b2555758598b\") " pod="calico-apiserver/calico-apiserver-8c9cf7c59-hf464" Mar 25 01:16:31.041810 kubelet[2592]: I0325 01:16:31.041668 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46-config-volume\") pod \"coredns-6f6b679f8f-ptvcd\" (UID: \"2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46\") 
" pod="kube-system/coredns-6f6b679f8f-ptvcd" Mar 25 01:16:31.041810 kubelet[2592]: I0325 01:16:31.041687 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq5zt\" (UniqueName: \"kubernetes.io/projected/2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46-kube-api-access-xq5zt\") pod \"coredns-6f6b679f8f-ptvcd\" (UID: \"2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46\") " pod="kube-system/coredns-6f6b679f8f-ptvcd" Mar 25 01:16:31.041810 kubelet[2592]: I0325 01:16:31.041704 2592 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e10f7734-b525-4202-aa84-b2555758598b-calico-apiserver-certs\") pod \"calico-apiserver-8c9cf7c59-hf464\" (UID: \"e10f7734-b525-4202-aa84-b2555758598b\") " pod="calico-apiserver/calico-apiserver-8c9cf7c59-hf464" Mar 25 01:16:31.310299 containerd[1481]: time="2025-03-25T01:16:31.310250568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-zl2f7,Uid:557c52a7-2de5-4a05-ba20-15e253ad24e8,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:16:31.318326 containerd[1481]: time="2025-03-25T01:16:31.318014525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68dbc864b6-dsj69,Uid:685bedc2-f03f-4bb4-896f-cbc789cd2327,Namespace:calico-system,Attempt:0,}" Mar 25 01:16:31.324815 containerd[1481]: time="2025-03-25T01:16:31.324787191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ptvcd,Uid:2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:31.337844 containerd[1481]: time="2025-03-25T01:16:31.334402163Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9fkq7,Uid:801dfadb-ac93-47ed-94b9-073599ad9abd,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:31.350887 containerd[1481]: time="2025-03-25T01:16:31.350733278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-hf464,Uid:e10f7734-b525-4202-aa84-b2555758598b,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:16:31.732148 containerd[1481]: time="2025-03-25T01:16:31.732095899Z" level=error msg="Failed to destroy network for sandbox \"928dc9a7bd3e053df0d623941f7dc758916b716bbd9678efdbad486b2da2c488\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.734711 containerd[1481]: time="2025-03-25T01:16:31.734664910Z" level=error msg="Failed to destroy network for sandbox \"e2591ec162824ab1e25faaf740375f98871c228c39f5293ce67bb0808efd9549\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.736593 containerd[1481]: time="2025-03-25T01:16:31.736548167Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-zl2f7,Uid:557c52a7-2de5-4a05-ba20-15e253ad24e8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"928dc9a7bd3e053df0d623941f7dc758916b716bbd9678efdbad486b2da2c488\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.737646 containerd[1481]: 
time="2025-03-25T01:16:31.737392050Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-hf464,Uid:e10f7734-b525-4202-aa84-b2555758598b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2591ec162824ab1e25faaf740375f98871c228c39f5293ce67bb0808efd9549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.737749 containerd[1481]: time="2025-03-25T01:16:31.737660023Z" level=error msg="Failed to destroy network for sandbox \"701b2f78e2f4389afef61621a7c8becf2f0eb0c934254b0d40ca68a7e5603cad\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.740172 containerd[1481]: time="2025-03-25T01:16:31.740106749Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68dbc864b6-dsj69,Uid:685bedc2-f03f-4bb4-896f-cbc789cd2327,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"701b2f78e2f4389afef61621a7c8becf2f0eb0c934254b0d40ca68a7e5603cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.740661 containerd[1481]: time="2025-03-25T01:16:31.740627615Z" level=error msg="Failed to destroy network for sandbox \"b6facec1fb9d5509fb14843dc6eb284ac66e9df724392e9d0aa4336ec27595ce\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.740740 containerd[1481]: time="2025-03-25T01:16:31.740645656Z" level=error msg="Failed to destroy network for sandbox \"209699384853a197074a4f62ce379626e3320e5107394eda79027e8846010962\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.741591 containerd[1481]: time="2025-03-25T01:16:31.741550022Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9fkq7,Uid:801dfadb-ac93-47ed-94b9-073599ad9abd,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6facec1fb9d5509fb14843dc6eb284ac66e9df724392e9d0aa4336ec27595ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.741781 kubelet[2592]: E0325 01:16:31.741727 2592 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6facec1fb9d5509fb14843dc6eb284ac66e9df724392e9d0aa4336ec27595ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.741967 kubelet[2592]: E0325 01:16:31.741806 2592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b6facec1fb9d5509fb14843dc6eb284ac66e9df724392e9d0aa4336ec27595ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9fkq7" Mar 25 01:16:31.741967 kubelet[2592]: E0325 01:16:31.741825 2592 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b6facec1fb9d5509fb14843dc6eb284ac66e9df724392e9d0aa4336ec27595ce\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-9fkq7" Mar 25 01:16:31.741967 kubelet[2592]: E0325 01:16:31.741731 2592 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"928dc9a7bd3e053df0d623941f7dc758916b716bbd9678efdbad486b2da2c488\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.742162 kubelet[2592]: E0325 01:16:31.741874 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-9fkq7_kube-system(801dfadb-ac93-47ed-94b9-073599ad9abd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-9fkq7_kube-system(801dfadb-ac93-47ed-94b9-073599ad9abd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b6facec1fb9d5509fb14843dc6eb284ac66e9df724392e9d0aa4336ec27595ce\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-9fkq7" podUID="801dfadb-ac93-47ed-94b9-073599ad9abd" Mar 25 01:16:31.742162 kubelet[2592]: E0325 01:16:31.741741 2592 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"701b2f78e2f4389afef61621a7c8becf2f0eb0c934254b0d40ca68a7e5603cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.742162 kubelet[2592]: E0325 01:16:31.742046 2592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"928dc9a7bd3e053df0d623941f7dc758916b716bbd9678efdbad486b2da2c488\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8c9cf7c59-zl2f7" Mar 25 01:16:31.742464 kubelet[2592]: E0325 01:16:31.742066 2592 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"928dc9a7bd3e053df0d623941f7dc758916b716bbd9678efdbad486b2da2c488\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8c9cf7c59-zl2f7" Mar 25 01:16:31.742464 kubelet[2592]: E0325 01:16:31.742095 2592 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2591ec162824ab1e25faaf740375f98871c228c39f5293ce67bb0808efd9549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.742464 kubelet[2592]: E0325 01:16:31.742147 2592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2591ec162824ab1e25faaf740375f98871c228c39f5293ce67bb0808efd9549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8c9cf7c59-hf464" Mar 25 01:16:31.742464 kubelet[2592]: E0325 01:16:31.742163 2592 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2591ec162824ab1e25faaf740375f98871c228c39f5293ce67bb0808efd9549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8c9cf7c59-hf464" Mar 25 01:16:31.742576 kubelet[2592]: E0325 01:16:31.741925 2592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"701b2f78e2f4389afef61621a7c8becf2f0eb0c934254b0d40ca68a7e5603cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68dbc864b6-dsj69" Mar 25 01:16:31.742576 kubelet[2592]: E0325 01:16:31.742205 2592 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"701b2f78e2f4389afef61621a7c8becf2f0eb0c934254b0d40ca68a7e5603cad\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68dbc864b6-dsj69" Mar 25 01:16:31.742576 kubelet[2592]: E0325 01:16:31.742235 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-68dbc864b6-dsj69_calico-system(685bedc2-f03f-4bb4-896f-cbc789cd2327)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-68dbc864b6-dsj69_calico-system(685bedc2-f03f-4bb4-896f-cbc789cd2327)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"701b2f78e2f4389afef61621a7c8becf2f0eb0c934254b0d40ca68a7e5603cad\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68dbc864b6-dsj69" podUID="685bedc2-f03f-4bb4-896f-cbc789cd2327" Mar 25 01:16:31.742660 kubelet[2592]: E0325 01:16:31.742245 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8c9cf7c59-hf464_calico-apiserver(e10f7734-b525-4202-aa84-b2555758598b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-8c9cf7c59-hf464_calico-apiserver(e10f7734-b525-4202-aa84-b2555758598b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e2591ec162824ab1e25faaf740375f98871c228c39f5293ce67bb0808efd9549\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8c9cf7c59-hf464" podUID="e10f7734-b525-4202-aa84-b2555758598b" Mar 25 01:16:31.742660 kubelet[2592]: E0325 01:16:31.742361 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8c9cf7c59-zl2f7_calico-apiserver(557c52a7-2de5-4a05-ba20-15e253ad24e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8c9cf7c59-zl2f7_calico-apiserver(557c52a7-2de5-4a05-ba20-15e253ad24e8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"928dc9a7bd3e053df0d623941f7dc758916b716bbd9678efdbad486b2da2c488\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8c9cf7c59-zl2f7" podUID="557c52a7-2de5-4a05-ba20-15e253ad24e8" Mar 25 01:16:31.742811 containerd[1481]: time="2025-03-25T01:16:31.742773485Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ptvcd,Uid:2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"209699384853a197074a4f62ce379626e3320e5107394eda79027e8846010962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.743060 kubelet[2592]: E0325 01:16:31.742972 2592 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"209699384853a197074a4f62ce379626e3320e5107394eda79027e8846010962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.743060 kubelet[2592]: E0325 01:16:31.743025 2592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"209699384853a197074a4f62ce379626e3320e5107394eda79027e8846010962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-ptvcd" Mar 25 01:16:31.743060 kubelet[2592]: E0325 01:16:31.743041 2592 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"209699384853a197074a4f62ce379626e3320e5107394eda79027e8846010962\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-ptvcd" Mar 25 01:16:31.743262 kubelet[2592]: E0325 01:16:31.743211 2592 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-6f6b679f8f-ptvcd_kube-system(2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-ptvcd_kube-system(2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"209699384853a197074a4f62ce379626e3320e5107394eda79027e8846010962\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-ptvcd" podUID="2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46" Mar 25 01:16:31.800262 systemd[1]: Created slice kubepods-besteffort-podc4b5976c_7a98_4d08_b430_ee7389a6d994.slice - libcontainer container kubepods-besteffort-podc4b5976c_7a98_4d08_b430_ee7389a6d994.slice. Mar 25 01:16:31.802974 containerd[1481]: time="2025-03-25T01:16:31.802655107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dzt9d,Uid:c4b5976c-7a98-4d08-b430-ee7389a6d994,Namespace:calico-system,Attempt:0,}" Mar 25 01:16:31.843271 containerd[1481]: time="2025-03-25T01:16:31.843150258Z" level=error msg="Failed to destroy network for sandbox \"89c89bf0dfd59cde50f784c12115fe1c8625d9718840d60191943f9a3841ec6f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.844446 containerd[1481]: time="2025-03-25T01:16:31.844385841Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dzt9d,Uid:c4b5976c-7a98-4d08-b430-ee7389a6d994,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"89c89bf0dfd59cde50f784c12115fe1c8625d9718840d60191943f9a3841ec6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.844656 kubelet[2592]: E0325 01:16:31.844608 2592 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89c89bf0dfd59cde50f784c12115fe1c8625d9718840d60191943f9a3841ec6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:16:31.844746 kubelet[2592]: E0325 01:16:31.844683 2592 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89c89bf0dfd59cde50f784c12115fe1c8625d9718840d60191943f9a3841ec6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dzt9d" Mar 25 01:16:31.844746 kubelet[2592]: E0325 01:16:31.844703 2592 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"89c89bf0dfd59cde50f784c12115fe1c8625d9718840d60191943f9a3841ec6f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dzt9d" Mar 25 01:16:31.844799 kubelet[2592]: E0325 01:16:31.844746 2592 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dzt9d_calico-system(c4b5976c-7a98-4d08-b430-ee7389a6d994)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dzt9d_calico-system(c4b5976c-7a98-4d08-b430-ee7389a6d994)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"89c89bf0dfd59cde50f784c12115fe1c8625d9718840d60191943f9a3841ec6f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dzt9d" podUID="c4b5976c-7a98-4d08-b430-ee7389a6d994" Mar 25 01:16:31.868387 containerd[1481]: time="2025-03-25T01:16:31.868314424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:16:32.273110 systemd[1]: run-netns-cni\x2d91501fbe\x2d5b49\x2d5bd6\x2d3366\x2db3f315351af9.mount: Deactivated successfully. Mar 25 01:16:32.273203 systemd[1]: run-netns-cni\x2d0c49e414\x2db61d\x2df075\x2d5c30\x2dc68fbdb1baba.mount: Deactivated successfully. Mar 25 01:16:32.273249 systemd[1]: run-netns-cni\x2d39191491\x2dfa45\x2d67f8\x2d585e\x2da4b9bf4da73b.mount: Deactivated successfully. Mar 25 01:16:32.273295 systemd[1]: run-netns-cni\x2d60838236\x2d3f55\x2d5e84\x2d94cd\x2da43764341d42.mount: Deactivated successfully. Mar 25 01:16:35.340033 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2541572342.mount: Deactivated successfully. Mar 25 01:16:35.366007 containerd[1481]: time="2025-03-25T01:16:35.365955667Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:35.366629 containerd[1481]: time="2025-03-25T01:16:35.366575134Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 25 01:16:35.367322 containerd[1481]: time="2025-03-25T01:16:35.367288206Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:35.369146 containerd[1481]: time="2025-03-25T01:16:35.369096245Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:35.369571 containerd[1481]: time="2025-03-25T01:16:35.369538624Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 3.501139076s" Mar 25 01:16:35.369571 containerd[1481]: time="2025-03-25T01:16:35.369569265Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 25 01:16:35.380986 containerd[1481]: time="2025-03-25T01:16:35.380941164Z" level=info msg="CreateContainer within sandbox \"485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:16:35.443779 containerd[1481]: time="2025-03-25T01:16:35.442502221Z" level=info msg="Container 
9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:35.450327 containerd[1481]: time="2025-03-25T01:16:35.450286282Z" level=info msg="CreateContainer within sandbox \"485daa6c05fb1ed3ebd539094db7b412e479a0aa346326ecd108a2421a4dcbdc\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17\"" Mar 25 01:16:35.450772 containerd[1481]: time="2025-03-25T01:16:35.450741302Z" level=info msg="StartContainer for \"9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17\"" Mar 25 01:16:35.452393 containerd[1481]: time="2025-03-25T01:16:35.452368093Z" level=info msg="connecting to shim 9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17" address="unix:///run/containerd/s/fa0f193fd5ad0b5759ebe17c80c1b62bbdaad6cce4ac550a68c5bb941945f15b" protocol=ttrpc version=3 Mar 25 01:16:35.474026 systemd[1]: Started cri-containerd-9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17.scope - libcontainer container 9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17. Mar 25 01:16:35.517164 containerd[1481]: time="2025-03-25T01:16:35.515830793Z" level=info msg="StartContainer for \"9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17\" returns successfully" Mar 25 01:16:35.677096 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:16:35.677203 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 25 01:16:35.897422 kubelet[2592]: I0325 01:16:35.897181 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-564vh" podStartSLOduration=1.230162023 podStartE2EDuration="11.8971639s" podCreationTimestamp="2025-03-25 01:16:24 +0000 UTC" firstStartedPulling="2025-03-25 01:16:24.703373744 +0000 UTC m=+14.993316932" lastFinishedPulling="2025-03-25 01:16:35.370375621 +0000 UTC m=+25.660318809" observedRunningTime="2025-03-25 01:16:35.897123538 +0000 UTC m=+26.187066726" watchObservedRunningTime="2025-03-25 01:16:35.8971639 +0000 UTC m=+26.187107048" Mar 25 01:16:36.884969 kubelet[2592]: I0325 01:16:36.884937 2592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:16:38.046086 systemd[1]: Started sshd@7-10.0.0.53:22-10.0.0.1:47678.service - OpenSSH per-connection server daemon (10.0.0.1:47678). Mar 25 01:16:38.105096 sshd[3774]: Accepted publickey for core from 10.0.0.1 port 47678 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:38.106444 sshd-session[3774]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:38.110724 systemd-logind[1468]: New session 8 of user core. Mar 25 01:16:38.127004 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 25 01:16:38.293885 sshd[3779]: Connection closed by 10.0.0.1 port 47678 Mar 25 01:16:38.293617 sshd-session[3774]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:38.297505 systemd[1]: sshd@7-10.0.0.53:22-10.0.0.1:47678.service: Deactivated successfully. Mar 25 01:16:38.300339 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:16:38.301013 systemd-logind[1468]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:16:38.301978 systemd-logind[1468]: Removed session 8. Mar 25 01:16:43.313748 systemd[1]: Started sshd@8-10.0.0.53:22-10.0.0.1:55090.service - OpenSSH per-connection server daemon (10.0.0.1:55090). 
Mar 25 01:16:43.371729 sshd[3913]: Accepted publickey for core from 10.0.0.1 port 55090 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:43.373023 sshd-session[3913]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:43.376831 systemd-logind[1468]: New session 9 of user core. Mar 25 01:16:43.395038 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:16:43.521831 sshd[3915]: Connection closed by 10.0.0.1 port 55090 Mar 25 01:16:43.522178 sshd-session[3913]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:43.524790 systemd[1]: sshd@8-10.0.0.53:22-10.0.0.1:55090.service: Deactivated successfully. Mar 25 01:16:43.527324 systemd[1]: session-9.scope: Deactivated successfully. Mar 25 01:16:43.529350 systemd-logind[1468]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:16:43.530356 systemd-logind[1468]: Removed session 9. Mar 25 01:16:43.795443 containerd[1481]: time="2025-03-25T01:16:43.795168989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9fkq7,Uid:801dfadb-ac93-47ed-94b9-073599ad9abd,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:43.795443 containerd[1481]: time="2025-03-25T01:16:43.795214230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dzt9d,Uid:c4b5976c-7a98-4d08-b430-ee7389a6d994,Namespace:calico-system,Attempt:0,}" Mar 25 01:16:43.796081 containerd[1481]: time="2025-03-25T01:16:43.795476399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ptvcd,Uid:2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46,Namespace:kube-system,Attempt:0,}" Mar 25 01:16:44.137785 systemd-networkd[1422]: cali8596cf488ac: Link UP Mar 25 01:16:44.137988 systemd-networkd[1422]: cali8596cf488ac: Gained carrier Mar 25 01:16:44.150901 containerd[1481]: 2025-03-25 01:16:43.829 [INFO][3942] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:16:44.150901 containerd[1481]: 2025-03-25 01:16:43.903 [INFO][3942] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--dzt9d-eth0 csi-node-driver- calico-system c4b5976c-7a98-4d08-b430-ee7389a6d994 579 0 2025-03-25 01:16:24 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-dzt9d eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8596cf488ac [] []}} ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-" Mar 25 01:16:44.150901 containerd[1481]: 2025-03-25 01:16:43.903 [INFO][3942] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-eth0" Mar 25 01:16:44.150901 containerd[1481]: 2025-03-25 01:16:44.082 [INFO][3976] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" 
HandleID="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Workload="localhost-k8s-csi--node--driver--dzt9d-eth0" Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.096 [INFO][3976] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" HandleID="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Workload="localhost-k8s-csi--node--driver--dzt9d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004a33f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-dzt9d", "timestamp":"2025-03-25 01:16:44.082970835 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.096 [INFO][3976] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.096 [INFO][3976] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.096 [INFO][3976] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.099 [INFO][3976] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" host="localhost" Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.104 [INFO][3976] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.108 [INFO][3976] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.111 [INFO][3976] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.113 [INFO][3976] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.151138 containerd[1481]: 2025-03-25 01:16:44.114 [INFO][3976] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" host="localhost" Mar 25 01:16:44.151333 containerd[1481]: 2025-03-25 01:16:44.115 [INFO][3976] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d Mar 25 01:16:44.151333 containerd[1481]: 2025-03-25 01:16:44.120 [INFO][3976] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" host="localhost" Mar 25 01:16:44.151333 containerd[1481]: 2025-03-25 01:16:44.125 [INFO][3976] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" host="localhost" Mar 25 01:16:44.151333 containerd[1481]: 2025-03-25 01:16:44.125 [INFO][3976] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" host="localhost" Mar 25 
01:16:44.151333 containerd[1481]: 2025-03-25 01:16:44.125 [INFO][3976] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:16:44.151333 containerd[1481]: 2025-03-25 01:16:44.125 [INFO][3976] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" HandleID="k8s-pod-network.84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Workload="localhost-k8s-csi--node--driver--dzt9d-eth0" Mar 25 01:16:44.151458 containerd[1481]: 2025-03-25 01:16:44.129 [INFO][3942] cni-plugin/k8s.go 386: Populated endpoint ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--dzt9d-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c4b5976c-7a98-4d08-b430-ee7389a6d994", ResourceVersion:"579", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-dzt9d", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8596cf488ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.151458 containerd[1481]: 2025-03-25 01:16:44.130 [INFO][3942] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-eth0" Mar 25 01:16:44.151525 containerd[1481]: 2025-03-25 01:16:44.130 [INFO][3942] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8596cf488ac ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-eth0" Mar 25 01:16:44.151525 containerd[1481]: 2025-03-25 01:16:44.137 [INFO][3942] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-eth0" Mar 25 01:16:44.151564 containerd[1481]: 2025-03-25 01:16:44.137 [INFO][3942] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--dzt9d-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"c4b5976c-7a98-4d08-b430-ee7389a6d994", ResourceVersion:"579", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d", Pod:"csi-node-driver-dzt9d", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8596cf488ac", MAC:"92:97:8a:94:37:0b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.151608 containerd[1481]: 2025-03-25 01:16:44.148 [INFO][3942] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" Namespace="calico-system" Pod="csi-node-driver-dzt9d" WorkloadEndpoint="localhost-k8s-csi--node--driver--dzt9d-eth0" Mar 25 01:16:44.252338 systemd-networkd[1422]: cali50a0a9de549: Link UP Mar 25 01:16:44.253151 systemd-networkd[1422]: cali50a0a9de549: Gained carrier Mar 25 01:16:44.270091 containerd[1481]: 2025-03-25 01:16:43.827 [INFO][3954] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:16:44.270091 containerd[1481]: 2025-03-25 01:16:43.901 [INFO][3954] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0 coredns-6f6b679f8f- kube-system 2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46 652 0 2025-03-25 01:16:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-ptvcd eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali50a0a9de549 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" Pod="coredns-6f6b679f8f-ptvcd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-" Mar 25 01:16:44.270091 containerd[1481]: 2025-03-25 01:16:43.901 [INFO][3954] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" Pod="coredns-6f6b679f8f-ptvcd" 
WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" Mar 25 01:16:44.270091 containerd[1481]: 2025-03-25 01:16:44.082 [INFO][3974] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" HandleID="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Workload="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.096 [INFO][3974] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" HandleID="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Workload="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003dcb10), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-ptvcd", "timestamp":"2025-03-25 01:16:44.082977875 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.096 [INFO][3974] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.125 [INFO][3974] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.125 [INFO][3974] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.200 [INFO][3974] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" host="localhost" Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.203 [INFO][3974] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.208 [INFO][3974] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.210 [INFO][3974] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.212 [INFO][3974] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.270432 containerd[1481]: 2025-03-25 01:16:44.212 [INFO][3974] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" host="localhost" Mar 25 01:16:44.270628 containerd[1481]: 2025-03-25 01:16:44.213 [INFO][3974] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253 Mar 25 01:16:44.270628 containerd[1481]: 2025-03-25 01:16:44.223 [INFO][3974] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" host="localhost" Mar 25 01:16:44.270628 containerd[1481]: 2025-03-25 01:16:44.247 [INFO][3974] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" host="localhost" Mar 25 01:16:44.270628 containerd[1481]: 2025-03-25 01:16:44.247 [INFO][3974] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" host="localhost" Mar 25 01:16:44.270628 containerd[1481]: 2025-03-25 01:16:44.247 [INFO][3974] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:16:44.270628 containerd[1481]: 2025-03-25 01:16:44.247 [INFO][3974] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" HandleID="k8s-pod-network.e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Workload="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" Mar 25 01:16:44.270739 containerd[1481]: 2025-03-25 01:16:44.249 [INFO][3954] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" Pod="coredns-6f6b679f8f-ptvcd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46", ResourceVersion:"652", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-ptvcd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali50a0a9de549", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.270785 containerd[1481]: 2025-03-25 01:16:44.249 [INFO][3954] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" Pod="coredns-6f6b679f8f-ptvcd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" Mar 25 01:16:44.270785 containerd[1481]: 2025-03-25 01:16:44.249 [INFO][3954] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali50a0a9de549 ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" 
Pod="coredns-6f6b679f8f-ptvcd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" Mar 25 01:16:44.270785 containerd[1481]: 2025-03-25 01:16:44.253 [INFO][3954] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" Pod="coredns-6f6b679f8f-ptvcd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" Mar 25 01:16:44.270953 containerd[1481]: 2025-03-25 01:16:44.254 [INFO][3954] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" Pod="coredns-6f6b679f8f-ptvcd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46", ResourceVersion:"652", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253", Pod:"coredns-6f6b679f8f-ptvcd", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali50a0a9de549", MAC:"82:a0:1b:9f:a6:0d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.270953 containerd[1481]: 2025-03-25 01:16:44.266 [INFO][3954] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" Namespace="kube-system" Pod="coredns-6f6b679f8f-ptvcd" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--ptvcd-eth0" Mar 25 01:16:44.344243 systemd-networkd[1422]: calie8136cd6a2d: Link UP Mar 25 01:16:44.344396 systemd-networkd[1422]: calie8136cd6a2d: Gained carrier Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:43.822 [INFO][3929] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:43.901 [INFO][3929] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0 coredns-6f6b679f8f- kube-system 801dfadb-ac93-47ed-94b9-073599ad9abd 655 0 2025-03-25 
01:16:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-9fkq7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie8136cd6a2d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:43.901 [INFO][3929] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.082 [INFO][3972] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" HandleID="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Workload="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.098 [INFO][3972] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" HandleID="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Workload="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034b8f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-9fkq7", "timestamp":"2025-03-25 01:16:44.082967035 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.098 [INFO][3972] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.247 [INFO][3972] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.247 [INFO][3972] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.300 [INFO][3972] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.306 [INFO][3972] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.311 [INFO][3972] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.313 [INFO][3972] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.316 [INFO][3972] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.316 [INFO][3972] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.318 [INFO][3972] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.322 [INFO][3972] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.333 [INFO][3972] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.334 [INFO][3972] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" host="localhost" Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.334 [INFO][3972] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
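The three CNI ADD requests above (for csi-node-driver-dzt9d, coredns-6f6b679f8f-ptvcd and coredns-6f6b679f8f-9fkq7) each run the same IPAM sequence: acquire the host-wide IPAM lock, look up the host's block affinity, load the 192.168.88.128/26 block, assign one address, write the block back to claim it, and release the lock. The timestamps show the lock serializing the concurrent requests: the 9fkq7 request acquires it at 01:16:44.247, the instant the ptvcd request releases it. Below is a minimal, self-contained Go sketch of that sequence; the Block type and helpers are hypothetical stand-ins for illustration, not Calico's actual ipam package API.

```go
// ipam_sketch.go - hypothetical model of the host-wide-lock IPAM flow seen
// in the Calico CNI log above. Not Calico's real API; types are invented.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// Block models one affine IPAM block (here 192.168.88.128/26) with a simple
// per-address allocation map keyed by the handle that claimed the address.
type Block struct {
	CIDR      netip.Prefix
	Allocated map[netip.Addr]string // addr -> handle ID
}

var (
	hostLock sync.Mutex // the "host-wide IPAM lock" from the log
	block    = Block{
		CIDR:      netip.MustParsePrefix("192.168.88.128/26"),
		Allocated: map[netip.Addr]string{},
	}
)

// autoAssign claims one IPv4 address for the given handle, mirroring the
// acquire-lock / load-block / assign / write-block / release-lock sequence.
func autoAssign(handleID string) (netip.Addr, error) {
	hostLock.Lock()         // "About to acquire host-wide IPAM lock."
	defer hostLock.Unlock() // "Released host-wide IPAM lock."

	// Skip the block's base address and walk the block for a free slot.
	for a := block.CIDR.Addr().Next(); block.CIDR.Contains(a); a = a.Next() {
		if _, taken := block.Allocated[a]; !taken {
			block.Allocated[a] = handleID // "Writing block in order to claim IPs"
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", block.CIDR)
}

func main() {
	pods := []string{"csi-node-driver-dzt9d", "coredns-6f6b679f8f-ptvcd", "coredns-6f6b679f8f-9fkq7"}
	for _, pod := range pods {
		ip, err := autoAssign("k8s-pod-network." + pod)
		if err != nil {
			panic(err)
		}
		fmt.Printf("%s -> %s\n", pod, ip) // .129, .130, .131, matching the log order
	}
}
```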
Mar 25 01:16:44.358568 containerd[1481]: 2025-03-25 01:16:44.334 [INFO][3972] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" HandleID="k8s-pod-network.92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Workload="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" Mar 25 01:16:44.359184 containerd[1481]: 2025-03-25 01:16:44.342 [INFO][3929] cni-plugin/k8s.go 386: Populated endpoint ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"801dfadb-ac93-47ed-94b9-073599ad9abd", ResourceVersion:"655", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-9fkq7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie8136cd6a2d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.359184 containerd[1481]: 2025-03-25 01:16:44.342 [INFO][3929] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" Mar 25 01:16:44.359184 containerd[1481]: 2025-03-25 01:16:44.342 [INFO][3929] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8136cd6a2d ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" Mar 25 01:16:44.359184 containerd[1481]: 2025-03-25 01:16:44.343 [INFO][3929] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" Mar 25 01:16:44.359184 containerd[1481]: 2025-03-25 01:16:44.344 
[INFO][3929] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"801dfadb-ac93-47ed-94b9-073599ad9abd", ResourceVersion:"655", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe", Pod:"coredns-6f6b679f8f-9fkq7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie8136cd6a2d", MAC:"86:f0:e7:bb:a9:d5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.359184 containerd[1481]: 2025-03-25 01:16:44.355 [INFO][3929] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" Namespace="kube-system" Pod="coredns-6f6b679f8f-9fkq7" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--9fkq7-eth0" Mar 25 01:16:44.382148 containerd[1481]: time="2025-03-25T01:16:44.382025700Z" level=info msg="connecting to shim e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253" address="unix:///run/containerd/s/0c5fa25baa358f1770e2140affad20b99e1d80c8a02df6cb0c861a43ce590234" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:44.382912 containerd[1481]: time="2025-03-25T01:16:44.382714003Z" level=info msg="connecting to shim 84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d" address="unix:///run/containerd/s/1615a2c30ce4b0be95c92ed6f0015d5d87874b16ade1a2190e03e88a6693c493" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:44.393376 containerd[1481]: time="2025-03-25T01:16:44.393263869Z" level=info msg="connecting to shim 92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe" address="unix:///run/containerd/s/0da74177b9397fa00aba4425e93e11f1e5fb72ac37b9e50b725fd669c662b07f" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:44.409031 systemd[1]: Started cri-containerd-84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d.scope - libcontainer 
container 84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d. Mar 25 01:16:44.410444 systemd[1]: Started cri-containerd-e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253.scope - libcontainer container e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253. Mar 25 01:16:44.413930 systemd[1]: Started cri-containerd-92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe.scope - libcontainer container 92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe. Mar 25 01:16:44.427547 systemd-resolved[1332]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:16:44.428759 systemd-resolved[1332]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:16:44.432382 systemd-resolved[1332]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:16:44.495776 containerd[1481]: time="2025-03-25T01:16:44.495728236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dzt9d,Uid:c4b5976c-7a98-4d08-b430-ee7389a6d994,Namespace:calico-system,Attempt:0,} returns sandbox id \"84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d\"" Mar 25 01:16:44.497045 containerd[1481]: time="2025-03-25T01:16:44.497012238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:16:44.517299 containerd[1481]: time="2025-03-25T01:16:44.517249903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-ptvcd,Uid:2cb4ba0e-3c48-4c29-9cd6-3c852ad5ee46,Namespace:kube-system,Attempt:0,} returns sandbox id \"e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253\"" Mar 25 01:16:44.518089 containerd[1481]: time="2025-03-25T01:16:44.518053569Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-9fkq7,Uid:801dfadb-ac93-47ed-94b9-073599ad9abd,Namespace:kube-system,Attempt:0,} returns sandbox id \"92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe\"" Mar 25 01:16:44.523388 containerd[1481]: time="2025-03-25T01:16:44.523336343Z" level=info msg="CreateContainer within sandbox \"92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:16:44.529230 containerd[1481]: time="2025-03-25T01:16:44.529197375Z" level=info msg="CreateContainer within sandbox \"e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:16:44.535774 containerd[1481]: time="2025-03-25T01:16:44.535734950Z" level=info msg="Container f2b3fc1a1a713679cd03bf48ff751ab46b3b547045ec01a4a9147edba7371ce6: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:44.539289 containerd[1481]: time="2025-03-25T01:16:44.539210104Z" level=info msg="Container d4e7fd4579d2316673ff81f5f306eb461c6e11f7f9f3797d32c82643e058fba7: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:44.543535 containerd[1481]: time="2025-03-25T01:16:44.543499725Z" level=info msg="CreateContainer within sandbox \"92b68dc8e87f4ac81c50fcaa1dc079465df1ba4088f5b9b827e0e7da01409efe\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f2b3fc1a1a713679cd03bf48ff751ab46b3b547045ec01a4a9147edba7371ce6\"" Mar 25 01:16:44.544124 containerd[1481]: time="2025-03-25T01:16:44.544097825Z" level=info msg="StartContainer for \"f2b3fc1a1a713679cd03bf48ff751ab46b3b547045ec01a4a9147edba7371ce6\"" Mar 25 01:16:44.545395 
containerd[1481]: time="2025-03-25T01:16:44.545018895Z" level=info msg="connecting to shim f2b3fc1a1a713679cd03bf48ff751ab46b3b547045ec01a4a9147edba7371ce6" address="unix:///run/containerd/s/0da74177b9397fa00aba4425e93e11f1e5fb72ac37b9e50b725fd669c662b07f" protocol=ttrpc version=3 Mar 25 01:16:44.545995 containerd[1481]: time="2025-03-25T01:16:44.545962646Z" level=info msg="CreateContainer within sandbox \"e671fe29dbbaf03637b2e44886b25afcfecdf2d2916dc19d6794e8c74d4bc253\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d4e7fd4579d2316673ff81f5f306eb461c6e11f7f9f3797d32c82643e058fba7\"" Mar 25 01:16:44.547926 containerd[1481]: time="2025-03-25T01:16:44.546495103Z" level=info msg="StartContainer for \"d4e7fd4579d2316673ff81f5f306eb461c6e11f7f9f3797d32c82643e058fba7\"" Mar 25 01:16:44.548602 containerd[1481]: time="2025-03-25T01:16:44.548562251Z" level=info msg="connecting to shim d4e7fd4579d2316673ff81f5f306eb461c6e11f7f9f3797d32c82643e058fba7" address="unix:///run/containerd/s/0c5fa25baa358f1770e2140affad20b99e1d80c8a02df6cb0c861a43ce590234" protocol=ttrpc version=3 Mar 25 01:16:44.575039 systemd[1]: Started cri-containerd-f2b3fc1a1a713679cd03bf48ff751ab46b3b547045ec01a4a9147edba7371ce6.scope - libcontainer container f2b3fc1a1a713679cd03bf48ff751ab46b3b547045ec01a4a9147edba7371ce6. Mar 25 01:16:44.578075 systemd[1]: Started cri-containerd-d4e7fd4579d2316673ff81f5f306eb461c6e11f7f9f3797d32c82643e058fba7.scope - libcontainer container d4e7fd4579d2316673ff81f5f306eb461c6e11f7f9f3797d32c82643e058fba7. Mar 25 01:16:44.622458 containerd[1481]: time="2025-03-25T01:16:44.622366476Z" level=info msg="StartContainer for \"f2b3fc1a1a713679cd03bf48ff751ab46b3b547045ec01a4a9147edba7371ce6\" returns successfully" Mar 25 01:16:44.626183 containerd[1481]: time="2025-03-25T01:16:44.626126520Z" level=info msg="StartContainer for \"d4e7fd4579d2316673ff81f5f306eb461c6e11f7f9f3797d32c82643e058fba7\" returns successfully" Mar 25 01:16:44.794850 containerd[1481]: time="2025-03-25T01:16:44.794656736Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-zl2f7,Uid:557c52a7-2de5-4a05-ba20-15e253ad24e8,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:16:44.922332 kubelet[2592]: I0325 01:16:44.922165 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-9fkq7" podStartSLOduration=28.922149445 podStartE2EDuration="28.922149445s" podCreationTimestamp="2025-03-25 01:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:44.921405981 +0000 UTC m=+35.211349169" watchObservedRunningTime="2025-03-25 01:16:44.922149445 +0000 UTC m=+35.212092593" Mar 25 01:16:44.928586 systemd-networkd[1422]: calid4a12790e18: Link UP Mar 25 01:16:44.930498 systemd-networkd[1422]: calid4a12790e18: Gained carrier Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.818 [INFO][4269] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.831 [INFO][4269] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0 calico-apiserver-8c9cf7c59- calico-apiserver 557c52a7-2de5-4a05-ba20-15e253ad24e8 651 0 2025-03-25 01:16:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8c9cf7c59 
projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8c9cf7c59-zl2f7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid4a12790e18 [] []}} ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.831 [INFO][4269] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.860 [INFO][4283] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" HandleID="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Workload="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.871 [INFO][4283] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" HandleID="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Workload="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002e4a10), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8c9cf7c59-zl2f7", "timestamp":"2025-03-25 01:16:44.860076046 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.871 [INFO][4283] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.871 [INFO][4283] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.871 [INFO][4283] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.873 [INFO][4283] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.876 [INFO][4283] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.880 [INFO][4283] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.882 [INFO][4283] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.884 [INFO][4283] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.884 [INFO][4283] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.886 [INFO][4283] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6 Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.914 [INFO][4283] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.921 [INFO][4283] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.921 [INFO][4283] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" host="localhost" Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.921 [INFO][4283] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
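With the calico-apiserver pod the same affine block has now handed out its fourth address (192.168.88.129 to csi-node-driver, .130 to coredns-ptvcd, .131 to coredns-9fkq7, .132 to the apiserver pod). A quick sanity check of the block geometry, done with the standard library only:

```go
// block_math.go - geometry of the IPAM block seen in the log:
// 192.168.88.128/26 holds 2^(32-26) = 64 addresses (.128-.191), and each
// workload is handed a /32 out of it in order.
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	p := netip.MustParsePrefix("192.168.88.128/26")
	size := 1 << (32 - p.Bits()) // 64 addresses in the block

	last := p.Addr()
	for i := 0; i < size-1; i++ {
		last = last.Next()
	}
	fmt.Printf("block %s: %d addresses, %s .. %s\n", p, size, p.Addr(), last)

	// Each workload endpoint gets a host-scoped /32 from this block, e.g.
	// the calico-apiserver pod above:
	fmt.Println(netip.PrefixFrom(netip.MustParseAddr("192.168.88.132"), 32))
}
```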
Mar 25 01:16:44.950528 containerd[1481]: 2025-03-25 01:16:44.921 [INFO][4283] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" HandleID="k8s-pod-network.6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Workload="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" Mar 25 01:16:44.954101 containerd[1481]: 2025-03-25 01:16:44.926 [INFO][4269] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0", GenerateName:"calico-apiserver-8c9cf7c59-", Namespace:"calico-apiserver", SelfLink:"", UID:"557c52a7-2de5-4a05-ba20-15e253ad24e8", ResourceVersion:"651", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8c9cf7c59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8c9cf7c59-zl2f7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid4a12790e18", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.954101 containerd[1481]: 2025-03-25 01:16:44.926 [INFO][4269] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" Mar 25 01:16:44.954101 containerd[1481]: 2025-03-25 01:16:44.926 [INFO][4269] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid4a12790e18 ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" Mar 25 01:16:44.954101 containerd[1481]: 2025-03-25 01:16:44.928 [INFO][4269] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" Mar 25 01:16:44.954101 containerd[1481]: 2025-03-25 01:16:44.930 [INFO][4269] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" 
Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0", GenerateName:"calico-apiserver-8c9cf7c59-", Namespace:"calico-apiserver", SelfLink:"", UID:"557c52a7-2de5-4a05-ba20-15e253ad24e8", ResourceVersion:"651", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8c9cf7c59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6", Pod:"calico-apiserver-8c9cf7c59-zl2f7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid4a12790e18", MAC:"12:b5:66:28:f5:14", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:44.954101 containerd[1481]: 2025-03-25 01:16:44.946 [INFO][4269] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-zl2f7" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--zl2f7-eth0" Mar 25 01:16:44.959883 kubelet[2592]: I0325 01:16:44.959747 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-ptvcd" podStartSLOduration=28.95972932 podStartE2EDuration="28.95972932s" podCreationTimestamp="2025-03-25 01:16:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:44.954693274 +0000 UTC m=+35.244636462" watchObservedRunningTime="2025-03-25 01:16:44.95972932 +0000 UTC m=+35.249672508" Mar 25 01:16:44.984871 containerd[1481]: time="2025-03-25T01:16:44.984787223Z" level=info msg="connecting to shim 6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6" address="unix:///run/containerd/s/c43705b0408a93fb75705bbc06de0d4688a2b95a8292271f2d1d23d265560df4" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:45.006033 systemd[1]: Started cri-containerd-6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6.scope - libcontainer container 6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6. 
Mar 25 01:16:45.020893 systemd-resolved[1332]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:16:45.044774 containerd[1481]: time="2025-03-25T01:16:45.044697553Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-zl2f7,Uid:557c52a7-2de5-4a05-ba20-15e253ad24e8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6\"" Mar 25 01:16:45.484241 containerd[1481]: time="2025-03-25T01:16:45.484193690Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:45.485349 containerd[1481]: time="2025-03-25T01:16:45.485295725Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 25 01:16:45.486681 containerd[1481]: time="2025-03-25T01:16:45.486267996Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:45.488610 containerd[1481]: time="2025-03-25T01:16:45.488570990Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:45.489000 containerd[1481]: time="2025-03-25T01:16:45.488968803Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 991.919323ms" Mar 25 01:16:45.489000 containerd[1481]: time="2025-03-25T01:16:45.488996084Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 25 01:16:45.490463 containerd[1481]: time="2025-03-25T01:16:45.490415889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:16:45.491833 containerd[1481]: time="2025-03-25T01:16:45.491805013Z" level=info msg="CreateContainer within sandbox \"84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:16:45.499591 containerd[1481]: time="2025-03-25T01:16:45.499561222Z" level=info msg="Container 5283ba41d37b5aad791d43b74b11afe1def75c1a8855dc64ce4c93eb8e01ddfe: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:45.524926 containerd[1481]: time="2025-03-25T01:16:45.524877031Z" level=info msg="CreateContainer within sandbox \"84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5283ba41d37b5aad791d43b74b11afe1def75c1a8855dc64ce4c93eb8e01ddfe\"" Mar 25 01:16:45.525659 containerd[1481]: time="2025-03-25T01:16:45.525488251Z" level=info msg="StartContainer for \"5283ba41d37b5aad791d43b74b11afe1def75c1a8855dc64ce4c93eb8e01ddfe\"" Mar 25 01:16:45.527333 containerd[1481]: time="2025-03-25T01:16:45.527233187Z" level=info msg="connecting to shim 5283ba41d37b5aad791d43b74b11afe1def75c1a8855dc64ce4c93eb8e01ddfe" address="unix:///run/containerd/s/1615a2c30ce4b0be95c92ed6f0015d5d87874b16ade1a2190e03e88a6693c493" protocol=ttrpc 
version=3 Mar 25 01:16:45.554029 systemd[1]: Started cri-containerd-5283ba41d37b5aad791d43b74b11afe1def75c1a8855dc64ce4c93eb8e01ddfe.scope - libcontainer container 5283ba41d37b5aad791d43b74b11afe1def75c1a8855dc64ce4c93eb8e01ddfe. Mar 25 01:16:45.592490 containerd[1481]: time="2025-03-25T01:16:45.592429912Z" level=info msg="StartContainer for \"5283ba41d37b5aad791d43b74b11afe1def75c1a8855dc64ce4c93eb8e01ddfe\" returns successfully" Mar 25 01:16:45.794824 containerd[1481]: time="2025-03-25T01:16:45.794711062Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68dbc864b6-dsj69,Uid:685bedc2-f03f-4bb4-896f-cbc789cd2327,Namespace:calico-system,Attempt:0,}" Mar 25 01:16:45.870056 systemd-networkd[1422]: cali8596cf488ac: Gained IPv6LL Mar 25 01:16:45.908931 systemd-networkd[1422]: cali91369175376: Link UP Mar 25 01:16:45.909469 systemd-networkd[1422]: cali91369175376: Gained carrier Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.829 [INFO][4413] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.842 [INFO][4413] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0 calico-kube-controllers-68dbc864b6- calico-system 685bedc2-f03f-4bb4-896f-cbc789cd2327 654 0 2025-03-25 01:16:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:68dbc864b6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-68dbc864b6-dsj69 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali91369175376 [] []}} ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.842 [INFO][4413] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.865 [INFO][4428] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" HandleID="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Workload="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.878 [INFO][4428] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" HandleID="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Workload="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003053d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-68dbc864b6-dsj69", "timestamp":"2025-03-25 01:16:45.865736933 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.878 [INFO][4428] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.878 [INFO][4428] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.878 [INFO][4428] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.880 [INFO][4428] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.884 [INFO][4428] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.888 [INFO][4428] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.890 [INFO][4428] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.892 [INFO][4428] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.892 [INFO][4428] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.893 [INFO][4428] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.897 [INFO][4428] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.904 [INFO][4428] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.904 [INFO][4428] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" host="localhost" Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.904 [INFO][4428] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
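The dataplane side of each ADD shows up above as "Setting the host side veth name to cali…" and "Disabling IPv4 forwarding", after which systemd-networkd reports the new cali* link ("Link UP", "Gained carrier", later "Gained IPv6LL"). A bare-bones sketch of just the host-side part, using the vishvananda/netlink package, is below; it is not Calico's dataplane code and leaves out everything that actually follows (moving the peer into the pod's network namespace, assigning the pod IP, programming the /32 route, writing the WorkloadEndpoint). The peer name and the exact sysctl toggled are assumptions, and it must run as root.

```go
// veth_sketch.go - minimal host-side illustration of the "Setting the host
// side veth name to cali..." step, using github.com/vishvananda/netlink.
// Creates a veth pair with a Calico-style host name, clears the per-interface
// IPv4 forwarding knob, and brings the host end up. Not Calico's real dataplane.
package main

import (
	"fmt"
	"os"

	"github.com/vishvananda/netlink"
)

func main() {
	hostName := "cali8596cf488ac" // host-side interface name from the log
	veth := &netlink.Veth{
		LinkAttrs: netlink.LinkAttrs{Name: hostName},
		PeerName:  hostName + "p", // placeholder; Calico moves the peer into the pod netns
	}
	if err := netlink.LinkAdd(veth); err != nil {
		panic(err)
	}

	// Rough equivalent of the "Disabling IPv4 forwarding" log line; the exact
	// sysctl Calico toggles for this endpoint may differ.
	sysctl := fmt.Sprintf("/proc/sys/net/ipv4/conf/%s/forwarding", hostName)
	if err := os.WriteFile(sysctl, []byte("0"), 0o644); err != nil {
		panic(err)
	}

	// Once both ends are up, systemd-networkd would report the link as above.
	if err := netlink.LinkSetUp(veth); err != nil {
		panic(err)
	}
}
```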
Mar 25 01:16:45.926561 containerd[1481]: 2025-03-25 01:16:45.904 [INFO][4428] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" HandleID="k8s-pod-network.9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Workload="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" Mar 25 01:16:45.927177 containerd[1481]: 2025-03-25 01:16:45.906 [INFO][4413] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0", GenerateName:"calico-kube-controllers-68dbc864b6-", Namespace:"calico-system", SelfLink:"", UID:"685bedc2-f03f-4bb4-896f-cbc789cd2327", ResourceVersion:"654", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68dbc864b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-68dbc864b6-dsj69", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali91369175376", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:45.927177 containerd[1481]: 2025-03-25 01:16:45.907 [INFO][4413] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" Mar 25 01:16:45.927177 containerd[1481]: 2025-03-25 01:16:45.907 [INFO][4413] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali91369175376 ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" Mar 25 01:16:45.927177 containerd[1481]: 2025-03-25 01:16:45.909 [INFO][4413] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" Mar 25 01:16:45.927177 containerd[1481]: 2025-03-25 01:16:45.910 [INFO][4413] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID 
to endpoint ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0", GenerateName:"calico-kube-controllers-68dbc864b6-", Namespace:"calico-system", SelfLink:"", UID:"685bedc2-f03f-4bb4-896f-cbc789cd2327", ResourceVersion:"654", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68dbc864b6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb", Pod:"calico-kube-controllers-68dbc864b6-dsj69", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali91369175376", MAC:"36:df:62:0e:3d:99", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:45.927177 containerd[1481]: 2025-03-25 01:16:45.924 [INFO][4413] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" Namespace="calico-system" Pod="calico-kube-controllers-68dbc864b6-dsj69" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--68dbc864b6--dsj69-eth0" Mar 25 01:16:45.950376 containerd[1481]: time="2025-03-25T01:16:45.949920026Z" level=info msg="connecting to shim 9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb" address="unix:///run/containerd/s/832a98967a43c266c4ab30a2115c94c2683dc44698442308871c81a317824ae7" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:45.970032 systemd[1]: Started cri-containerd-9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb.scope - libcontainer container 9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb. 
Mar 25 01:16:45.981366 systemd-resolved[1332]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:16:46.001919 containerd[1481]: time="2025-03-25T01:16:46.001882367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68dbc864b6-dsj69,Uid:685bedc2-f03f-4bb4-896f-cbc789cd2327,Namespace:calico-system,Attempt:0,} returns sandbox id \"9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb\"" Mar 25 01:16:46.125049 systemd-networkd[1422]: cali50a0a9de549: Gained IPv6LL Mar 25 01:16:46.317334 systemd-networkd[1422]: calie8136cd6a2d: Gained IPv6LL Mar 25 01:16:46.701045 systemd-networkd[1422]: calid4a12790e18: Gained IPv6LL Mar 25 01:16:46.795259 containerd[1481]: time="2025-03-25T01:16:46.795208014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-hf464,Uid:e10f7734-b525-4202-aa84-b2555758598b,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:16:46.950264 containerd[1481]: time="2025-03-25T01:16:46.949525384Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:46.951153 containerd[1481]: time="2025-03-25T01:16:46.951065432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 25 01:16:46.953171 containerd[1481]: time="2025-03-25T01:16:46.952143146Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:46.954748 containerd[1481]: time="2025-03-25T01:16:46.954716346Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:46.956081 containerd[1481]: time="2025-03-25T01:16:46.956055387Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 1.465606337s" Mar 25 01:16:46.956205 containerd[1481]: time="2025-03-25T01:16:46.956189432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:16:46.958069 containerd[1481]: time="2025-03-25T01:16:46.958042889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:16:46.959783 containerd[1481]: time="2025-03-25T01:16:46.959751183Z" level=info msg="CreateContainer within sandbox \"6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:16:46.974377 containerd[1481]: time="2025-03-25T01:16:46.974340077Z" level=info msg="Container 42f9884fe921b2c8a0ca873a493b82888429aae53a166a555d95f99f63594883: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:46.985325 containerd[1481]: time="2025-03-25T01:16:46.985292539Z" level=info msg="CreateContainer within sandbox \"6a0f400c3c452771f5395ee36a0508c24eda6816bf850b3e63a980c64866b5f6\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container 
id \"42f9884fe921b2c8a0ca873a493b82888429aae53a166a555d95f99f63594883\"" Mar 25 01:16:46.986209 containerd[1481]: time="2025-03-25T01:16:46.986174326Z" level=info msg="StartContainer for \"42f9884fe921b2c8a0ca873a493b82888429aae53a166a555d95f99f63594883\"" Mar 25 01:16:46.989966 containerd[1481]: time="2025-03-25T01:16:46.989571792Z" level=info msg="connecting to shim 42f9884fe921b2c8a0ca873a493b82888429aae53a166a555d95f99f63594883" address="unix:///run/containerd/s/c43705b0408a93fb75705bbc06de0d4688a2b95a8292271f2d1d23d265560df4" protocol=ttrpc version=3 Mar 25 01:16:47.015128 systemd[1]: Started cri-containerd-42f9884fe921b2c8a0ca873a493b82888429aae53a166a555d95f99f63594883.scope - libcontainer container 42f9884fe921b2c8a0ca873a493b82888429aae53a166a555d95f99f63594883. Mar 25 01:16:47.017731 kubelet[2592]: I0325 01:16:47.017368 2592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:16:47.075808 containerd[1481]: time="2025-03-25T01:16:47.075766261Z" level=info msg="StartContainer for \"42f9884fe921b2c8a0ca873a493b82888429aae53a166a555d95f99f63594883\" returns successfully" Mar 25 01:16:47.147571 systemd-networkd[1422]: cali5b41a49a908: Link UP Mar 25 01:16:47.147808 systemd-networkd[1422]: cali5b41a49a908: Gained carrier Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:46.956 [INFO][4523] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:46.972 [INFO][4523] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0 calico-apiserver-8c9cf7c59- calico-apiserver e10f7734-b525-4202-aa84-b2555758598b 653 0 2025-03-25 01:16:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8c9cf7c59 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8c9cf7c59-hf464 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5b41a49a908 [] []}} ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:46.972 [INFO][4523] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.008 [INFO][4542] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" HandleID="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Workload="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.025 [INFO][4542] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" HandleID="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Workload="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, 
Num6:0, HandleID:(*string)(0x40002d8e60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8c9cf7c59-hf464", "timestamp":"2025-03-25 01:16:47.008725623 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.025 [INFO][4542] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.026 [INFO][4542] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.026 [INFO][4542] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.028 [INFO][4542] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.118 [INFO][4542] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.124 [INFO][4542] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.126 [INFO][4542] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.129 [INFO][4542] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.129 [INFO][4542] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.130 [INFO][4542] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.137 [INFO][4542] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.143 [INFO][4542] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.143 [INFO][4542] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" host="localhost" Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.143 [INFO][4542] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:16:47.164578 containerd[1481]: 2025-03-25 01:16:47.143 [INFO][4542] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" HandleID="k8s-pod-network.e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Workload="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" Mar 25 01:16:47.165134 containerd[1481]: 2025-03-25 01:16:47.145 [INFO][4523] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0", GenerateName:"calico-apiserver-8c9cf7c59-", Namespace:"calico-apiserver", SelfLink:"", UID:"e10f7734-b525-4202-aa84-b2555758598b", ResourceVersion:"653", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8c9cf7c59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8c9cf7c59-hf464", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5b41a49a908", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:47.165134 containerd[1481]: 2025-03-25 01:16:47.145 [INFO][4523] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" Mar 25 01:16:47.165134 containerd[1481]: 2025-03-25 01:16:47.145 [INFO][4523] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5b41a49a908 ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" Mar 25 01:16:47.165134 containerd[1481]: 2025-03-25 01:16:47.148 [INFO][4523] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" Mar 25 01:16:47.165134 containerd[1481]: 2025-03-25 01:16:47.148 [INFO][4523] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" 
Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0", GenerateName:"calico-apiserver-8c9cf7c59-", Namespace:"calico-apiserver", SelfLink:"", UID:"e10f7734-b525-4202-aa84-b2555758598b", ResourceVersion:"653", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 16, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8c9cf7c59", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc", Pod:"calico-apiserver-8c9cf7c59-hf464", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5b41a49a908", MAC:"52:e7:44:e1:df:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:16:47.165134 containerd[1481]: 2025-03-25 01:16:47.161 [INFO][4523] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" Namespace="calico-apiserver" Pod="calico-apiserver-8c9cf7c59-hf464" WorkloadEndpoint="localhost-k8s-calico--apiserver--8c9cf7c59--hf464-eth0" Mar 25 01:16:47.188692 containerd[1481]: time="2025-03-25T01:16:47.188606572Z" level=info msg="connecting to shim e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc" address="unix:///run/containerd/s/6f35839fcfea4375bb59f0e76b96bde21629d124c680053c759396801b6096f1" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:16:47.210047 systemd[1]: Started cri-containerd-e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc.scope - libcontainer container e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc. 
Mar 25 01:16:47.213985 systemd-networkd[1422]: cali91369175376: Gained IPv6LL Mar 25 01:16:47.224974 systemd-resolved[1332]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 25 01:16:47.253685 containerd[1481]: time="2025-03-25T01:16:47.253475985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8c9cf7c59-hf464,Uid:e10f7734-b525-4202-aa84-b2555758598b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc\"" Mar 25 01:16:47.258362 containerd[1481]: time="2025-03-25T01:16:47.258232289Z" level=info msg="CreateContainer within sandbox \"e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:16:47.265595 containerd[1481]: time="2025-03-25T01:16:47.265541152Z" level=info msg="Container 6794ae19eafd2465fbe24353feb523593575cbf9b0df6a02861da48609336634: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:47.272364 containerd[1481]: time="2025-03-25T01:16:47.272314757Z" level=info msg="CreateContainer within sandbox \"e9c6cb6f349915540c1ff50d4c2c2fdd404e0ba73507934a30db4c99995d12cc\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6794ae19eafd2465fbe24353feb523593575cbf9b0df6a02861da48609336634\"" Mar 25 01:16:47.274454 containerd[1481]: time="2025-03-25T01:16:47.273325988Z" level=info msg="StartContainer for \"6794ae19eafd2465fbe24353feb523593575cbf9b0df6a02861da48609336634\"" Mar 25 01:16:47.274454 containerd[1481]: time="2025-03-25T01:16:47.274345659Z" level=info msg="connecting to shim 6794ae19eafd2465fbe24353feb523593575cbf9b0df6a02861da48609336634" address="unix:///run/containerd/s/6f35839fcfea4375bb59f0e76b96bde21629d124c680053c759396801b6096f1" protocol=ttrpc version=3 Mar 25 01:16:47.296038 systemd[1]: Started cri-containerd-6794ae19eafd2465fbe24353feb523593575cbf9b0df6a02861da48609336634.scope - libcontainer container 6794ae19eafd2465fbe24353feb523593575cbf9b0df6a02861da48609336634. 
Mar 25 01:16:47.349666 containerd[1481]: time="2025-03-25T01:16:47.349617308Z" level=info msg="StartContainer for \"6794ae19eafd2465fbe24353feb523593575cbf9b0df6a02861da48609336634\" returns successfully" Mar 25 01:16:47.383886 kernel: bpftool[4688]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:16:47.670100 systemd-networkd[1422]: vxlan.calico: Link UP Mar 25 01:16:47.670107 systemd-networkd[1422]: vxlan.calico: Gained carrier Mar 25 01:16:47.959848 kubelet[2592]: I0325 01:16:47.959537 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8c9cf7c59-zl2f7" podStartSLOduration=22.048323304 podStartE2EDuration="23.959520372s" podCreationTimestamp="2025-03-25 01:16:24 +0000 UTC" firstStartedPulling="2025-03-25 01:16:45.046058677 +0000 UTC m=+35.336001865" lastFinishedPulling="2025-03-25 01:16:46.957255785 +0000 UTC m=+37.247198933" observedRunningTime="2025-03-25 01:16:47.943718171 +0000 UTC m=+38.233661359" watchObservedRunningTime="2025-03-25 01:16:47.959520372 +0000 UTC m=+38.249463560" Mar 25 01:16:47.960654 kubelet[2592]: I0325 01:16:47.960541 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8c9cf7c59-hf464" podStartSLOduration=23.960532283 podStartE2EDuration="23.960532283s" podCreationTimestamp="2025-03-25 01:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:16:47.957238262 +0000 UTC m=+38.247181450" watchObservedRunningTime="2025-03-25 01:16:47.960532283 +0000 UTC m=+38.250475471" Mar 25 01:16:48.204782 containerd[1481]: time="2025-03-25T01:16:48.204734361Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:48.206049 containerd[1481]: time="2025-03-25T01:16:48.205441662Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 25 01:16:48.206428 containerd[1481]: time="2025-03-25T01:16:48.206341009Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:48.208528 containerd[1481]: time="2025-03-25T01:16:48.208472312Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:48.209394 containerd[1481]: time="2025-03-25T01:16:48.209351418Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.251059441s" Mar 25 01:16:48.209394 containerd[1481]: time="2025-03-25T01:16:48.209386259Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 25 01:16:48.212422 containerd[1481]: time="2025-03-25T01:16:48.211796291Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:16:48.213405 containerd[1481]: time="2025-03-25T01:16:48.213362377Z" level=info msg="CreateContainer within sandbox \"84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:16:48.232409 containerd[1481]: time="2025-03-25T01:16:48.232356421Z" level=info msg="Container 3faeafd1c0a61539365ca5aab6b5f4e28da16d61f2cbd8bc2f9cd15a649c9219: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:48.245243 containerd[1481]: time="2025-03-25T01:16:48.244615145Z" level=info msg="CreateContainer within sandbox \"84d8e2dc347ab13abed20b2c447977405657a156f89cecef72b0cd7b3e72189d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"3faeafd1c0a61539365ca5aab6b5f4e28da16d61f2cbd8bc2f9cd15a649c9219\"" Mar 25 01:16:48.245604 containerd[1481]: time="2025-03-25T01:16:48.245584974Z" level=info msg="StartContainer for \"3faeafd1c0a61539365ca5aab6b5f4e28da16d61f2cbd8bc2f9cd15a649c9219\"" Mar 25 01:16:48.247586 containerd[1481]: time="2025-03-25T01:16:48.247426629Z" level=info msg="connecting to shim 3faeafd1c0a61539365ca5aab6b5f4e28da16d61f2cbd8bc2f9cd15a649c9219" address="unix:///run/containerd/s/1615a2c30ce4b0be95c92ed6f0015d5d87874b16ade1a2190e03e88a6693c493" protocol=ttrpc version=3 Mar 25 01:16:48.286080 systemd[1]: Started cri-containerd-3faeafd1c0a61539365ca5aab6b5f4e28da16d61f2cbd8bc2f9cd15a649c9219.scope - libcontainer container 3faeafd1c0a61539365ca5aab6b5f4e28da16d61f2cbd8bc2f9cd15a649c9219. Mar 25 01:16:48.344440 containerd[1481]: time="2025-03-25T01:16:48.344401228Z" level=info msg="StartContainer for \"3faeafd1c0a61539365ca5aab6b5f4e28da16d61f2cbd8bc2f9cd15a649c9219\" returns successfully" Mar 25 01:16:48.429064 systemd-networkd[1422]: cali5b41a49a908: Gained IPv6LL Mar 25 01:16:48.537567 systemd[1]: Started sshd@9-10.0.0.53:22-10.0.0.1:55102.service - OpenSSH per-connection server daemon (10.0.0.1:55102). Mar 25 01:16:48.614611 sshd[4861]: Accepted publickey for core from 10.0.0.1 port 55102 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:48.617108 sshd-session[4861]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:48.627549 systemd-logind[1468]: New session 10 of user core. Mar 25 01:16:48.633074 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:16:48.823245 sshd[4863]: Connection closed by 10.0.0.1 port 55102 Mar 25 01:16:48.824176 sshd-session[4861]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:48.832567 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:16:48.834980 systemd[1]: sshd@9-10.0.0.53:22-10.0.0.1:55102.service: Deactivated successfully. Mar 25 01:16:48.838698 systemd-logind[1468]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:16:48.843115 systemd[1]: Started sshd@10-10.0.0.53:22-10.0.0.1:55106.service - OpenSSH per-connection server daemon (10.0.0.1:55106). Mar 25 01:16:48.845140 systemd-logind[1468]: Removed session 10. 
Mar 25 01:16:48.863656 kubelet[2592]: I0325 01:16:48.863573 2592 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 01:16:48.868148 kubelet[2592]: I0325 01:16:48.868115 2592 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 01:16:48.909838 sshd[4876]: Accepted publickey for core from 10.0.0.1 port 55106 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:48.911337 sshd-session[4876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:48.920731 systemd-logind[1468]: New session 11 of user core. Mar 25 01:16:48.927023 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 25 01:16:48.957837 kubelet[2592]: I0325 01:16:48.957195 2592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:16:48.970483 kubelet[2592]: I0325 01:16:48.970250 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dzt9d" podStartSLOduration=21.25531131 podStartE2EDuration="24.970231368s" podCreationTimestamp="2025-03-25 01:16:24 +0000 UTC" firstStartedPulling="2025-03-25 01:16:44.496734309 +0000 UTC m=+34.786677497" lastFinishedPulling="2025-03-25 01:16:48.211654367 +0000 UTC m=+38.501597555" observedRunningTime="2025-03-25 01:16:48.969679951 +0000 UTC m=+39.259623139" watchObservedRunningTime="2025-03-25 01:16:48.970231368 +0000 UTC m=+39.260174556" Mar 25 01:16:49.195260 sshd[4879]: Connection closed by 10.0.0.1 port 55106 Mar 25 01:16:49.195898 sshd-session[4876]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:49.204763 systemd[1]: sshd@10-10.0.0.53:22-10.0.0.1:55106.service: Deactivated successfully. Mar 25 01:16:49.208235 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:16:49.209326 systemd-logind[1468]: Session 11 logged out. Waiting for processes to exit. Mar 25 01:16:49.215465 systemd[1]: Started sshd@11-10.0.0.53:22-10.0.0.1:55122.service - OpenSSH per-connection server daemon (10.0.0.1:55122). Mar 25 01:16:49.216840 systemd-logind[1468]: Removed session 11. Mar 25 01:16:49.278986 sshd[4896]: Accepted publickey for core from 10.0.0.1 port 55122 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:49.280302 sshd-session[4896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:49.288550 systemd-logind[1468]: New session 12 of user core. Mar 25 01:16:49.304031 systemd[1]: Started session-12.scope - Session 12 of User core. Mar 25 01:16:49.453054 systemd-networkd[1422]: vxlan.calico: Gained IPv6LL Mar 25 01:16:49.547927 sshd[4899]: Connection closed by 10.0.0.1 port 55122 Mar 25 01:16:49.550272 sshd-session[4896]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:49.552814 systemd[1]: sshd@11-10.0.0.53:22-10.0.0.1:55122.service: Deactivated successfully. Mar 25 01:16:49.554807 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:16:49.558461 systemd-logind[1468]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:16:49.559535 systemd-logind[1468]: Removed session 12. 
Mar 25 01:16:49.833781 containerd[1481]: time="2025-03-25T01:16:49.833401595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:49.834222 containerd[1481]: time="2025-03-25T01:16:49.834010973Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 25 01:16:49.835955 containerd[1481]: time="2025-03-25T01:16:49.835143565Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:49.837649 containerd[1481]: time="2025-03-25T01:16:49.837620757Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:16:49.840465 containerd[1481]: time="2025-03-25T01:16:49.840358997Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 1.628514384s" Mar 25 01:16:49.840523 containerd[1481]: time="2025-03-25T01:16:49.840469680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 25 01:16:49.851811 containerd[1481]: time="2025-03-25T01:16:49.851766208Z" level=info msg="CreateContainer within sandbox \"9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:16:49.857116 containerd[1481]: time="2025-03-25T01:16:49.857081442Z" level=info msg="Container 69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:16:49.865237 containerd[1481]: time="2025-03-25T01:16:49.865195957Z" level=info msg="CreateContainer within sandbox \"9e45b056531a956c5a54e327dc15f5c2052b419ec853add0cd460036976dbecb\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a\"" Mar 25 01:16:49.865655 containerd[1481]: time="2025-03-25T01:16:49.865630770Z" level=info msg="StartContainer for \"69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a\"" Mar 25 01:16:49.866866 containerd[1481]: time="2025-03-25T01:16:49.866824045Z" level=info msg="connecting to shim 69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a" address="unix:///run/containerd/s/832a98967a43c266c4ab30a2115c94c2683dc44698442308871c81a317824ae7" protocol=ttrpc version=3 Mar 25 01:16:49.894027 systemd[1]: Started cri-containerd-69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a.scope - libcontainer container 69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a. 
Mar 25 01:16:49.983952 containerd[1481]: time="2025-03-25T01:16:49.983903202Z" level=info msg="StartContainer for \"69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a\" returns successfully" Mar 25 01:16:50.169722 kubelet[2592]: I0325 01:16:50.169571 2592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:16:50.298744 containerd[1481]: time="2025-03-25T01:16:50.298703829Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17\" id:\"645bd6efda2c67183967d56c4393b9488b9d87c396ccdac997828428922fcc88\" pid:4962 exit_status:1 exited_at:{seconds:1742865410 nanos:298390861}" Mar 25 01:16:50.373317 containerd[1481]: time="2025-03-25T01:16:50.373279106Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9f6b4e1432693a949302362bf226e1cf53fce4534ec67598a923f4f969dadd17\" id:\"e25f7709867f780653318277787839abddef504e2b1631bbe0124762f130f03c\" pid:4986 exit_status:1 exited_at:{seconds:1742865410 nanos:372801773}" Mar 25 01:16:50.996157 kubelet[2592]: I0325 01:16:50.996098 2592 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-68dbc864b6-dsj69" podStartSLOduration=23.158157216 podStartE2EDuration="26.996079346s" podCreationTimestamp="2025-03-25 01:16:24 +0000 UTC" firstStartedPulling="2025-03-25 01:16:46.003312452 +0000 UTC m=+36.293255640" lastFinishedPulling="2025-03-25 01:16:49.841234582 +0000 UTC m=+40.131177770" observedRunningTime="2025-03-25 01:16:50.994952474 +0000 UTC m=+41.284895662" watchObservedRunningTime="2025-03-25 01:16:50.996079346 +0000 UTC m=+41.286022494" Mar 25 01:16:54.560441 systemd[1]: Started sshd@12-10.0.0.53:22-10.0.0.1:55172.service - OpenSSH per-connection server daemon (10.0.0.1:55172). Mar 25 01:16:54.649812 sshd[5007]: Accepted publickey for core from 10.0.0.1 port 55172 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:54.651358 sshd-session[5007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:54.656919 systemd-logind[1468]: New session 13 of user core. Mar 25 01:16:54.666072 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 25 01:16:54.840628 sshd[5011]: Connection closed by 10.0.0.1 port 55172 Mar 25 01:16:54.841399 sshd-session[5007]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:54.851239 systemd[1]: sshd@12-10.0.0.53:22-10.0.0.1:55172.service: Deactivated successfully. Mar 25 01:16:54.852972 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:16:54.853650 systemd-logind[1468]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:16:54.855431 systemd[1]: Started sshd@13-10.0.0.53:22-10.0.0.1:55182.service - OpenSSH per-connection server daemon (10.0.0.1:55182). Mar 25 01:16:54.856289 systemd-logind[1468]: Removed session 13. Mar 25 01:16:54.916985 sshd[5024]: Accepted publickey for core from 10.0.0.1 port 55182 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:54.918149 sshd-session[5024]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:54.922663 systemd-logind[1468]: New session 14 of user core. Mar 25 01:16:54.928995 systemd[1]: Started session-14.scope - Session 14 of User core. 
Mar 25 01:16:55.134712 sshd[5027]: Connection closed by 10.0.0.1 port 55182 Mar 25 01:16:55.134505 sshd-session[5024]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:55.151235 systemd[1]: sshd@13-10.0.0.53:22-10.0.0.1:55182.service: Deactivated successfully. Mar 25 01:16:55.153525 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:16:55.154401 systemd-logind[1468]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:16:55.156323 systemd[1]: Started sshd@14-10.0.0.53:22-10.0.0.1:55194.service - OpenSSH per-connection server daemon (10.0.0.1:55194). Mar 25 01:16:55.157129 systemd-logind[1468]: Removed session 14. Mar 25 01:16:55.218596 sshd[5038]: Accepted publickey for core from 10.0.0.1 port 55194 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:55.219886 sshd-session[5038]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:55.223906 systemd-logind[1468]: New session 15 of user core. Mar 25 01:16:55.231001 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:16:56.793523 sshd[5041]: Connection closed by 10.0.0.1 port 55194 Mar 25 01:16:56.793908 sshd-session[5038]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:56.804755 systemd[1]: sshd@14-10.0.0.53:22-10.0.0.1:55194.service: Deactivated successfully. Mar 25 01:16:56.807989 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:16:56.808256 systemd[1]: session-15.scope: Consumed 497ms CPU time, 71.5M memory peak. Mar 25 01:16:56.809319 systemd-logind[1468]: Session 15 logged out. Waiting for processes to exit. Mar 25 01:16:56.813584 systemd[1]: Started sshd@15-10.0.0.53:22-10.0.0.1:55208.service - OpenSSH per-connection server daemon (10.0.0.1:55208). Mar 25 01:16:56.816666 systemd-logind[1468]: Removed session 15. Mar 25 01:16:56.876036 sshd[5062]: Accepted publickey for core from 10.0.0.1 port 55208 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:56.877204 sshd-session[5062]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:56.881094 systemd-logind[1468]: New session 16 of user core. Mar 25 01:16:56.889995 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 25 01:16:57.224639 sshd[5065]: Connection closed by 10.0.0.1 port 55208 Mar 25 01:16:57.226066 sshd-session[5062]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:57.233352 systemd[1]: sshd@15-10.0.0.53:22-10.0.0.1:55208.service: Deactivated successfully. Mar 25 01:16:57.236624 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:16:57.239985 systemd-logind[1468]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:16:57.242636 systemd[1]: Started sshd@16-10.0.0.53:22-10.0.0.1:55210.service - OpenSSH per-connection server daemon (10.0.0.1:55210). Mar 25 01:16:57.244102 systemd-logind[1468]: Removed session 16. Mar 25 01:16:57.295955 sshd[5076]: Accepted publickey for core from 10.0.0.1 port 55210 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:16:57.297223 sshd-session[5076]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:16:57.301759 systemd-logind[1468]: New session 17 of user core. Mar 25 01:16:57.319037 systemd[1]: Started session-17.scope - Session 17 of User core. 
Mar 25 01:16:57.456305 sshd[5079]: Connection closed by 10.0.0.1 port 55210 Mar 25 01:16:57.456654 sshd-session[5076]: pam_unix(sshd:session): session closed for user core Mar 25 01:16:57.459964 systemd[1]: sshd@16-10.0.0.53:22-10.0.0.1:55210.service: Deactivated successfully. Mar 25 01:16:57.461755 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:16:57.462402 systemd-logind[1468]: Session 17 logged out. Waiting for processes to exit. Mar 25 01:16:57.463518 systemd-logind[1468]: Removed session 17. Mar 25 01:16:57.816641 kubelet[2592]: I0325 01:16:57.816602 2592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:16:57.868760 containerd[1481]: time="2025-03-25T01:16:57.868719435Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a\" id:\"36106458127997533dc95182dc72fb66c69d56a89eb83adcacedd23b454263f9\" pid:5103 exited_at:{seconds:1742865417 nanos:868355106}" Mar 25 01:16:57.899711 containerd[1481]: time="2025-03-25T01:16:57.899664007Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69cf366349ba4a64b1203f05b2fbb77bebf6dcdaad48162b8a0cebad7567598a\" id:\"7362fbe7402067b194842ae800722bc86ebd96a9acbde5c57306e79dddf2094d\" pid:5124 exited_at:{seconds:1742865417 nanos:899485763}" Mar 25 01:17:02.468276 systemd[1]: Started sshd@17-10.0.0.53:22-10.0.0.1:41128.service - OpenSSH per-connection server daemon (10.0.0.1:41128). Mar 25 01:17:02.525208 sshd[5143]: Accepted publickey for core from 10.0.0.1 port 41128 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:17:02.526504 sshd-session[5143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:02.530940 systemd-logind[1468]: New session 18 of user core. Mar 25 01:17:02.537074 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 25 01:17:02.666202 sshd[5145]: Connection closed by 10.0.0.1 port 41128 Mar 25 01:17:02.666553 sshd-session[5143]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:02.670201 systemd[1]: sshd@17-10.0.0.53:22-10.0.0.1:41128.service: Deactivated successfully. Mar 25 01:17:02.672080 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:17:02.672659 systemd-logind[1468]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:17:02.673394 systemd-logind[1468]: Removed session 18. Mar 25 01:17:07.679626 systemd[1]: Started sshd@18-10.0.0.53:22-10.0.0.1:41134.service - OpenSSH per-connection server daemon (10.0.0.1:41134). Mar 25 01:17:07.734721 sshd[5163]: Accepted publickey for core from 10.0.0.1 port 41134 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:17:07.736103 sshd-session[5163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:07.742618 systemd-logind[1468]: New session 19 of user core. Mar 25 01:17:07.752008 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 25 01:17:07.869811 sshd[5165]: Connection closed by 10.0.0.1 port 41134 Mar 25 01:17:07.870531 sshd-session[5163]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:07.873979 systemd[1]: sshd@18-10.0.0.53:22-10.0.0.1:41134.service: Deactivated successfully. Mar 25 01:17:07.876371 systemd[1]: session-19.scope: Deactivated successfully. Mar 25 01:17:07.877733 systemd-logind[1468]: Session 19 logged out. Waiting for processes to exit. Mar 25 01:17:07.878617 systemd-logind[1468]: Removed session 19. 
Mar 25 01:17:12.883322 systemd[1]: Started sshd@19-10.0.0.53:22-10.0.0.1:44252.service - OpenSSH per-connection server daemon (10.0.0.1:44252). Mar 25 01:17:12.931548 sshd[5189]: Accepted publickey for core from 10.0.0.1 port 44252 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:17:12.932707 sshd-session[5189]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:12.936990 systemd-logind[1468]: New session 20 of user core. Mar 25 01:17:12.952092 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 25 01:17:13.076322 sshd[5191]: Connection closed by 10.0.0.1 port 44252 Mar 25 01:17:13.077150 sshd-session[5189]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:13.079775 systemd[1]: sshd@19-10.0.0.53:22-10.0.0.1:44252.service: Deactivated successfully. Mar 25 01:17:13.085813 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:17:13.087296 systemd-logind[1468]: Session 20 logged out. Waiting for processes to exit. Mar 25 01:17:13.089403 systemd-logind[1468]: Removed session 20. Mar 25 01:17:17.083468 kubelet[2592]: I0325 01:17:17.083428 2592 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:17:18.089322 systemd[1]: Started sshd@20-10.0.0.53:22-10.0.0.1:44256.service - OpenSSH per-connection server daemon (10.0.0.1:44256). Mar 25 01:17:18.143850 sshd[5214]: Accepted publickey for core from 10.0.0.1 port 44256 ssh2: RSA SHA256:RyyrKoKHvyGTiWIDeMwuNNfmpVLXChNPYxUIZdc99cw Mar 25 01:17:18.145041 sshd-session[5214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:17:18.148826 systemd-logind[1468]: New session 21 of user core. Mar 25 01:17:18.162029 systemd[1]: Started session-21.scope - Session 21 of User core. Mar 25 01:17:18.293042 sshd[5216]: Connection closed by 10.0.0.1 port 44256 Mar 25 01:17:18.293558 sshd-session[5214]: pam_unix(sshd:session): session closed for user core Mar 25 01:17:18.297103 systemd[1]: sshd@20-10.0.0.53:22-10.0.0.1:44256.service: Deactivated successfully. Mar 25 01:17:18.300420 systemd[1]: session-21.scope: Deactivated successfully. Mar 25 01:17:18.301138 systemd-logind[1468]: Session 21 logged out. Waiting for processes to exit. Mar 25 01:17:18.301903 systemd-logind[1468]: Removed session 21.