Mar 25 01:25:51.901962 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Mar 25 01:25:51.901990 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Mon Mar 24 23:39:14 -00 2025
Mar 25 01:25:51.902001 kernel: KASLR enabled
Mar 25 01:25:51.902006 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Mar 25 01:25:51.902012 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Mar 25 01:25:51.902018 kernel: random: crng init done
Mar 25 01:25:51.902024 kernel: secureboot: Secure boot disabled
Mar 25 01:25:51.902030 kernel: ACPI: Early table checksum verification disabled
Mar 25 01:25:51.902036 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Mar 25 01:25:51.902044 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Mar 25 01:25:51.902050 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902056 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902062 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902068 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902075 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902082 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902088 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902094 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902100 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:25:51.902107 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Mar 25 01:25:51.902113 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Mar 25 01:25:51.902119 kernel: NUMA: Failed to initialise from firmware
Mar 25 01:25:51.902125 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Mar 25 01:25:51.902857 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff]
Mar 25 01:25:51.902878 kernel: Zone ranges:
Mar 25 01:25:51.902891 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Mar 25 01:25:51.902898 kernel: DMA32 empty
Mar 25 01:25:51.902904 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Mar 25 01:25:51.902910 kernel: Movable zone start for each node
Mar 25 01:25:51.902916 kernel: Early memory node ranges
Mar 25 01:25:51.902923 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Mar 25 01:25:51.902929 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Mar 25 01:25:51.902935 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Mar 25 01:25:51.902941 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Mar 25 01:25:51.902947 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Mar 25 01:25:51.902953 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Mar 25 01:25:51.902959 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Mar 25 01:25:51.902967 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Mar 25 01:25:51.902973 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Mar 25 01:25:51.902980 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Mar 25 01:25:51.902989 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Mar 25 01:25:51.902997 kernel: psci: probing for conduit method from ACPI.
Mar 25 01:25:51.903003 kernel: psci: PSCIv1.1 detected in firmware.
Mar 25 01:25:51.903012 kernel: psci: Using standard PSCI v0.2 function IDs
Mar 25 01:25:51.903018 kernel: psci: Trusted OS migration not required
Mar 25 01:25:51.903025 kernel: psci: SMC Calling Convention v1.1
Mar 25 01:25:51.903031 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Mar 25 01:25:51.903038 kernel: percpu: Embedded 31 pages/cpu s86696 r8192 d32088 u126976
Mar 25 01:25:51.903044 kernel: pcpu-alloc: s86696 r8192 d32088 u126976 alloc=31*4096
Mar 25 01:25:51.903051 kernel: pcpu-alloc: [0] 0 [0] 1
Mar 25 01:25:51.903058 kernel: Detected PIPT I-cache on CPU0
Mar 25 01:25:51.903064 kernel: CPU features: detected: GIC system register CPU interface
Mar 25 01:25:51.903071 kernel: CPU features: detected: Hardware dirty bit management
Mar 25 01:25:51.903079 kernel: CPU features: detected: Spectre-v4
Mar 25 01:25:51.903086 kernel: CPU features: detected: Spectre-BHB
Mar 25 01:25:51.903092 kernel: CPU features: kernel page table isolation forced ON by KASLR
Mar 25 01:25:51.903099 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Mar 25 01:25:51.903105 kernel: CPU features: detected: ARM erratum 1418040
Mar 25 01:25:51.903111 kernel: CPU features: detected: SSBS not fully self-synchronizing
Mar 25 01:25:51.903118 kernel: alternatives: applying boot alternatives
Mar 25 01:25:51.903125 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab
Mar 25 01:25:51.903133 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 25 01:25:51.903139 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Mar 25 01:25:51.903146 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 25 01:25:51.903154 kernel: Fallback order for Node 0: 0
Mar 25 01:25:51.903160 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
Mar 25 01:25:51.903167 kernel: Policy zone: Normal
Mar 25 01:25:51.903173 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 25 01:25:51.903180 kernel: software IO TLB: area num 2.
Mar 25 01:25:51.903186 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
Mar 25 01:25:51.903193 kernel: Memory: 3883768K/4096000K available (10304K kernel code, 2186K rwdata, 8096K rodata, 38464K init, 897K bss, 212232K reserved, 0K cma-reserved)
Mar 25 01:25:51.903199 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 25 01:25:51.903206 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 25 01:25:51.903213 kernel: rcu: RCU event tracing is enabled.
Mar 25 01:25:51.903219 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 25 01:25:51.903226 kernel: Trampoline variant of Tasks RCU enabled.
Mar 25 01:25:51.903234 kernel: Tracing variant of Tasks RCU enabled.
Mar 25 01:25:51.903241 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 25 01:25:51.903247 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 25 01:25:51.903254 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Mar 25 01:25:51.903260 kernel: GICv3: 256 SPIs implemented
Mar 25 01:25:51.903267 kernel: GICv3: 0 Extended SPIs implemented
Mar 25 01:25:51.903273 kernel: Root IRQ handler: gic_handle_irq
Mar 25 01:25:51.903279 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Mar 25 01:25:51.903286 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Mar 25 01:25:51.903292 kernel: ITS [mem 0x08080000-0x0809ffff]
Mar 25 01:25:51.903299 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
Mar 25 01:25:51.903307 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
Mar 25 01:25:51.903314 kernel: GICv3: using LPI property table @0x00000001000e0000
Mar 25 01:25:51.903320 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
Mar 25 01:25:51.903327 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 25 01:25:51.903333 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 25 01:25:51.903340 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Mar 25 01:25:51.903346 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Mar 25 01:25:51.903353 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Mar 25 01:25:51.903359 kernel: Console: colour dummy device 80x25
Mar 25 01:25:51.903366 kernel: ACPI: Core revision 20230628
Mar 25 01:25:51.903373 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Mar 25 01:25:51.903382 kernel: pid_max: default: 32768 minimum: 301
Mar 25 01:25:51.903389 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 25 01:25:51.903395 kernel: landlock: Up and running.
Mar 25 01:25:51.903402 kernel: SELinux: Initializing.
Mar 25 01:25:51.903409 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 25 01:25:51.903415 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Mar 25 01:25:51.903422 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:25:51.903429 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:25:51.903436 kernel: rcu: Hierarchical SRCU implementation.
Mar 25 01:25:51.903444 kernel: rcu: Max phase no-delay instances is 400.
Mar 25 01:25:51.903451 kernel: Platform MSI: ITS@0x8080000 domain created
Mar 25 01:25:51.903457 kernel: PCI/MSI: ITS@0x8080000 domain created
Mar 25 01:25:51.903464 kernel: Remapping and enabling EFI services.
Mar 25 01:25:51.903471 kernel: smp: Bringing up secondary CPUs ...
Mar 25 01:25:51.903477 kernel: Detected PIPT I-cache on CPU1
Mar 25 01:25:51.903484 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Mar 25 01:25:51.903490 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
Mar 25 01:25:51.903497 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Mar 25 01:25:51.903505 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Mar 25 01:25:51.903512 kernel: smp: Brought up 1 node, 2 CPUs
Mar 25 01:25:51.903524 kernel: SMP: Total of 2 processors activated.
Mar 25 01:25:51.903533 kernel: CPU features: detected: 32-bit EL0 Support
Mar 25 01:25:51.903540 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Mar 25 01:25:51.903547 kernel: CPU features: detected: Common not Private translations
Mar 25 01:25:51.903553 kernel: CPU features: detected: CRC32 instructions
Mar 25 01:25:51.903560 kernel: CPU features: detected: Enhanced Virtualization Traps
Mar 25 01:25:51.903568 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Mar 25 01:25:51.903576 kernel: CPU features: detected: LSE atomic instructions
Mar 25 01:25:51.903583 kernel: CPU features: detected: Privileged Access Never
Mar 25 01:25:51.903590 kernel: CPU features: detected: RAS Extension Support
Mar 25 01:25:51.903597 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Mar 25 01:25:51.903603 kernel: CPU: All CPU(s) started at EL1
Mar 25 01:25:51.903610 kernel: alternatives: applying system-wide alternatives
Mar 25 01:25:51.903617 kernel: devtmpfs: initialized
Mar 25 01:25:51.903625 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 25 01:25:51.903633 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 25 01:25:51.903640 kernel: pinctrl core: initialized pinctrl subsystem
Mar 25 01:25:51.903648 kernel: SMBIOS 3.0.0 present.
Mar 25 01:25:51.903655 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Mar 25 01:25:51.903672 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 25 01:25:51.903680 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Mar 25 01:25:51.903687 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Mar 25 01:25:51.903694 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Mar 25 01:25:51.903701 kernel: audit: initializing netlink subsys (disabled)
Mar 25 01:25:51.903711 kernel: audit: type=2000 audit(0.012:1): state=initialized audit_enabled=0 res=1
Mar 25 01:25:51.903718 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 25 01:25:51.903725 kernel: cpuidle: using governor menu
Mar 25 01:25:51.903732 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Mar 25 01:25:51.903739 kernel: ASID allocator initialised with 32768 entries
Mar 25 01:25:51.903746 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 25 01:25:51.903753 kernel: Serial: AMBA PL011 UART driver
Mar 25 01:25:51.903760 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Mar 25 01:25:51.903767 kernel: Modules: 0 pages in range for non-PLT usage
Mar 25 01:25:51.903775 kernel: Modules: 509248 pages in range for PLT usage
Mar 25 01:25:51.903782 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 25 01:25:51.903789 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Mar 25 01:25:51.903796 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Mar 25 01:25:51.903803 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Mar 25 01:25:51.904836 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 25 01:25:51.904846 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Mar 25 01:25:51.904855 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Mar 25 01:25:51.904863 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Mar 25 01:25:51.904876 kernel: ACPI: Added _OSI(Module Device)
Mar 25 01:25:51.904883 kernel: ACPI: Added _OSI(Processor Device)
Mar 25 01:25:51.904890 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 25 01:25:51.904897 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 25 01:25:51.904904 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 25 01:25:51.904912 kernel: ACPI: Interpreter enabled
Mar 25 01:25:51.904919 kernel: ACPI: Using GIC for interrupt routing
Mar 25 01:25:51.904928 kernel: ACPI: MCFG table detected, 1 entries
Mar 25 01:25:51.904937 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Mar 25 01:25:51.904947 kernel: printk: console [ttyAMA0] enabled
Mar 25 01:25:51.904955 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 25 01:25:51.905139 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 25 01:25:51.905221 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Mar 25 01:25:51.905291 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Mar 25 01:25:51.905358 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Mar 25 01:25:51.905423 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Mar 25 01:25:51.905435 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Mar 25 01:25:51.905442 kernel: PCI host bridge to bus 0000:00
Mar 25 01:25:51.905518 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Mar 25 01:25:51.905583 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Mar 25 01:25:51.905645 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Mar 25 01:25:51.905726 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 25 01:25:51.906891 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
Mar 25 01:25:51.907013 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
Mar 25 01:25:51.907088 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
Mar 25 01:25:51.907159 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
Mar 25 01:25:51.907253 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.907322 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
Mar 25 01:25:51.907399 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.907476 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
Mar 25 01:25:51.907554 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.907625 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
Mar 25 01:25:51.907752 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.908899 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
Mar 25 01:25:51.909003 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.909073 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
Mar 25 01:25:51.909155 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.909221 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
Mar 25 01:25:51.909295 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.909362 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
Mar 25 01:25:51.909437 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.909505 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
Mar 25 01:25:51.909587 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Mar 25 01:25:51.909657 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
Mar 25 01:25:51.909752 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
Mar 25 01:25:51.910927 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
Mar 25 01:25:51.911040 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Mar 25 01:25:51.911113 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
Mar 25 01:25:51.911192 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 25 01:25:51.911261 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 25 01:25:51.911343 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 25 01:25:51.911415 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
Mar 25 01:25:51.911493 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Mar 25 01:25:51.911564 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
Mar 25 01:25:51.911634 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
Mar 25 01:25:51.911743 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Mar 25 01:25:51.912892 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
Mar 25 01:25:51.913019 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 25 01:25:51.913095 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff]
Mar 25 01:25:51.913168 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
Mar 25 01:25:51.913248 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Mar 25 01:25:51.913327 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
Mar 25 01:25:51.913397 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 25 01:25:51.913481 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Mar 25 01:25:51.913554 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
Mar 25 01:25:51.913635 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
Mar 25 01:25:51.913720 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
Mar 25 01:25:51.913798 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Mar 25 01:25:51.915112 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Mar 25 01:25:51.915194 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Mar 25 01:25:51.915271 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Mar 25 01:25:51.915343 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Mar 25 01:25:51.915410 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Mar 25 01:25:51.915482 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Mar 25 01:25:51.915565 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Mar 25 01:25:51.915643 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Mar 25 01:25:51.915762 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Mar 25 01:25:51.917470 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Mar 25 01:25:51.917560 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Mar 25 01:25:51.917637 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Mar 25 01:25:51.917762 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Mar 25 01:25:51.917856 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Mar 25 01:25:51.917943 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Mar 25 01:25:51.918035 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Mar 25 01:25:51.918104 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Mar 25 01:25:51.918176 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 25 01:25:51.918244 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Mar 25 01:25:51.918320 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Mar 25 01:25:51.918404 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 25 01:25:51.918487 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Mar 25 01:25:51.918559 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Mar 25 01:25:51.918654 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 25 01:25:51.918739 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Mar 25 01:25:51.919870 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Mar 25 01:25:51.919989 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
Mar 25 01:25:51.920061 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 25 01:25:51.920137 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
Mar 25 01:25:51.920210 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 25 01:25:51.920284 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
Mar 25 01:25:51.920351 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 25 01:25:51.920421 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
Mar 25 01:25:51.920488 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 25 01:25:51.920557 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
Mar 25 01:25:51.920627 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 25 01:25:51.920741 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
Mar 25 01:25:51.922062 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 25 01:25:51.922169 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
Mar 25 01:25:51.922238 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 25 01:25:51.922309 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
Mar 25 01:25:51.922375 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 25 01:25:51.922454 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
Mar 25 01:25:51.922523 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
Mar 25 01:25:51.922613 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
Mar 25 01:25:51.922699 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
Mar 25 01:25:51.922780 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
Mar 25 01:25:51.923545 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
Mar 25 01:25:51.923644 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
Mar 25 01:25:51.923775 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
Mar 25 01:25:51.923938 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
Mar 25 01:25:51.924012 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
Mar 25 01:25:51.924085 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
Mar 25 01:25:51.924156 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
Mar 25 01:25:51.924229 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
Mar 25 01:25:51.924296 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
Mar 25 01:25:51.924378 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
Mar 25 01:25:51.924447 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
Mar 25 01:25:51.924524 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
Mar 25 01:25:51.924592 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
Mar 25 01:25:51.924677 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
Mar 25 01:25:51.924746 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
Mar 25 01:25:51.924855 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
Mar 25 01:25:51.924929 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
Mar 25 01:25:51.925006 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
Mar 25 01:25:51.925089 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
Mar 25 01:25:51.925166 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
Mar 25 01:25:51.925255 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
Mar 25 01:25:51.925335 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 25 01:25:51.925400 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Mar 25 01:25:51.925469 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Mar 25 01:25:51.925535 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 25 01:25:51.925612 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
Mar 25 01:25:51.925699 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 25 01:25:51.925767 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Mar 25 01:25:51.925891 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Mar 25 01:25:51.925964 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 25 01:25:51.926037 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
Mar 25 01:25:51.926112 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
Mar 25 01:25:51.926179 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 25 01:25:51.926245 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Mar 25 01:25:51.926309 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Mar 25 01:25:51.926376 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 25 01:25:51.926449 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
Mar 25 01:25:51.926519 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 25 01:25:51.926584 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Mar 25 01:25:51.926651 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Mar 25 01:25:51.926781 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 25 01:25:51.926986 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
Mar 25 01:25:51.927060 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff]
Mar 25 01:25:51.927127 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 25 01:25:51.927194 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Mar 25 01:25:51.927264 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Mar 25 01:25:51.927335 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 25 01:25:51.927415 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
Mar 25 01:25:51.927486 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
Mar 25 01:25:51.927556 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 25 01:25:51.927622 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Mar 25 01:25:51.927708 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Mar 25 01:25:51.927779 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 25 01:25:51.927893 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
Mar 25 01:25:51.927967 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
Mar 25 01:25:51.928041 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
Mar 25 01:25:51.928109 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 25 01:25:51.928175 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Mar 25 01:25:51.928240 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Mar 25 01:25:51.928306 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 25 01:25:51.928377 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 25 01:25:51.928443 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Mar 25 01:25:51.928510 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Mar 25 01:25:51.928575 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 25 01:25:51.928644 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 25 01:25:51.928727 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Mar 25 01:25:51.928795 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Mar 25 01:25:51.928920 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Mar 25 01:25:51.928993 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Mar 25 01:25:51.929052 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Mar 25 01:25:51.929116 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Mar 25 01:25:51.929189 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Mar 25 01:25:51.929252 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Mar 25 01:25:51.929311 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Mar 25 01:25:51.929381 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Mar 25 01:25:51.929441 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Mar 25 01:25:51.929504 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Mar 25 01:25:51.929587 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Mar 25 01:25:51.929652 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Mar 25 01:25:51.929765 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Mar 25 01:25:51.929961 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Mar 25 01:25:51.930092 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Mar 25 01:25:51.930157 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Mar 25 01:25:51.930249 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Mar 25 01:25:51.930311 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Mar 25 01:25:51.930509 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Mar 25 01:25:51.930620 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Mar 25 01:25:51.930711 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Mar 25 01:25:51.930779 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Mar 25 01:25:51.931964 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Mar 25 01:25:51.932052 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Mar 25 01:25:51.932116 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Mar 25 01:25:51.932192 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Mar 25 01:25:51.932264 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Mar 25 01:25:51.932335 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Mar 25 01:25:51.932413 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Mar 25 01:25:51.932478 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Mar 25 01:25:51.932541 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Mar 25 01:25:51.932550 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Mar 25 01:25:51.932558 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Mar 25 01:25:51.932565 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Mar 25 01:25:51.932573 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Mar 25 01:25:51.932583 kernel: iommu: Default domain type: Translated
Mar 25 01:25:51.932590 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Mar 25 01:25:51.932598 kernel: efivars: Registered efivars operations
Mar 25 01:25:51.932605 kernel: vgaarb: loaded
Mar 25 01:25:51.932612 kernel: clocksource: Switched to clocksource arch_sys_counter
Mar 25 01:25:51.932622 kernel: VFS: Disk quotas dquot_6.6.0
Mar 25 01:25:51.932630 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 25 01:25:51.932637 kernel: pnp: PnP ACPI init
Mar 25 01:25:51.932735 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Mar 25 01:25:51.932750 kernel: pnp: PnP ACPI: found 1 devices
Mar 25 01:25:51.932757 kernel: NET: Registered PF_INET protocol family
Mar 25 01:25:51.932765 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Mar 25 01:25:51.932773 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Mar 25 01:25:51.932781 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 25 01:25:51.932788 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 25 01:25:51.932796 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Mar 25 01:25:51.932804 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Mar 25 01:25:51.933866 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 25 01:25:51.933876 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Mar 25 01:25:51.933884 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 25 01:25:51.934027 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Mar 25 01:25:51.934040 kernel: PCI: CLS 0 bytes, default 64
Mar 25 01:25:51.934048 kernel: kvm [1]: HYP mode not available
Mar 25 01:25:51.934055 kernel: Initialise system trusted keyrings
Mar 25 01:25:51.934064 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Mar 25 01:25:51.934072 kernel: Key type asymmetric registered
Mar 25 01:25:51.934083 kernel: Asymmetric key parser 'x509' registered
Mar 25 01:25:51.934090 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
Mar 25 01:25:51.934098 kernel: io scheduler mq-deadline registered
Mar 25 01:25:51.934105 kernel: io scheduler kyber registered
Mar 25 01:25:51.934112 kernel: io scheduler bfq registered
Mar 25 01:25:51.934120 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Mar 25 01:25:51.934237 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Mar 25 01:25:51.934323 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Mar 25 01:25:51.934400 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Mar 25 01:25:51.934482 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Mar 25 01:25:51.934553 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Mar 25 01:25:51.934641 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis-
LLActRep+ Mar 25 01:25:51.934738 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Mar 25 01:25:51.935452 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Mar 25 01:25:51.935572 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:25:51.935647 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Mar 25 01:25:51.935735 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Mar 25 01:25:51.935840 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:25:51.935915 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Mar 25 01:25:51.935984 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Mar 25 01:25:51.936056 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:25:51.936131 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Mar 25 01:25:51.936198 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Mar 25 01:25:51.936265 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:25:51.936335 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Mar 25 01:25:51.936400 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Mar 25 01:25:51.936469 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:25:51.936547 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Mar 25 01:25:51.936619 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Mar 25 01:25:51.936733 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 
01:25:51.936747 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Mar 25 01:25:51.938274 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Mar 25 01:25:51.938392 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Mar 25 01:25:51.938462 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Mar 25 01:25:51.938473 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 25 01:25:51.938480 kernel: ACPI: button: Power Button [PWRB] Mar 25 01:25:51.938489 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 25 01:25:51.938564 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Mar 25 01:25:51.938641 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Mar 25 01:25:51.938652 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 25 01:25:51.938706 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 25 01:25:51.938797 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Mar 25 01:25:51.938833 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Mar 25 01:25:51.938841 kernel: thunder_xcv, ver 1.0 Mar 25 01:25:51.938848 kernel: thunder_bgx, ver 1.0 Mar 25 01:25:51.938855 kernel: nicpf, ver 1.0 Mar 25 01:25:51.938863 kernel: nicvf, ver 1.0 Mar 25 01:25:51.938952 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 25 01:25:51.939019 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-25T01:25:51 UTC (1742865951) Mar 25 01:25:51.939032 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 25 01:25:51.939040 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available Mar 25 01:25:51.939047 kernel: watchdog: Delayed init of the lockup detector failed: -19 Mar 25 01:25:51.939055 kernel: watchdog: Hard watchdog permanently disabled Mar 25 01:25:51.939062 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:25:51.939069 kernel: Segment 
Routing with IPv6 Mar 25 01:25:51.939077 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:25:51.939084 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:25:51.939093 kernel: Key type dns_resolver registered Mar 25 01:25:51.939101 kernel: registered taskstats version 1 Mar 25 01:25:51.939108 kernel: Loading compiled-in X.509 certificates Mar 25 01:25:51.939115 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: ed4ababe871f0afac8b4236504477de11a6baf07' Mar 25 01:25:51.939123 kernel: Key type .fscrypt registered Mar 25 01:25:51.939130 kernel: Key type fscrypt-provisioning registered Mar 25 01:25:51.939137 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 25 01:25:51.939145 kernel: ima: Allocated hash algorithm: sha1 Mar 25 01:25:51.939152 kernel: ima: No architecture policies found Mar 25 01:25:51.939161 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 25 01:25:51.939169 kernel: clk: Disabling unused clocks Mar 25 01:25:51.939176 kernel: Freeing unused kernel memory: 38464K Mar 25 01:25:51.939183 kernel: Run /init as init process Mar 25 01:25:51.939191 kernel: with arguments: Mar 25 01:25:51.939199 kernel: /init Mar 25 01:25:51.939206 kernel: with environment: Mar 25 01:25:51.939213 kernel: HOME=/ Mar 25 01:25:51.939220 kernel: TERM=linux Mar 25 01:25:51.939228 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 25 01:25:51.939237 systemd[1]: Successfully made /usr/ read-only. Mar 25 01:25:51.939254 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:25:51.939263 systemd[1]: Detected virtualization kvm. Mar 25 01:25:51.939271 systemd[1]: Detected architecture arm64. 
Mar 25 01:25:51.939278 systemd[1]: Running in initrd. Mar 25 01:25:51.939286 systemd[1]: No hostname configured, using default hostname. Mar 25 01:25:51.939296 systemd[1]: Hostname set to . Mar 25 01:25:51.939304 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:25:51.939313 systemd[1]: Queued start job for default target initrd.target. Mar 25 01:25:51.939321 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:25:51.939337 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:25:51.939346 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Mar 25 01:25:51.939358 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:25:51.939367 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Mar 25 01:25:51.939377 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Mar 25 01:25:51.939386 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Mar 25 01:25:51.939394 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Mar 25 01:25:51.939402 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:25:51.939410 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:25:51.939418 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:25:51.939426 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:25:51.939436 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:25:51.939443 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:25:51.939452 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Mar 25 01:25:51.939460 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:25:51.939467 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Mar 25 01:25:51.939475 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Mar 25 01:25:51.939483 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:25:51.939491 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:25:51.939499 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:25:51.939509 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:25:51.939517 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Mar 25 01:25:51.939525 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:25:51.939533 systemd[1]: Finished network-cleanup.service - Network Cleanup. Mar 25 01:25:51.939541 systemd[1]: Starting systemd-fsck-usr.service... Mar 25 01:25:51.939549 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:25:51.939557 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:25:51.939565 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:25:51.939573 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Mar 25 01:25:51.939583 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:25:51.939591 systemd[1]: Finished systemd-fsck-usr.service. Mar 25 01:25:51.939599 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Mar 25 01:25:51.939636 systemd-journald[236]: Collecting audit messages is disabled. Mar 25 01:25:51.939669 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Mar 25 01:25:51.939678 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 25 01:25:51.939686 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Mar 25 01:25:51.939694 kernel: Bridge firewalling registered Mar 25 01:25:51.939704 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:25:51.939713 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:25:51.939720 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:25:51.939728 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:25:51.939738 systemd-journald[236]: Journal started Mar 25 01:25:51.939757 systemd-journald[236]: Runtime Journal (/run/log/journal/fe52c5d8db4246368d8c198ae6d8f097) is 8M, max 76.6M, 68.6M free. Mar 25 01:25:51.893647 systemd-modules-load[238]: Inserted module 'overlay' Mar 25 01:25:51.922926 systemd-modules-load[238]: Inserted module 'br_netfilter' Mar 25 01:25:51.949039 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:25:51.955213 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:25:51.957974 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:25:51.960414 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:25:51.962618 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Mar 25 01:25:51.965096 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:25:51.987096 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
Mar 25 01:25:51.990971 dracut-cmdline[272]: dracut-dracut-053 Mar 25 01:25:51.992470 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:25:51.998881 dracut-cmdline[272]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=b84e5f613acd6cd0a8a878f32f5653a14f2e6fb2820997fecd5b2bd33a4ba3ab Mar 25 01:25:52.042109 systemd-resolved[283]: Positive Trust Anchors: Mar 25 01:25:52.042909 systemd-resolved[283]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:25:52.042944 systemd-resolved[283]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:25:52.055287 systemd-resolved[283]: Defaulting to hostname 'linux'. Mar 25 01:25:52.056488 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:25:52.057870 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:25:52.113841 kernel: SCSI subsystem initialized Mar 25 01:25:52.118874 kernel: Loading iSCSI transport class v2.0-870. 
Mar 25 01:25:52.127864 kernel: iscsi: registered transport (tcp) Mar 25 01:25:52.141906 kernel: iscsi: registered transport (qla4xxx) Mar 25 01:25:52.142037 kernel: QLogic iSCSI HBA Driver Mar 25 01:25:52.201111 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Mar 25 01:25:52.203988 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Mar 25 01:25:52.246052 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 25 01:25:52.246129 kernel: device-mapper: uevent: version 1.0.3 Mar 25 01:25:52.246141 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Mar 25 01:25:52.298194 kernel: raid6: neonx8 gen() 15333 MB/s Mar 25 01:25:52.314939 kernel: raid6: neonx4 gen() 12941 MB/s Mar 25 01:25:52.331877 kernel: raid6: neonx2 gen() 12864 MB/s Mar 25 01:25:52.348892 kernel: raid6: neonx1 gen() 10105 MB/s Mar 25 01:25:52.365864 kernel: raid6: int64x8 gen() 6707 MB/s Mar 25 01:25:52.382874 kernel: raid6: int64x4 gen() 7271 MB/s Mar 25 01:25:52.399882 kernel: raid6: int64x2 gen() 6002 MB/s Mar 25 01:25:52.416897 kernel: raid6: int64x1 gen() 4971 MB/s Mar 25 01:25:52.417004 kernel: raid6: using algorithm neonx8 gen() 15333 MB/s Mar 25 01:25:52.433887 kernel: raid6: .... xor() 11765 MB/s, rmw enabled Mar 25 01:25:52.433985 kernel: raid6: using neon recovery algorithm Mar 25 01:25:52.439092 kernel: xor: measuring software checksum speed Mar 25 01:25:52.439164 kernel: 8regs : 20613 MB/sec Mar 25 01:25:52.439898 kernel: 32regs : 21687 MB/sec Mar 25 01:25:52.439944 kernel: arm64_neon : 27043 MB/sec Mar 25 01:25:52.439963 kernel: xor: using function: arm64_neon (27043 MB/sec) Mar 25 01:25:52.492854 kernel: Btrfs loaded, zoned=no, fsverity=no Mar 25 01:25:52.507928 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. 
Mar 25 01:25:52.510887 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:25:52.543993 systemd-udevd[458]: Using default interface naming scheme 'v255'. Mar 25 01:25:52.548257 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:25:52.553555 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Mar 25 01:25:52.588242 dracut-pre-trigger[465]: rd.md=0: removing MD RAID activation Mar 25 01:25:52.631513 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:25:52.635939 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:25:52.705679 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:25:52.713719 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Mar 25 01:25:52.748981 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Mar 25 01:25:52.752544 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:25:52.754489 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:25:52.755516 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:25:52.759440 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Mar 25 01:25:52.780801 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:25:52.845898 kernel: ACPI: bus type USB registered Mar 25 01:25:52.845957 kernel: usbcore: registered new interface driver usbfs Mar 25 01:25:52.849329 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:25:52.849460 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:25:52.850393 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 25 01:25:52.856913 kernel: usbcore: registered new interface driver hub Mar 25 01:25:52.856950 kernel: usbcore: registered new device driver usb Mar 25 01:25:52.851923 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:25:52.852096 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:25:52.856566 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:25:52.858713 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:25:52.872831 kernel: scsi host0: Virtio SCSI HBA Mar 25 01:25:52.877275 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 25 01:25:52.877359 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Mar 25 01:25:52.878124 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:25:52.888126 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:25:52.890619 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Mar 25 01:25:52.903058 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 25 01:25:52.916024 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Mar 25 01:25:52.916142 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Mar 25 01:25:52.916231 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Mar 25 01:25:52.916333 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Mar 25 01:25:52.916421 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Mar 25 01:25:52.916504 kernel: sr 0:0:0:0: Power-on or device reset occurred Mar 25 01:25:52.917920 kernel: hub 1-0:1.0: USB hub found Mar 25 01:25:52.918104 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Mar 25 01:25:52.918209 kernel: hub 1-0:1.0: 4 ports detected Mar 25 01:25:52.918391 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 25 01:25:52.918405 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Mar 25 01:25:52.918530 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Mar 25 01:25:52.918640 kernel: hub 2-0:1.0: USB hub found Mar 25 01:25:52.918770 kernel: hub 2-0:1.0: 4 ports detected Mar 25 01:25:52.929578 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:25:52.934973 kernel: sd 0:0:0:1: Power-on or device reset occurred Mar 25 01:25:52.942716 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Mar 25 01:25:52.943248 kernel: sd 0:0:0:1: [sda] Write Protect is off Mar 25 01:25:52.943368 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Mar 25 01:25:52.943458 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Mar 25 01:25:52.943537 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. 
Mar 25 01:25:52.943547 kernel: GPT:17805311 != 80003071 Mar 25 01:25:52.943564 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 25 01:25:52.943574 kernel: GPT:17805311 != 80003071 Mar 25 01:25:52.943582 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 25 01:25:52.943591 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 25 01:25:52.943600 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Mar 25 01:25:52.994343 kernel: BTRFS: device fsid bf348154-9cb1-474d-801c-0e035a5758cf devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (522) Mar 25 01:25:52.999835 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (504) Mar 25 01:25:53.006488 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Mar 25 01:25:53.030902 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Mar 25 01:25:53.041440 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 25 01:25:53.048939 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Mar 25 01:25:53.049564 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Mar 25 01:25:53.052385 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Mar 25 01:25:53.074492 disk-uuid[575]: Primary Header is updated. Mar 25 01:25:53.074492 disk-uuid[575]: Secondary Entries is updated. Mar 25 01:25:53.074492 disk-uuid[575]: Secondary Header is updated. 
Mar 25 01:25:53.082858 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 25 01:25:53.158999 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Mar 25 01:25:53.404870 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Mar 25 01:25:53.542919 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Mar 25 01:25:53.542988 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Mar 25 01:25:53.544453 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Mar 25 01:25:53.600552 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Mar 25 01:25:53.601161 kernel: usbcore: registered new interface driver usbhid Mar 25 01:25:53.601181 kernel: usbhid: USB HID core driver Mar 25 01:25:54.093915 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 25 01:25:54.094511 disk-uuid[576]: The operation has completed successfully. Mar 25 01:25:54.153323 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 25 01:25:54.153433 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Mar 25 01:25:54.181382 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Mar 25 01:25:54.199078 sh[591]: Success Mar 25 01:25:54.213862 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 25 01:25:54.273739 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Mar 25 01:25:54.277915 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Mar 25 01:25:54.295912 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
Mar 25 01:25:54.310922 kernel: BTRFS info (device dm-0): first mount of filesystem bf348154-9cb1-474d-801c-0e035a5758cf Mar 25 01:25:54.311006 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:25:54.311031 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Mar 25 01:25:54.311055 kernel: BTRFS info (device dm-0): disabling log replay at mount time Mar 25 01:25:54.311858 kernel: BTRFS info (device dm-0): using free space tree Mar 25 01:25:54.318879 kernel: BTRFS info (device dm-0): enabling ssd optimizations Mar 25 01:25:54.321256 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Mar 25 01:25:54.323071 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Mar 25 01:25:54.325944 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Mar 25 01:25:54.329961 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Mar 25 01:25:54.357886 kernel: BTRFS info (device sda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:25:54.357956 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:25:54.358932 kernel: BTRFS info (device sda6): using free space tree Mar 25 01:25:54.362847 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 25 01:25:54.362923 kernel: BTRFS info (device sda6): auto enabling async discard Mar 25 01:25:54.368858 kernel: BTRFS info (device sda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:25:54.371607 systemd[1]: Finished ignition-setup.service - Ignition (setup). Mar 25 01:25:54.377051 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Mar 25 01:25:54.451361 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Mar 25 01:25:54.455300 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:25:54.487995 ignition[688]: Ignition 2.20.0 Mar 25 01:25:54.488006 ignition[688]: Stage: fetch-offline Mar 25 01:25:54.488045 ignition[688]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:25:54.488053 ignition[688]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:25:54.488213 ignition[688]: parsed url from cmdline: "" Mar 25 01:25:54.488217 ignition[688]: no config URL provided Mar 25 01:25:54.488221 ignition[688]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:25:54.488229 ignition[688]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:25:54.488234 ignition[688]: failed to fetch config: resource requires networking Mar 25 01:25:54.489162 ignition[688]: Ignition finished successfully Mar 25 01:25:54.496145 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:25:54.497257 systemd-networkd[774]: lo: Link UP Mar 25 01:25:54.497261 systemd-networkd[774]: lo: Gained carrier Mar 25 01:25:54.499414 systemd-networkd[774]: Enumeration completed Mar 25 01:25:54.499804 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:25:54.500476 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:54.500480 systemd-networkd[774]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:25:54.501279 systemd-networkd[774]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:54.501283 systemd-networkd[774]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Mar 25 01:25:54.501972 systemd-networkd[774]: eth0: Link UP Mar 25 01:25:54.501975 systemd-networkd[774]: eth0: Gained carrier Mar 25 01:25:54.501983 systemd-networkd[774]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:54.504127 systemd[1]: Reached target network.target - Network. Mar 25 01:25:54.507015 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Mar 25 01:25:54.510209 systemd-networkd[774]: eth1: Link UP Mar 25 01:25:54.510213 systemd-networkd[774]: eth1: Gained carrier Mar 25 01:25:54.510224 systemd-networkd[774]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:54.529995 ignition[779]: Ignition 2.20.0 Mar 25 01:25:54.530006 ignition[779]: Stage: fetch Mar 25 01:25:54.530237 ignition[779]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:25:54.530248 ignition[779]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:25:54.530365 ignition[779]: parsed url from cmdline: "" Mar 25 01:25:54.530369 ignition[779]: no config URL provided Mar 25 01:25:54.530375 ignition[779]: reading system config file "/usr/lib/ignition/user.ign" Mar 25 01:25:54.530384 ignition[779]: no config at "/usr/lib/ignition/user.ign" Mar 25 01:25:54.530578 ignition[779]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Mar 25 01:25:54.531559 ignition[779]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Mar 25 01:25:54.539905 systemd-networkd[774]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 25 01:25:54.567942 systemd-networkd[774]: eth0: DHCPv4 address 78.46.211.139/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 25 01:25:54.731750 ignition[779]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Mar 25 01:25:54.738747 ignition[779]: GET 
result: OK Mar 25 01:25:54.738964 ignition[779]: parsing config with SHA512: fecc0030357f7bfb5e1d705b43a34d40708fe024faa50eee4f8ac4c9bcf9a1fa6ea447e60055ee1c48ad74da868cf693fc83cbe4d23ee0c98e3b5a53c7a65d65 Mar 25 01:25:54.746706 unknown[779]: fetched base config from "system" Mar 25 01:25:54.746721 unknown[779]: fetched base config from "system" Mar 25 01:25:54.746727 unknown[779]: fetched user config from "hetzner" Mar 25 01:25:54.748272 ignition[779]: fetch: fetch complete Mar 25 01:25:54.748279 ignition[779]: fetch: fetch passed Mar 25 01:25:54.748351 ignition[779]: Ignition finished successfully Mar 25 01:25:54.750361 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Mar 25 01:25:54.752584 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Mar 25 01:25:54.778599 ignition[787]: Ignition 2.20.0 Mar 25 01:25:54.778610 ignition[787]: Stage: kargs Mar 25 01:25:54.778796 ignition[787]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:25:54.778824 ignition[787]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:25:54.782001 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Mar 25 01:25:54.779751 ignition[787]: kargs: kargs passed Mar 25 01:25:54.779838 ignition[787]: Ignition finished successfully Mar 25 01:25:54.783968 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Mar 25 01:25:54.808931 ignition[794]: Ignition 2.20.0 Mar 25 01:25:54.808952 ignition[794]: Stage: disks Mar 25 01:25:54.809401 ignition[794]: no configs at "/usr/lib/ignition/base.d" Mar 25 01:25:54.809426 ignition[794]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:25:54.811938 ignition[794]: disks: disks passed Mar 25 01:25:54.812056 ignition[794]: Ignition finished successfully Mar 25 01:25:54.813260 systemd[1]: Finished ignition-disks.service - Ignition (disks). Mar 25 01:25:54.815132 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. 
Mar 25 01:25:54.816556 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 25 01:25:54.817310 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:25:54.818400 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:25:54.819425 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:25:54.821519 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 25 01:25:54.855380 systemd-fsck[803]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 25 01:25:54.861504 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 25 01:25:54.864067 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 25 01:25:54.921838 kernel: EXT4-fs (sda9): mounted filesystem a7a89271-ee7d-4bda-a834-705261d6cda9 r/w with ordered data mode. Quota mode: none. Mar 25 01:25:54.923154 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 25 01:25:54.925032 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 25 01:25:54.928564 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:25:54.932957 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 25 01:25:54.941287 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 25 01:25:54.945460 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 25 01:25:54.946927 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:25:54.951151 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 25 01:25:54.953951 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Mar 25 01:25:54.961170 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (811) Mar 25 01:25:54.963865 kernel: BTRFS info (device sda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:25:54.963926 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:25:54.963937 kernel: BTRFS info (device sda6): using free space tree Mar 25 01:25:54.967153 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 25 01:25:54.967215 kernel: BTRFS info (device sda6): auto enabling async discard Mar 25 01:25:54.971218 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:25:55.025019 initrd-setup-root[838]: cut: /sysroot/etc/passwd: No such file or directory Mar 25 01:25:55.026843 coreos-metadata[813]: Mar 25 01:25:55.024 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Mar 25 01:25:55.028749 coreos-metadata[813]: Mar 25 01:25:55.028 INFO Fetch successful Mar 25 01:25:55.028749 coreos-metadata[813]: Mar 25 01:25:55.028 INFO wrote hostname ci-4284-0-0-6-22e9b0bb97 to /sysroot/etc/hostname Mar 25 01:25:55.030533 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 25 01:25:55.035785 initrd-setup-root[846]: cut: /sysroot/etc/group: No such file or directory Mar 25 01:25:55.041945 initrd-setup-root[853]: cut: /sysroot/etc/shadow: No such file or directory Mar 25 01:25:55.047938 initrd-setup-root[860]: cut: /sysroot/etc/gshadow: No such file or directory Mar 25 01:25:55.155990 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 25 01:25:55.159046 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 25 01:25:55.161032 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Mar 25 01:25:55.178837 kernel: BTRFS info (device sda6): last unmount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:25:55.194073 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 25 01:25:55.207868 ignition[929]: INFO : Ignition 2.20.0 Mar 25 01:25:55.207868 ignition[929]: INFO : Stage: mount Mar 25 01:25:55.207868 ignition[929]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:25:55.207868 ignition[929]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:25:55.211150 ignition[929]: INFO : mount: mount passed Mar 25 01:25:55.211150 ignition[929]: INFO : Ignition finished successfully Mar 25 01:25:55.211077 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 25 01:25:55.213957 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 25 01:25:55.309081 systemd[1]: sysroot-oem.mount: Deactivated successfully. Mar 25 01:25:55.311583 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:25:55.338211 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (940) Mar 25 01:25:55.338288 kernel: BTRFS info (device sda6): first mount of filesystem 09629b08-d05c-4ce3-8bf7-615041c4b2c9 Mar 25 01:25:55.338314 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Mar 25 01:25:55.338984 kernel: BTRFS info (device sda6): using free space tree Mar 25 01:25:55.341853 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 25 01:25:55.341916 kernel: BTRFS info (device sda6): auto enabling async discard Mar 25 01:25:55.345491 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 25 01:25:55.381725 ignition[958]: INFO : Ignition 2.20.0 Mar 25 01:25:55.381725 ignition[958]: INFO : Stage: files Mar 25 01:25:55.383658 ignition[958]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:25:55.383658 ignition[958]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:25:55.383658 ignition[958]: DEBUG : files: compiled without relabeling support, skipping Mar 25 01:25:55.386567 ignition[958]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 25 01:25:55.386567 ignition[958]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 25 01:25:55.389031 ignition[958]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 25 01:25:55.389975 ignition[958]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 25 01:25:55.389975 ignition[958]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 25 01:25:55.389467 unknown[958]: wrote ssh authorized keys file for user: core Mar 25 01:25:55.392292 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 25 01:25:55.392292 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Mar 25 01:25:55.521342 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Mar 25 01:25:55.769748 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 25 01:25:55.769748 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Mar 25 
01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:25:55.773507 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:25:55.785781 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-arm64.raw: attempt #1 Mar 25 01:25:56.130056 systemd-networkd[774]: eth1: Gained IPv6LL Mar 25 01:25:56.322344 systemd-networkd[774]: eth0: Gained IPv6LL Mar 25 01:25:56.337779 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Mar 25 01:25:56.552569 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" Mar 25 01:25:56.552569 ignition[958]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Mar 25 01:25:56.555957 ignition[958]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Mar 25 01:25:56.557890 ignition[958]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Mar 25 01:25:56.557890 
ignition[958]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:25:56.573781 ignition[958]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 25 01:25:56.573781 ignition[958]: INFO : files: files passed Mar 25 01:25:56.573781 ignition[958]: INFO : Ignition finished successfully Mar 25 01:25:56.560179 systemd[1]: Finished ignition-files.service - Ignition (files). Mar 25 01:25:56.564020 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Mar 25 01:25:56.567049 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Mar 25 01:25:56.584433 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 25 01:25:56.584893 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Mar 25 01:25:56.591947 initrd-setup-root-after-ignition[986]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:25:56.591947 initrd-setup-root-after-ignition[986]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:25:56.597474 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 25 01:25:56.598113 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:25:56.600478 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Mar 25 01:25:56.603900 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Mar 25 01:25:56.652081 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 25 01:25:56.652226 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Mar 25 01:25:56.653852 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Mar 25 01:25:56.654783 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 01:25:56.655917 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 01:25:56.657057 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 01:25:56.685175 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:25:56.689866 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:25:56.714504 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:25:56.715385 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:25:56.716455 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:25:56.717453 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:25:56.717617 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:25:56.718965 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:25:56.719539 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:25:56.720656 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:25:56.721681 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:25:56.722662 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:25:56.723696 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:25:56.724732 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:25:56.725901 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:25:56.726864 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:25:56.727912 systemd[1]: Stopped target swap.target - Swaps. 
Mar 25 01:25:56.728764 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:25:56.728921 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:25:56.730121 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:25:56.730731 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:25:56.732620 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:25:56.733151 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:25:56.734561 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:25:56.734703 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:25:56.736108 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:25:56.736230 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:25:56.737521 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:25:56.737641 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:25:56.738455 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 25 01:25:56.738556 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 25 01:25:56.742016 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 01:25:56.742653 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:25:56.742797 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:25:56.747131 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:25:56.747912 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:25:56.748062 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. 
Mar 25 01:25:56.749259 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:25:56.749358 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Mar 25 01:25:56.758191 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:25:56.758877 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:25:56.772331 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:25:56.774784 ignition[1010]: INFO : Ignition 2.20.0 Mar 25 01:25:56.776684 ignition[1010]: INFO : Stage: umount Mar 25 01:25:56.776684 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:25:56.776684 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:25:56.776684 ignition[1010]: INFO : umount: umount passed Mar 25 01:25:56.776684 ignition[1010]: INFO : Ignition finished successfully Mar 25 01:25:56.776014 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:25:56.776134 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:25:56.777924 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:25:56.778025 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:25:56.779784 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:25:56.780237 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:25:56.781258 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:25:56.781309 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:25:56.782079 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 25 01:25:56.782116 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 25 01:25:56.782976 systemd[1]: Stopped target network.target - Network. Mar 25 01:25:56.783744 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. 
Mar 25 01:25:56.783802 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:25:56.784749 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:25:56.785518 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:25:56.790932 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:25:56.792998 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:25:56.793994 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:25:56.794945 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:25:56.794996 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:25:56.795836 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:25:56.795870 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:25:56.796727 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:25:56.796783 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:25:56.798320 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:25:56.798398 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:25:56.799686 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:25:56.799757 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:25:56.801283 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:25:56.802382 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:25:56.808051 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:25:56.808169 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:25:56.812123 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. 
Mar 25 01:25:56.813145 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:25:56.813289 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:25:56.816422 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:25:56.816744 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:25:56.816898 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:25:56.819182 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:25:56.820146 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:25:56.820211 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:25:56.822423 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:25:56.823097 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:25:56.823159 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:25:56.823901 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:25:56.823949 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:25:56.825374 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:25:56.825419 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:25:56.826531 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:25:56.832486 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:25:56.843148 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:25:56.843359 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:25:56.844827 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
Mar 25 01:25:56.844885 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Mar 25 01:25:56.845772 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:25:56.845849 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:25:56.848236 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:25:56.848313 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:25:56.850007 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:25:56.850074 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:25:56.851526 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:25:56.851616 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:25:56.854375 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:25:56.855404 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 01:25:56.855468 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:25:56.856302 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:25:56.856351 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:25:56.875663 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:25:56.875793 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:25:56.881116 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:25:56.881236 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:25:56.882483 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:25:56.885259 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:25:56.909176 systemd[1]: Switching root. 
Mar 25 01:25:56.951714 systemd-journald[236]: Journal stopped Mar 25 01:25:58.082306 systemd-journald[236]: Received SIGTERM from PID 1 (systemd). Mar 25 01:25:58.082396 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:25:58.082409 kernel: SELinux: policy capability open_perms=1 Mar 25 01:25:58.082418 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:25:58.082428 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:25:58.082437 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:25:58.082447 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:25:58.082460 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:25:58.082470 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:25:58.082479 kernel: audit: type=1403 audit(1742865957.097:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:25:58.082492 systemd[1]: Successfully loaded SELinux policy in 41.085ms. Mar 25 01:25:58.082512 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.883ms. Mar 25 01:25:58.082523 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:25:58.082534 systemd[1]: Detected virtualization kvm. Mar 25 01:25:58.082544 systemd[1]: Detected architecture arm64. Mar 25 01:25:58.082554 systemd[1]: Detected first boot. Mar 25 01:25:58.082580 systemd[1]: Hostname set to . Mar 25 01:25:58.082592 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:25:58.082603 zram_generator::config[1055]: No configuration found. Mar 25 01:25:58.082614 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:25:58.082678 systemd[1]: Populated /etc with preset unit settings. 
Mar 25 01:25:58.082693 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:25:58.082704 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:25:58.082714 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:25:58.082725 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:25:58.082741 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:25:58.082753 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:25:58.082764 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:25:58.082774 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:25:58.082784 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:25:58.082795 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 01:25:58.084295 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:25:58.084338 systemd[1]: Created slice user.slice - User and Session Slice. Mar 25 01:25:58.084349 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:25:58.084360 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:25:58.084376 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:25:58.084386 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:25:58.084397 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:25:58.084407 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Mar 25 01:25:58.084417 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Mar 25 01:25:58.084428 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:25:58.084440 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:25:58.084451 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:25:58.084461 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:25:58.084476 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:25:58.084486 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:25:58.084496 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:25:58.084507 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:25:58.084517 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:25:58.084527 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:25:58.084538 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:25:58.084548 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Mar 25 01:25:58.084573 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:25:58.084588 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:25:58.084598 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:25:58.084608 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:25:58.084618 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:25:58.084629 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:25:58.084647 systemd[1]: Mounting media.mount - External Media Directory... 
Mar 25 01:25:58.084659 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:25:58.084670 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:25:58.084680 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:25:58.084690 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:25:58.084700 systemd[1]: Reached target machines.target - Containers. Mar 25 01:25:58.084712 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 01:25:58.084722 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:25:58.084732 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:25:58.084743 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:25:58.084753 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:25:58.084764 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:25:58.084775 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:25:58.084785 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Mar 25 01:25:58.084795 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:25:58.086854 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 25 01:25:58.086894 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Mar 25 01:25:58.086906 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Mar 25 01:25:58.086917 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. 
Mar 25 01:25:58.086927 systemd[1]: Stopped systemd-fsck-usr.service. Mar 25 01:25:58.086938 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:25:58.086949 systemd[1]: Starting systemd-journald.service - Journal Service... Mar 25 01:25:58.086959 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Mar 25 01:25:58.086977 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Mar 25 01:25:58.086987 kernel: fuse: init (API version 7.39) Mar 25 01:25:58.086998 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Mar 25 01:25:58.087008 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Mar 25 01:25:58.087018 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Mar 25 01:25:58.087030 systemd[1]: verity-setup.service: Deactivated successfully. Mar 25 01:25:58.087041 systemd[1]: Stopped verity-setup.service. Mar 25 01:25:58.087051 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Mar 25 01:25:58.087061 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Mar 25 01:25:58.087071 systemd[1]: Mounted media.mount - External Media Directory. Mar 25 01:25:58.087081 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Mar 25 01:25:58.087091 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Mar 25 01:25:58.087102 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Mar 25 01:25:58.087114 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:25:58.087125 kernel: ACPI: bus type drm_connector registered Mar 25 01:25:58.087135 systemd[1]: modprobe@configfs.service: Deactivated successfully. 
Mar 25 01:25:58.087145 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Mar 25 01:25:58.087156 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:25:58.087166 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:25:58.087178 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:25:58.087188 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:25:58.087198 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:25:58.087209 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:25:58.087219 kernel: loop: module loaded Mar 25 01:25:58.087228 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 25 01:25:58.087238 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Mar 25 01:25:58.087248 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:25:58.087259 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:25:58.087271 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Mar 25 01:25:58.087281 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Mar 25 01:25:58.087291 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Mar 25 01:25:58.087304 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Mar 25 01:25:58.087314 systemd[1]: Reached target network-pre.target - Preparation for Network. Mar 25 01:25:58.087325 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Mar 25 01:25:58.087369 systemd-journald[1126]: Collecting audit messages is disabled. Mar 25 01:25:58.087400 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Mar 25 01:25:58.087412 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 25 01:25:58.087423 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:25:58.087436 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Mar 25 01:25:58.087446 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Mar 25 01:25:58.087458 systemd-journald[1126]: Journal started Mar 25 01:25:58.087480 systemd-journald[1126]: Runtime Journal (/run/log/journal/fe52c5d8db4246368d8c198ae6d8f097) is 8M, max 76.6M, 68.6M free. Mar 25 01:25:57.737981 systemd[1]: Queued start job for default target multi-user.target. Mar 25 01:25:57.753080 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Mar 25 01:25:57.754209 systemd[1]: systemd-journald.service: Deactivated successfully. Mar 25 01:25:58.094349 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Mar 25 01:25:58.094420 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:25:58.099231 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Mar 25 01:25:58.099299 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:25:58.105275 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Mar 25 01:25:58.106948 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:25:58.127458 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Mar 25 01:25:58.132832 systemd[1]: Starting systemd-sysusers.service - Create System Users... 
Mar 25 01:25:58.136028 systemd[1]: Started systemd-journald.service - Journal Service. Mar 25 01:25:58.140946 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Mar 25 01:25:58.143441 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Mar 25 01:25:58.145269 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Mar 25 01:25:58.146641 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Mar 25 01:25:58.149312 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Mar 25 01:25:58.165525 kernel: loop0: detected capacity change from 0 to 8 Mar 25 01:25:58.185073 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 25 01:25:58.187584 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Mar 25 01:25:58.195521 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Mar 25 01:25:58.202079 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Mar 25 01:25:58.209115 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Mar 25 01:25:58.226301 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:25:58.228934 kernel: loop1: detected capacity change from 0 to 189592 Mar 25 01:25:58.235160 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Mar 25 01:25:58.250994 systemd-journald[1126]: Time spent on flushing to /var/log/journal/fe52c5d8db4246368d8c198ae6d8f097 is 47.722ms for 1145 entries. Mar 25 01:25:58.250994 systemd-journald[1126]: System Journal (/var/log/journal/fe52c5d8db4246368d8c198ae6d8f097) is 8M, max 584.8M, 576.8M free. Mar 25 01:25:58.309391 systemd-journald[1126]: Received client request to flush runtime journal. 
Mar 25 01:25:58.309439 kernel: loop2: detected capacity change from 0 to 126448 Mar 25 01:25:58.267306 systemd[1]: Finished systemd-sysusers.service - Create System Users. Mar 25 01:25:58.279624 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Mar 25 01:25:58.315161 udevadm[1186]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 25 01:25:58.321666 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Mar 25 01:25:58.328730 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:25:58.330129 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Mar 25 01:25:58.362419 systemd-tmpfiles[1190]: ACLs are not supported, ignoring. Mar 25 01:25:58.362441 systemd-tmpfiles[1190]: ACLs are not supported, ignoring. Mar 25 01:25:58.376154 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:25:58.385037 kernel: loop3: detected capacity change from 0 to 103832 Mar 25 01:25:58.436193 kernel: loop4: detected capacity change from 0 to 8 Mar 25 01:25:58.441090 kernel: loop5: detected capacity change from 0 to 189592 Mar 25 01:25:58.473868 kernel: loop6: detected capacity change from 0 to 126448 Mar 25 01:25:58.494135 kernel: loop7: detected capacity change from 0 to 103832 Mar 25 01:25:58.515147 (sd-merge)[1201]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Mar 25 01:25:58.517637 (sd-merge)[1201]: Merged extensions into '/usr'. Mar 25 01:25:58.525643 systemd[1]: Reload requested from client PID 1155 ('systemd-sysext') (unit systemd-sysext.service)... Mar 25 01:25:58.525665 systemd[1]: Reloading... Mar 25 01:25:58.640832 zram_generator::config[1225]: No configuration found. 
Mar 25 01:25:58.662675 ldconfig[1152]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 25 01:25:58.797826 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:25:58.861253 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 25 01:25:58.861740 systemd[1]: Reloading finished in 335 ms. Mar 25 01:25:58.879322 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Mar 25 01:25:58.884559 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Mar 25 01:25:58.898088 systemd[1]: Starting ensure-sysext.service... Mar 25 01:25:58.900688 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Mar 25 01:25:58.930364 systemd-tmpfiles[1267]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 25 01:25:58.930607 systemd-tmpfiles[1267]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Mar 25 01:25:58.931393 systemd-tmpfiles[1267]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 25 01:25:58.931712 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Mar 25 01:25:58.931770 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Mar 25 01:25:58.935245 systemd-tmpfiles[1267]: Detected autofs mount point /boot during canonicalization of boot. Mar 25 01:25:58.935258 systemd-tmpfiles[1267]: Skipping /boot Mar 25 01:25:58.944043 systemd[1]: Reload requested from client PID 1266 ('systemctl') (unit ensure-sysext.service)... Mar 25 01:25:58.944067 systemd[1]: Reloading... Mar 25 01:25:58.945375 systemd-tmpfiles[1267]: Detected autofs mount point /boot during canonicalization of boot. 
Mar 25 01:25:58.945391 systemd-tmpfiles[1267]: Skipping /boot Mar 25 01:25:59.031838 zram_generator::config[1296]: No configuration found. Mar 25 01:25:59.146063 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:25:59.208454 systemd[1]: Reloading finished in 263 ms. Mar 25 01:25:59.223979 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Mar 25 01:25:59.235850 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:25:59.247486 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:25:59.254191 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Mar 25 01:25:59.264950 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Mar 25 01:25:59.270947 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Mar 25 01:25:59.275948 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:25:59.279684 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Mar 25 01:25:59.283638 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:25:59.288903 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:25:59.297407 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:25:59.301753 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:25:59.302778 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Mar 25 01:25:59.302937 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:25:59.313186 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Mar 25 01:25:59.319499 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:25:59.319771 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:25:59.323103 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:25:59.324705 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:25:59.329237 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:25:59.329627 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:25:59.332255 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:25:59.332494 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:25:59.342310 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Mar 25 01:25:59.348561 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:25:59.350890 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:25:59.356120 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Mar 25 01:25:59.367522 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
Mar 25 01:25:59.368350 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:25:59.368475 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Mar 25 01:25:59.369945 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Mar 25 01:25:59.371368 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:25:59.371898 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:25:59.384340 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Mar 25 01:25:59.392133 systemd-udevd[1340]: Using default interface naming scheme 'v255'. Mar 25 01:25:59.393908 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 25 01:25:59.395203 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Mar 25 01:25:59.396647 systemd[1]: Finished ensure-sysext.service. Mar 25 01:25:59.398463 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:25:59.398666 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:25:59.407467 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:25:59.412963 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Mar 25 01:25:59.417969 systemd[1]: Starting systemd-update-done.service - Update is Completed... Mar 25 01:25:59.419900 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
Mar 25 01:25:59.420296 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:25:59.420488 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:25:59.421936 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:25:59.424584 augenrules[1379]: No rules Mar 25 01:25:59.427484 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:25:59.427754 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:25:59.449794 systemd[1]: Finished systemd-update-done.service - Update is Completed. Mar 25 01:25:59.451510 systemd[1]: Started systemd-userdbd.service - User Database Manager. Mar 25 01:25:59.454513 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:25:59.460606 systemd[1]: Starting systemd-networkd.service - Network Configuration... Mar 25 01:25:59.580945 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Mar 25 01:25:59.611557 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Mar 25 01:25:59.612321 systemd[1]: Reached target time-set.target - System Time Set. Mar 25 01:25:59.616049 systemd-resolved[1339]: Positive Trust Anchors: Mar 25 01:25:59.616075 systemd-resolved[1339]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 25 01:25:59.616106 systemd-resolved[1339]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Mar 25 01:25:59.617317 systemd-networkd[1395]: lo: Link UP Mar 25 01:25:59.617321 systemd-networkd[1395]: lo: Gained carrier Mar 25 01:25:59.621087 systemd-resolved[1339]: Using system hostname 'ci-4284-0-0-6-22e9b0bb97'. Mar 25 01:25:59.625376 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Mar 25 01:25:59.627033 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:25:59.632802 systemd-networkd[1395]: Enumeration completed Mar 25 01:25:59.633039 systemd[1]: Started systemd-networkd.service - Network Configuration. Mar 25 01:25:59.634671 systemd[1]: Reached target network.target - Network. Mar 25 01:25:59.639356 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Mar 25 01:25:59.643262 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Mar 25 01:25:59.682504 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
Mar 25 01:25:59.712849 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1394) Mar 25 01:25:59.744316 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:59.744328 systemd-networkd[1395]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:25:59.749270 systemd-networkd[1395]: eth0: Link UP Mar 25 01:25:59.749406 systemd-networkd[1395]: eth0: Gained carrier Mar 25 01:25:59.749614 systemd-networkd[1395]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:59.767887 kernel: mousedev: PS/2 mouse device common for all mice Mar 25 01:25:59.772777 systemd-networkd[1395]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:59.773281 systemd-networkd[1395]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 25 01:25:59.774673 systemd-networkd[1395]: eth1: Link UP Mar 25 01:25:59.775273 systemd-networkd[1395]: eth1: Gained carrier Mar 25 01:25:59.775301 systemd-networkd[1395]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Mar 25 01:25:59.793959 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Mar 25 01:25:59.798037 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Mar 25 01:25:59.798966 systemd-networkd[1395]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 25 01:25:59.800617 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection. 
Mar 25 01:25:59.807941 systemd-networkd[1395]: eth0: DHCPv4 address 78.46.211.139/32, gateway 172.31.1.1 acquired from 172.31.1.1 Mar 25 01:25:59.808425 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection. Mar 25 01:25:59.809121 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection. Mar 25 01:25:59.830947 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Mar 25 01:25:59.859234 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Mar 25 01:25:59.859301 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Mar 25 01:25:59.859339 kernel: [drm] features: -context_init Mar 25 01:25:59.858777 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Mar 25 01:25:59.860129 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:25:59.862170 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Mar 25 01:25:59.866123 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Mar 25 01:25:59.869612 kernel: [drm] number of scanouts: 1 Mar 25 01:25:59.869723 kernel: [drm] number of cap sets: 0 Mar 25 01:25:59.869079 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Mar 25 01:25:59.869978 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Mar 25 01:25:59.870021 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Mar 25 01:25:59.870045 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 25 01:25:59.886260 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 Mar 25 01:25:59.887084 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 25 01:25:59.887931 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Mar 25 01:25:59.889409 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 25 01:25:59.890150 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Mar 25 01:25:59.896851 kernel: Console: switching to colour frame buffer device 160x50 Mar 25 01:25:59.903255 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 25 01:25:59.903493 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Mar 25 01:25:59.917362 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Mar 25 01:25:59.921874 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 25 01:25:59.921975 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Mar 25 01:25:59.941800 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Mar 25 01:26:00.029745 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:26:00.084325 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Mar 25 01:26:00.089093 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Mar 25 01:26:00.110939 lvm[1458]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. 
Mar 25 01:26:00.137764 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Mar 25 01:26:00.139965 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:26:00.140981 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:26:00.141952 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Mar 25 01:26:00.143079 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Mar 25 01:26:00.143958 systemd[1]: Started logrotate.timer - Daily rotation of log files. Mar 25 01:26:00.144644 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Mar 25 01:26:00.145365 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Mar 25 01:26:00.146049 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 25 01:26:00.146087 systemd[1]: Reached target paths.target - Path Units. Mar 25 01:26:00.146572 systemd[1]: Reached target timers.target - Timer Units. Mar 25 01:26:00.148961 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Mar 25 01:26:00.151555 systemd[1]: Starting docker.socket - Docker Socket for the API... Mar 25 01:26:00.155663 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Mar 25 01:26:00.156734 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Mar 25 01:26:00.157477 systemd[1]: Reached target ssh-access.target - SSH Access Available. Mar 25 01:26:00.165322 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Mar 25 01:26:00.168304 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. 
Mar 25 01:26:00.170931 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Mar 25 01:26:00.172369 systemd[1]: Listening on docker.socket - Docker Socket for the API. Mar 25 01:26:00.173134 systemd[1]: Reached target sockets.target - Socket Units. Mar 25 01:26:00.173660 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:26:00.174294 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:26:00.174331 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Mar 25 01:26:00.178010 systemd[1]: Starting containerd.service - containerd container runtime... Mar 25 01:26:00.182657 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Mar 25 01:26:00.186046 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Mar 25 01:26:00.191006 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Mar 25 01:26:00.193455 lvm[1462]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 25 01:26:00.196128 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Mar 25 01:26:00.197919 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Mar 25 01:26:00.200780 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Mar 25 01:26:00.208409 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Mar 25 01:26:00.212179 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Mar 25 01:26:00.215844 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Mar 25 01:26:00.218921 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
Mar 25 01:26:00.220683 jq[1466]: false Mar 25 01:26:00.229087 systemd[1]: Starting systemd-logind.service - User Login Management... Mar 25 01:26:00.231896 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 25 01:26:00.232470 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Mar 25 01:26:00.238097 systemd[1]: Starting update-engine.service - Update Engine... Mar 25 01:26:00.249065 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Mar 25 01:26:00.265264 extend-filesystems[1467]: Found loop4 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found loop5 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found loop6 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found loop7 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda1 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda2 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda3 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found usr Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda4 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda6 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda7 Mar 25 01:26:00.265264 extend-filesystems[1467]: Found sda9 Mar 25 01:26:00.265264 extend-filesystems[1467]: Checking size of /dev/sda9 Mar 25 01:26:00.270398 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. 
Mar 25 01:26:00.297111 coreos-metadata[1464]: Mar 25 01:26:00.280 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Mar 25 01:26:00.297111 coreos-metadata[1464]: Mar 25 01:26:00.284 INFO Fetch successful
Mar 25 01:26:00.297111 coreos-metadata[1464]: Mar 25 01:26:00.286 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Mar 25 01:26:00.297111 coreos-metadata[1464]: Mar 25 01:26:00.288 INFO Fetch successful
Mar 25 01:26:00.275206 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 25 01:26:00.299415 dbus-daemon[1465]: [system] SELinux support is enabled
Mar 25 01:26:00.275408 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 25 01:26:00.312862 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 25 01:26:00.317342 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 25 01:26:00.319046 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 25 01:26:00.332442 jq[1478]: true
Mar 25 01:26:00.337421 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 25 01:26:00.339098 tar[1482]: linux-arm64/helm
Mar 25 01:26:00.337474 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 25 01:26:00.338379 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 25 01:26:00.338399 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 25 01:26:00.349680 extend-filesystems[1467]: Resized partition /dev/sda9
Mar 25 01:26:00.359827 extend-filesystems[1508]: resize2fs 1.47.2 (1-Jan-2025)
Mar 25 01:26:00.366147 (ntainerd)[1497]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 25 01:26:00.367257 update_engine[1476]: I20250325 01:26:00.359690 1476 main.cc:92] Flatcar Update Engine starting
Mar 25 01:26:00.371114 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Mar 25 01:26:00.371233 update_engine[1476]: I20250325 01:26:00.371014 1476 update_check_scheduler.cc:74] Next update check in 8m19s
Mar 25 01:26:00.373109 systemd[1]: motdgen.service: Deactivated successfully.
Mar 25 01:26:00.373340 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 25 01:26:00.376761 systemd[1]: Started update-engine.service - Update Engine.
Mar 25 01:26:00.388024 jq[1506]: true
Mar 25 01:26:00.391330 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 25 01:26:00.502874 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1408)
Mar 25 01:26:00.556244 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Mar 25 01:26:00.576352 systemd-logind[1475]: New seat seat0.
Mar 25 01:26:00.584994 bash[1531]: Updated "/home/core/.ssh/authorized_keys"
Mar 25 01:26:00.586986 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 25 01:26:00.588320 systemd-logind[1475]: Watching system buttons on /dev/input/event0 (Power Button)
Mar 25 01:26:00.588349 systemd-logind[1475]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Mar 25 01:26:00.588824 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 25 01:26:00.592403 systemd[1]: Starting sshkeys.service...
Mar 25 01:26:00.598854 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 25 01:26:00.600280 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 25 01:26:00.603196 extend-filesystems[1508]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 25 01:26:00.603196 extend-filesystems[1508]: old_desc_blocks = 1, new_desc_blocks = 5
Mar 25 01:26:00.603196 extend-filesystems[1508]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Mar 25 01:26:00.610098 extend-filesystems[1467]: Resized filesystem in /dev/sda9
Mar 25 01:26:00.610098 extend-filesystems[1467]: Found sr0
Mar 25 01:26:00.605700 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 25 01:26:00.606891 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 25 01:26:00.647078 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 25 01:26:00.652864 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 25 01:26:00.700510 coreos-metadata[1544]: Mar 25 01:26:00.700 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Mar 25 01:26:00.701908 coreos-metadata[1544]: Mar 25 01:26:00.701 INFO Fetch successful
Mar 25 01:26:00.709887 unknown[1544]: wrote ssh authorized keys file for user: core
Mar 25 01:26:00.760621 update-ssh-keys[1551]: Updated "/home/core/.ssh/authorized_keys"
Mar 25 01:26:00.761629 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 25 01:26:00.768851 systemd[1]: Finished sshkeys.service.
Mar 25 01:26:00.800078 locksmithd[1511]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 25 01:26:00.863970 containerd[1497]: time="2025-03-25T01:26:00Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 25 01:26:00.868473 containerd[1497]: time="2025-03-25T01:26:00.868411440Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896401080Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.72µs"
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896456120Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896479360Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896678640Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896700760Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896731960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896799240Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.896832280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.897232560Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.897249600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.897260560Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 01:26:00.897818 containerd[1497]: time="2025-03-25T01:26:00.897269440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 25 01:26:00.898151 containerd[1497]: time="2025-03-25T01:26:00.897339840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 25 01:26:00.898151 containerd[1497]: time="2025-03-25T01:26:00.897575840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 01:26:00.898151 containerd[1497]: time="2025-03-25T01:26:00.897608080Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 01:26:00.898151 containerd[1497]: time="2025-03-25T01:26:00.897619360Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 25 01:26:00.901683 containerd[1497]: time="2025-03-25T01:26:00.901642120Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 25 01:26:00.902360 containerd[1497]: time="2025-03-25T01:26:00.902078880Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 25 01:26:00.902360 containerd[1497]: time="2025-03-25T01:26:00.902211160Z" level=info msg="metadata content store policy set" policy=shared
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909385920Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909453320Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909487480Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909502880Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909557280Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909575560Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909588440Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909601200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909613160Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909625080Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909636000Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 25 01:26:00.909840 containerd[1497]: time="2025-03-25T01:26:00.909648240Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.911871640Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.911934640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.911956680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.911968680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.911981440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.911992040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912004000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912014280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912028200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912042240Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912054880Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912325240Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912342400Z" level=info msg="Start snapshots syncer"
Mar 25 01:26:00.912842 containerd[1497]: time="2025-03-25T01:26:00.912378760Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 25 01:26:00.913158 containerd[1497]: time="2025-03-25T01:26:00.912662680Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 25 01:26:00.913158 containerd[1497]: time="2025-03-25T01:26:00.912719480Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 25 01:26:00.913647 containerd[1497]: time="2025-03-25T01:26:00.913615000Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 25 01:26:00.913893 containerd[1497]: time="2025-03-25T01:26:00.913869320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.914879160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.914905680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.914917640Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.914933080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.914944160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.914965520Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915001080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915014000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915024760Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915067280Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915087280Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915096800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915107160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 25 01:26:00.916845 containerd[1497]: time="2025-03-25T01:26:00.915114920Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.915124560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.915135600Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.915219440Z" level=info msg="runtime interface created"
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.915224720Z" level=info msg="created NRI interface"
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.915234560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.915253240Z" level=info msg="Connect containerd service"
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.915292480Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 25 01:26:00.917218 containerd[1497]: time="2025-03-25T01:26:00.916060280Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 25 01:26:01.062294 tar[1482]: linux-arm64/LICENSE
Mar 25 01:26:01.062401 tar[1482]: linux-arm64/README.md
Mar 25 01:26:01.080076 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 25 01:26:01.102599 containerd[1497]: time="2025-03-25T01:26:01.102457400Z" level=info msg="Start subscribing containerd event"
Mar 25 01:26:01.102599 containerd[1497]: time="2025-03-25T01:26:01.102556480Z" level=info msg="Start recovering state"
Mar 25 01:26:01.102757 containerd[1497]: time="2025-03-25T01:26:01.102667040Z" level=info msg="Start event monitor"
Mar 25 01:26:01.102757 containerd[1497]: time="2025-03-25T01:26:01.102684960Z" level=info msg="Start cni network conf syncer for default"
Mar 25 01:26:01.102757 containerd[1497]: time="2025-03-25T01:26:01.102694880Z" level=info msg="Start streaming server"
Mar 25 01:26:01.102757 containerd[1497]: time="2025-03-25T01:26:01.102703280Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 25 01:26:01.102757 containerd[1497]: time="2025-03-25T01:26:01.102710400Z" level=info msg="runtime interface starting up..."
Mar 25 01:26:01.102757 containerd[1497]: time="2025-03-25T01:26:01.102716480Z" level=info msg="starting plugins..."
Mar 25 01:26:01.102757 containerd[1497]: time="2025-03-25T01:26:01.102730400Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 25 01:26:01.103437 containerd[1497]: time="2025-03-25T01:26:01.103400240Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 25 01:26:01.103563 containerd[1497]: time="2025-03-25T01:26:01.103528400Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 25 01:26:01.105111 systemd[1]: Started containerd.service - containerd container runtime.
Mar 25 01:26:01.105938 containerd[1497]: time="2025-03-25T01:26:01.105896200Z" level=info msg="containerd successfully booted in 0.242359s"
Mar 25 01:26:01.199381 sshd_keygen[1505]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 25 01:26:01.227658 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 25 01:26:01.233638 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 25 01:26:01.260370 systemd[1]: issuegen.service: Deactivated successfully.
Mar 25 01:26:01.260696 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 25 01:26:01.265336 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 25 01:26:01.298963 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 25 01:26:01.304581 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 25 01:26:01.307202 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Mar 25 01:26:01.309224 systemd[1]: Reached target getty.target - Login Prompts.
Mar 25 01:26:01.570083 systemd-networkd[1395]: eth0: Gained IPv6LL
Mar 25 01:26:01.572251 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
Mar 25 01:26:01.577736 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 25 01:26:01.580203 systemd[1]: Reached target network-online.target - Network is Online.
Mar 25 01:26:01.588257 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:26:01.593148 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 25 01:26:01.652249 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 25 01:26:01.698076 systemd-networkd[1395]: eth1: Gained IPv6LL
Mar 25 01:26:01.698572 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
Mar 25 01:26:02.412801 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:26:02.415136 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 25 01:26:02.417912 systemd[1]: Startup finished in 798ms (kernel) + 5.408s (initrd) + 5.359s (userspace) = 11.567s.
Mar 25 01:26:02.427134 (kubelet)[1606]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:26:03.102978 kubelet[1606]: E0325 01:26:03.102830 1606 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:26:03.105170 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:26:03.105518 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:26:03.106381 systemd[1]: kubelet.service: Consumed 932ms CPU time, 234.1M memory peak.
Mar 25 01:26:13.357300 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 25 01:26:13.360363 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:26:13.519901 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:26:13.531914 (kubelet)[1626]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:26:13.581677 kubelet[1626]: E0325 01:26:13.581563 1626 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:26:13.584561 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:26:13.584892 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:26:13.585715 systemd[1]: kubelet.service: Consumed 175ms CPU time, 94.3M memory peak.
Mar 25 01:26:23.594702 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 25 01:26:23.597320 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:26:23.744590 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:26:23.764602 (kubelet)[1641]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:26:23.822002 kubelet[1641]: E0325 01:26:23.821912 1641 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:26:23.826262 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:26:23.826476 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:26:23.827062 systemd[1]: kubelet.service: Consumed 169ms CPU time, 95M memory peak.
Mar 25 01:26:31.861305 systemd-timesyncd[1376]: Contacted time server 85.215.166.214:123 (2.flatcar.pool.ntp.org).
Mar 25 01:26:31.861377 systemd-timesyncd[1376]: Initial clock synchronization to Tue 2025-03-25 01:26:32.162341 UTC.
Mar 25 01:26:33.845914 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 25 01:26:33.849759 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:26:33.999347 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:26:34.011495 (kubelet)[1656]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:26:34.060835 kubelet[1656]: E0325 01:26:34.060709 1656 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:26:34.064812 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:26:34.065204 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:26:34.065783 systemd[1]: kubelet.service: Consumed 161ms CPU time, 94.5M memory peak.
Mar 25 01:26:44.094794 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 25 01:26:44.100671 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:26:44.255475 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:26:44.264446 (kubelet)[1672]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:26:44.310433 kubelet[1672]: E0325 01:26:44.310371 1672 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:26:44.313429 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:26:44.313581 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:26:44.314204 systemd[1]: kubelet.service: Consumed 159ms CPU time, 94M memory peak.
Mar 25 01:26:45.896382 update_engine[1476]: I20250325 01:26:45.896229 1476 update_attempter.cc:509] Updating boot flags...
Mar 25 01:26:45.951837 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1688)
Mar 25 01:26:46.026847 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1684)
Mar 25 01:26:46.104916 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1684)
Mar 25 01:26:54.345052 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 25 01:26:54.348014 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:26:54.501161 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:26:54.521376 (kubelet)[1708]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:26:54.566325 kubelet[1708]: E0325 01:26:54.566256 1708 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:26:54.570123 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:26:54.570440 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:26:54.571069 systemd[1]: kubelet.service: Consumed 167ms CPU time, 96.2M memory peak.
Mar 25 01:27:04.594551 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Mar 25 01:27:04.597630 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:27:04.765032 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:27:04.776390 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:27:04.824851 kubelet[1723]: E0325 01:27:04.824552 1723 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:27:04.827996 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:27:04.828217 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:27:04.829262 systemd[1]: kubelet.service: Consumed 173ms CPU time, 96.5M memory peak.
Mar 25 01:27:14.844468 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Mar 25 01:27:14.847142 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:27:14.990373 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:27:15.001137 (kubelet)[1738]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:27:15.048453 kubelet[1738]: E0325 01:27:15.048377 1738 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:27:15.050779 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:27:15.051002 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:27:15.051644 systemd[1]: kubelet.service: Consumed 160ms CPU time, 92.6M memory peak. Mar 25 01:27:25.094115 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Mar 25 01:27:25.097550 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:27:25.260567 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:27:25.271421 (kubelet)[1753]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:27:25.319524 kubelet[1753]: E0325 01:27:25.319424 1753 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:27:25.324251 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:27:25.324400 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:27:25.324728 systemd[1]: kubelet.service: Consumed 168ms CPU time, 92.5M memory peak. Mar 25 01:27:35.344838 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. Mar 25 01:27:35.347973 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:27:35.485419 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:27:35.501770 (kubelet)[1768]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:27:35.546383 kubelet[1768]: E0325 01:27:35.546309 1768 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:27:35.549237 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:27:35.549420 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:27:35.549968 systemd[1]: kubelet.service: Consumed 156ms CPU time, 91.3M memory peak. 
Mar 25 01:27:41.487405 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:27:41.489615 systemd[1]: Started sshd@0-78.46.211.139:22-92.255.57.132:48698.service - OpenSSH per-connection server daemon (92.255.57.132:48698). Mar 25 01:27:41.717501 sshd[1776]: Invalid user a from 92.255.57.132 port 48698 Mar 25 01:27:41.762078 sshd[1776]: Connection closed by invalid user a 92.255.57.132 port 48698 [preauth] Mar 25 01:27:41.765671 systemd[1]: sshd@0-78.46.211.139:22-92.255.57.132:48698.service: Deactivated successfully. Mar 25 01:27:45.569755 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. Mar 25 01:27:45.573577 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:27:45.576075 systemd[1]: Started sshd@1-78.46.211.139:22-139.178.89.65:54000.service - OpenSSH per-connection server daemon (139.178.89.65:54000). Mar 25 01:27:45.729181 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:27:45.740467 (kubelet)[1791]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:27:45.786790 kubelet[1791]: E0325 01:27:45.786730 1791 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:27:45.789656 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:27:45.790008 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:27:45.790667 systemd[1]: kubelet.service: Consumed 166ms CPU time, 96.3M memory peak. 
Mar 25 01:27:46.591938 sshd[1782]: Accepted publickey for core from 139.178.89.65 port 54000 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:27:46.596588 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:27:46.610006 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Mar 25 01:27:46.611787 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Mar 25 01:27:46.615456 systemd-logind[1475]: New session 1 of user core. Mar 25 01:27:46.649701 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Mar 25 01:27:46.652610 systemd[1]: Starting user@500.service - User Manager for UID 500... Mar 25 01:27:46.668613 (systemd)[1800]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 25 01:27:46.671557 systemd-logind[1475]: New session c1 of user core. Mar 25 01:27:46.805039 systemd[1800]: Queued start job for default target default.target. Mar 25 01:27:46.814622 systemd[1800]: Created slice app.slice - User Application Slice. Mar 25 01:27:46.814677 systemd[1800]: Reached target paths.target - Paths. Mar 25 01:27:46.814744 systemd[1800]: Reached target timers.target - Timers. Mar 25 01:27:46.816878 systemd[1800]: Starting dbus.socket - D-Bus User Message Bus Socket... Mar 25 01:27:46.830987 systemd[1800]: Listening on dbus.socket - D-Bus User Message Bus Socket. Mar 25 01:27:46.831664 systemd[1800]: Reached target sockets.target - Sockets. Mar 25 01:27:46.832137 systemd[1800]: Reached target basic.target - Basic System. Mar 25 01:27:46.832422 systemd[1800]: Reached target default.target - Main User Target. Mar 25 01:27:46.832445 systemd[1]: Started user@500.service - User Manager for UID 500. Mar 25 01:27:46.832950 systemd[1800]: Startup finished in 153ms. Mar 25 01:27:46.844077 systemd[1]: Started session-1.scope - Session 1 of User core. 
Mar 25 01:27:47.544909 systemd[1]: Started sshd@2-78.46.211.139:22-139.178.89.65:54006.service - OpenSSH per-connection server daemon (139.178.89.65:54006). Mar 25 01:27:48.548866 sshd[1811]: Accepted publickey for core from 139.178.89.65 port 54006 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:27:48.551609 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:27:48.557149 systemd-logind[1475]: New session 2 of user core. Mar 25 01:27:48.570133 systemd[1]: Started session-2.scope - Session 2 of User core. Mar 25 01:27:49.238132 sshd[1813]: Connection closed by 139.178.89.65 port 54006 Mar 25 01:27:49.236171 sshd-session[1811]: pam_unix(sshd:session): session closed for user core Mar 25 01:27:49.246283 systemd[1]: sshd@2-78.46.211.139:22-139.178.89.65:54006.service: Deactivated successfully. Mar 25 01:27:49.249286 systemd[1]: session-2.scope: Deactivated successfully. Mar 25 01:27:49.253635 systemd-logind[1475]: Session 2 logged out. Waiting for processes to exit. Mar 25 01:27:49.255697 systemd-logind[1475]: Removed session 2. Mar 25 01:27:49.413112 systemd[1]: Started sshd@3-78.46.211.139:22-139.178.89.65:41132.service - OpenSSH per-connection server daemon (139.178.89.65:41132). Mar 25 01:27:50.425297 sshd[1819]: Accepted publickey for core from 139.178.89.65 port 41132 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:27:50.427253 sshd-session[1819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:27:50.433376 systemd-logind[1475]: New session 3 of user core. Mar 25 01:27:50.440106 systemd[1]: Started session-3.scope - Session 3 of User core. Mar 25 01:27:51.109866 sshd[1821]: Connection closed by 139.178.89.65 port 41132 Mar 25 01:27:51.110980 sshd-session[1819]: pam_unix(sshd:session): session closed for user core Mar 25 01:27:51.116452 systemd[1]: sshd@3-78.46.211.139:22-139.178.89.65:41132.service: Deactivated successfully. 
Mar 25 01:27:51.118725 systemd[1]: session-3.scope: Deactivated successfully. Mar 25 01:27:51.120206 systemd-logind[1475]: Session 3 logged out. Waiting for processes to exit. Mar 25 01:27:51.122744 systemd-logind[1475]: Removed session 3. Mar 25 01:27:51.284211 systemd[1]: Started sshd@4-78.46.211.139:22-139.178.89.65:41134.service - OpenSSH per-connection server daemon (139.178.89.65:41134). Mar 25 01:27:52.284776 sshd[1827]: Accepted publickey for core from 139.178.89.65 port 41134 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:27:52.287311 sshd-session[1827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:27:52.295237 systemd-logind[1475]: New session 4 of user core. Mar 25 01:27:52.302491 systemd[1]: Started session-4.scope - Session 4 of User core. Mar 25 01:27:52.967137 sshd[1829]: Connection closed by 139.178.89.65 port 41134 Mar 25 01:27:52.968011 sshd-session[1827]: pam_unix(sshd:session): session closed for user core Mar 25 01:27:52.971857 systemd[1]: sshd@4-78.46.211.139:22-139.178.89.65:41134.service: Deactivated successfully. Mar 25 01:27:52.973892 systemd[1]: session-4.scope: Deactivated successfully. Mar 25 01:27:52.976557 systemd-logind[1475]: Session 4 logged out. Waiting for processes to exit. Mar 25 01:27:52.978340 systemd-logind[1475]: Removed session 4. Mar 25 01:27:53.139647 systemd[1]: Started sshd@5-78.46.211.139:22-139.178.89.65:41138.service - OpenSSH per-connection server daemon (139.178.89.65:41138). Mar 25 01:27:54.135837 sshd[1835]: Accepted publickey for core from 139.178.89.65 port 41138 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:27:54.137893 sshd-session[1835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:27:54.144099 systemd-logind[1475]: New session 5 of user core. Mar 25 01:27:54.159538 systemd[1]: Started session-5.scope - Session 5 of User core. 
Mar 25 01:27:54.670314 sudo[1838]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 25 01:27:54.670680 sudo[1838]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:27:54.694038 sudo[1838]: pam_unix(sudo:session): session closed for user root Mar 25 01:27:54.853100 sshd[1837]: Connection closed by 139.178.89.65 port 41138 Mar 25 01:27:54.854377 sshd-session[1835]: pam_unix(sshd:session): session closed for user core Mar 25 01:27:54.859559 systemd[1]: sshd@5-78.46.211.139:22-139.178.89.65:41138.service: Deactivated successfully. Mar 25 01:27:54.861777 systemd[1]: session-5.scope: Deactivated successfully. Mar 25 01:27:54.864516 systemd-logind[1475]: Session 5 logged out. Waiting for processes to exit. Mar 25 01:27:54.866535 systemd-logind[1475]: Removed session 5. Mar 25 01:27:55.028238 systemd[1]: Started sshd@6-78.46.211.139:22-139.178.89.65:41154.service - OpenSSH per-connection server daemon (139.178.89.65:41154). Mar 25 01:27:55.844057 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. Mar 25 01:27:55.848175 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:27:55.999399 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:27:56.013442 (kubelet)[1854]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:27:56.035243 sshd[1844]: Accepted publickey for core from 139.178.89.65 port 41154 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:27:56.038646 sshd-session[1844]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:27:56.048531 systemd-logind[1475]: New session 6 of user core. Mar 25 01:27:56.050256 systemd[1]: Started session-6.scope - Session 6 of User core. 
Mar 25 01:27:56.074755 kubelet[1854]: E0325 01:27:56.074669 1854 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:27:56.077972 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:27:56.078163 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:27:56.078593 systemd[1]: kubelet.service: Consumed 170ms CPU time, 96.2M memory peak. Mar 25 01:27:56.565004 sudo[1863]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 25 01:27:56.565306 sudo[1863]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:27:56.573611 sudo[1863]: pam_unix(sudo:session): session closed for user root Mar 25 01:27:56.582150 sudo[1862]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Mar 25 01:27:56.583255 sudo[1862]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:27:56.596344 systemd[1]: Starting audit-rules.service - Load Audit Rules... Mar 25 01:27:56.641619 augenrules[1885]: No rules Mar 25 01:27:56.642660 systemd[1]: audit-rules.service: Deactivated successfully. Mar 25 01:27:56.643136 systemd[1]: Finished audit-rules.service - Load Audit Rules. Mar 25 01:27:56.646924 sudo[1862]: pam_unix(sudo:session): session closed for user root Mar 25 01:27:56.806925 sshd[1860]: Connection closed by 139.178.89.65 port 41154 Mar 25 01:27:56.807482 sshd-session[1844]: pam_unix(sshd:session): session closed for user core Mar 25 01:27:56.812110 systemd[1]: sshd@6-78.46.211.139:22-139.178.89.65:41154.service: Deactivated successfully. 
Mar 25 01:27:56.815207 systemd[1]: session-6.scope: Deactivated successfully. Mar 25 01:27:56.816286 systemd-logind[1475]: Session 6 logged out. Waiting for processes to exit. Mar 25 01:27:56.818163 systemd-logind[1475]: Removed session 6. Mar 25 01:27:56.977627 systemd[1]: Started sshd@7-78.46.211.139:22-139.178.89.65:41158.service - OpenSSH per-connection server daemon (139.178.89.65:41158). Mar 25 01:27:57.979306 sshd[1894]: Accepted publickey for core from 139.178.89.65 port 41158 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:27:57.981740 sshd-session[1894]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:27:57.988883 systemd-logind[1475]: New session 7 of user core. Mar 25 01:27:58.005213 systemd[1]: Started session-7.scope - Session 7 of User core. Mar 25 01:27:58.503428 sudo[1897]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 25 01:27:58.503914 sudo[1897]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Mar 25 01:27:58.844405 systemd[1]: Starting docker.service - Docker Application Container Engine... Mar 25 01:27:58.858401 (dockerd)[1915]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Mar 25 01:27:59.119031 dockerd[1915]: time="2025-03-25T01:27:59.118612437Z" level=info msg="Starting up" Mar 25 01:27:59.126732 dockerd[1915]: time="2025-03-25T01:27:59.126662844Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Mar 25 01:27:59.197629 dockerd[1915]: time="2025-03-25T01:27:59.197378931Z" level=info msg="Loading containers: start." Mar 25 01:27:59.393977 kernel: Initializing XFRM netlink socket Mar 25 01:27:59.491293 systemd-networkd[1395]: docker0: Link UP Mar 25 01:27:59.578792 dockerd[1915]: time="2025-03-25T01:27:59.578748124Z" level=info msg="Loading containers: done." 
Mar 25 01:27:59.602674 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck455429915-merged.mount: Deactivated successfully. Mar 25 01:27:59.604343 dockerd[1915]: time="2025-03-25T01:27:59.603576630Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 25 01:27:59.604343 dockerd[1915]: time="2025-03-25T01:27:59.603673682Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 Mar 25 01:27:59.604343 dockerd[1915]: time="2025-03-25T01:27:59.603898310Z" level=info msg="Daemon has completed initialization" Mar 25 01:27:59.654177 dockerd[1915]: time="2025-03-25T01:27:59.654021421Z" level=info msg="API listen on /run/docker.sock" Mar 25 01:27:59.654339 systemd[1]: Started docker.service - Docker Application Container Engine. Mar 25 01:28:00.746065 containerd[1497]: time="2025-03-25T01:28:00.745997209Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\"" Mar 25 01:28:01.394240 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3040960730.mount: Deactivated successfully. 
Mar 25 01:28:02.268139 containerd[1497]: time="2025-03-25T01:28:02.268068434Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:02.270188 containerd[1497]: time="2025-03-25T01:28:02.270109801Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=25552858" Mar 25 01:28:02.271994 containerd[1497]: time="2025-03-25T01:28:02.271922780Z" level=info msg="ImageCreate event name:\"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:02.276433 containerd[1497]: time="2025-03-25T01:28:02.276329352Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:02.278209 containerd[1497]: time="2025-03-25T01:28:02.277884220Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"25549566\" in 1.531833564s" Mar 25 01:28:02.278209 containerd[1497]: time="2025-03-25T01:28:02.277966550Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:26ae5fde2308729bfda71fa20aa73cb5a1a4490f107f62dc7e1c4c49823cc084\"" Mar 25 01:28:02.278907 containerd[1497]: time="2025-03-25T01:28:02.278860498Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\"" Mar 25 01:28:03.412750 containerd[1497]: time="2025-03-25T01:28:03.412278907Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:03.414394 containerd[1497]: time="2025-03-25T01:28:03.414276985Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=22458998" Mar 25 01:28:03.415649 containerd[1497]: time="2025-03-25T01:28:03.415518534Z" level=info msg="ImageCreate event name:\"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:03.420911 containerd[1497]: time="2025-03-25T01:28:03.420770641Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:03.422833 containerd[1497]: time="2025-03-25T01:28:03.421668509Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"23899774\" in 1.142761165s" Mar 25 01:28:03.422833 containerd[1497]: time="2025-03-25T01:28:03.421727436Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:3f2886c2c7c101461e78c37591f8beb12ac073f8dcf5e32c95da9e9689d0c1d3\"" Mar 25 01:28:03.423711 containerd[1497]: time="2025-03-25T01:28:03.423420318Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\"" Mar 25 01:28:05.413867 containerd[1497]: time="2025-03-25T01:28:05.413782486Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:05.415662 containerd[1497]: time="2025-03-25T01:28:05.415519209Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=17125849" Mar 25 01:28:05.417769 containerd[1497]: time="2025-03-25T01:28:05.416276858Z" level=info msg="ImageCreate event name:\"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:05.419680 containerd[1497]: time="2025-03-25T01:28:05.419372701Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:05.421954 containerd[1497]: time="2025-03-25T01:28:05.421902677Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"18566643\" in 1.998438834s" Mar 25 01:28:05.422100 containerd[1497]: time="2025-03-25T01:28:05.422084459Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:3dd474fdc8c0d007008dd47bafecdd344fbdace928731ae8b09f58f633f4a30f\"" Mar 25 01:28:05.424065 containerd[1497]: time="2025-03-25T01:28:05.424034327Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\"" Mar 25 01:28:06.094003 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. Mar 25 01:28:06.097665 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:28:06.267720 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:28:06.280462 (kubelet)[2188]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:28:06.351616 kubelet[2188]: E0325 01:28:06.351163 2188 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:28:06.356203 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:28:06.356354 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:28:06.356720 systemd[1]: kubelet.service: Consumed 170ms CPU time, 94.4M memory peak. Mar 25 01:28:06.500965 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2475173297.mount: Deactivated successfully. Mar 25 01:28:06.821202 containerd[1497]: time="2025-03-25T01:28:06.820185659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:06.821660 containerd[1497]: time="2025-03-25T01:28:06.821593902Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=26871941" Mar 25 01:28:06.822022 containerd[1497]: time="2025-03-25T01:28:06.821986668Z" level=info msg="ImageCreate event name:\"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:06.824110 containerd[1497]: time="2025-03-25T01:28:06.824057388Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:06.824875 containerd[1497]: time="2025-03-25T01:28:06.824839959Z" level=info 
msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"26870934\" in 1.400641813s" Mar 25 01:28:06.825060 containerd[1497]: time="2025-03-25T01:28:06.825038942Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:939054a0dc9c7c1596b061fc2380758139ce62751b44a0b21b3afc7abd7eb3ff\"" Mar 25 01:28:06.825548 containerd[1497]: time="2025-03-25T01:28:06.825528879Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 25 01:28:07.368346 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount919085950.mount: Deactivated successfully. Mar 25 01:28:08.049696 containerd[1497]: time="2025-03-25T01:28:08.048608996Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:08.049696 containerd[1497]: time="2025-03-25T01:28:08.049625193Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485461" Mar 25 01:28:08.051670 containerd[1497]: time="2025-03-25T01:28:08.051597538Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:08.055553 containerd[1497]: time="2025-03-25T01:28:08.055500384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:08.057839 containerd[1497]: time="2025-03-25T01:28:08.057150972Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id 
\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.231518281s" Mar 25 01:28:08.057839 containerd[1497]: time="2025-03-25T01:28:08.057207099Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" Mar 25 01:28:08.058558 containerd[1497]: time="2025-03-25T01:28:08.058443880Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Mar 25 01:28:08.603776 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount84072080.mount: Deactivated successfully. Mar 25 01:28:08.612671 containerd[1497]: time="2025-03-25T01:28:08.611895403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:28:08.613985 containerd[1497]: time="2025-03-25T01:28:08.613888065Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Mar 25 01:28:08.614974 containerd[1497]: time="2025-03-25T01:28:08.614868795Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:28:08.617320 containerd[1497]: time="2025-03-25T01:28:08.617227011Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:28:08.618702 containerd[1497]: time="2025-03-25T01:28:08.618191220Z" level=info msg="Pulled 
image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 559.703935ms" Mar 25 01:28:08.618702 containerd[1497]: time="2025-03-25T01:28:08.618259866Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Mar 25 01:28:08.618881 containerd[1497]: time="2025-03-25T01:28:08.618844480Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Mar 25 01:28:09.250760 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount74467200.mount: Deactivated successfully. Mar 25 01:28:13.024493 containerd[1497]: time="2025-03-25T01:28:13.024421469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:13.026886 containerd[1497]: time="2025-03-25T01:28:13.026803235Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406487" Mar 25 01:28:13.028780 containerd[1497]: time="2025-03-25T01:28:13.028707801Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:13.033846 containerd[1497]: time="2025-03-25T01:28:13.032937778Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:13.034966 containerd[1497]: time="2025-03-25T01:28:13.034120962Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag 
\"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 4.41524s" Mar 25 01:28:13.034966 containerd[1497]: time="2025-03-25T01:28:13.034163078Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Mar 25 01:28:16.593946 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. Mar 25 01:28:16.597046 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:28:16.747041 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:28:16.754217 (kubelet)[2329]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:28:16.801822 kubelet[2329]: E0325 01:28:16.798845 2329 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:28:16.801521 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:28:16.801652 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:28:16.802946 systemd[1]: kubelet.service: Consumed 155ms CPU time, 96.5M memory peak. Mar 25 01:28:18.316063 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:28:18.316247 systemd[1]: kubelet.service: Consumed 155ms CPU time, 96.5M memory peak. Mar 25 01:28:18.319381 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:28:18.355159 systemd[1]: Reload requested from client PID 2344 ('systemctl') (unit session-7.scope)... 
Mar 25 01:28:18.355200 systemd[1]: Reloading... Mar 25 01:28:18.512851 zram_generator::config[2395]: No configuration found. Mar 25 01:28:18.617730 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:28:18.713941 systemd[1]: Reloading finished in 358 ms. Mar 25 01:28:18.777457 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Mar 25 01:28:18.777533 systemd[1]: kubelet.service: Failed with result 'signal'. Mar 25 01:28:18.778128 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:28:18.779532 systemd[1]: kubelet.service: Consumed 110ms CPU time, 82.3M memory peak. Mar 25 01:28:18.783256 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:28:18.945395 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:28:18.955420 (kubelet)[2438]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:28:19.002064 kubelet[2438]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:28:19.002064 kubelet[2438]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 25 01:28:19.002064 kubelet[2438]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 25 01:28:19.002557 kubelet[2438]: I0325 01:28:19.002355 2438 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:28:20.337055 kubelet[2438]: I0325 01:28:20.336985 2438 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Mar 25 01:28:20.337055 kubelet[2438]: I0325 01:28:20.337056 2438 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:28:20.337620 kubelet[2438]: I0325 01:28:20.337567 2438 server.go:929] "Client rotation is on, will bootstrap in background" Mar 25 01:28:20.371570 kubelet[2438]: E0325 01:28:20.371523 2438 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://78.46.211.139:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 78.46.211.139:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:28:20.372673 kubelet[2438]: I0325 01:28:20.372361 2438 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:28:20.384933 kubelet[2438]: I0325 01:28:20.384902 2438 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:28:20.389056 kubelet[2438]: I0325 01:28:20.388918 2438 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:28:20.390079 kubelet[2438]: I0325 01:28:20.389999 2438 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 25 01:28:20.390218 kubelet[2438]: I0325 01:28:20.390177 2438 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:28:20.390444 kubelet[2438]: I0325 01:28:20.390213 2438 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-6-22e9b0bb97","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:28:20.390634 kubelet[2438]: I0325 01:28:20.390536 2438 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:28:20.390634 kubelet[2438]: I0325 01:28:20.390547 2438 container_manager_linux.go:300] "Creating device plugin manager" Mar 25 01:28:20.390990 kubelet[2438]: I0325 01:28:20.390741 2438 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:28:20.393407 kubelet[2438]: I0325 01:28:20.393329 2438 kubelet.go:408] "Attempting to sync node with API server" Mar 25 01:28:20.393407 kubelet[2438]: I0325 01:28:20.393378 2438 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:28:20.393407 kubelet[2438]: I0325 01:28:20.393409 2438 kubelet.go:314] "Adding apiserver pod source" Mar 25 01:28:20.393407 kubelet[2438]: I0325 01:28:20.393419 2438 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:28:20.397846 kubelet[2438]: W0325 01:28:20.397763 2438 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://78.46.211.139:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-6-22e9b0bb97&limit=500&resourceVersion=0": dial tcp 78.46.211.139:6443: connect: connection refused Mar 25 01:28:20.398646 kubelet[2438]: E0325 01:28:20.398329 2438 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://78.46.211.139:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-6-22e9b0bb97&limit=500&resourceVersion=0\": dial tcp 78.46.211.139:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:28:20.398646 kubelet[2438]: I0325 01:28:20.398446 2438 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:28:20.401133 kubelet[2438]: I0325 01:28:20.401105 2438 kubelet.go:837] "Not starting ClusterTrustBundle 
informer because we are in static kubelet mode" Mar 25 01:28:20.402708 kubelet[2438]: W0325 01:28:20.402677 2438 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Mar 25 01:28:20.404170 kubelet[2438]: W0325 01:28:20.403695 2438 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://78.46.211.139:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 78.46.211.139:6443: connect: connection refused Mar 25 01:28:20.404170 kubelet[2438]: E0325 01:28:20.403751 2438 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://78.46.211.139:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 78.46.211.139:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:28:20.404170 kubelet[2438]: I0325 01:28:20.404065 2438 server.go:1269] "Started kubelet" Mar 25 01:28:20.406845 kubelet[2438]: I0325 01:28:20.405373 2438 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:28:20.406845 kubelet[2438]: I0325 01:28:20.406552 2438 server.go:460] "Adding debug handlers to kubelet server" Mar 25 01:28:20.407076 kubelet[2438]: I0325 01:28:20.407007 2438 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:28:20.407427 kubelet[2438]: I0325 01:28:20.407396 2438 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:28:20.409005 kubelet[2438]: E0325 01:28:20.407675 2438 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://78.46.211.139:6443/api/v1/namespaces/default/events\": dial tcp 78.46.211.139:6443: connect: connection refused" 
event="&Event{ObjectMeta:{ci-4284-0-0-6-22e9b0bb97.182fe77bf5a6ebb1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-6-22e9b0bb97,UID:ci-4284-0-0-6-22e9b0bb97,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-6-22e9b0bb97,},FirstTimestamp:2025-03-25 01:28:20.404022193 +0000 UTC m=+1.444614093,LastTimestamp:2025-03-25 01:28:20.404022193 +0000 UTC m=+1.444614093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-6-22e9b0bb97,}" Mar 25 01:28:20.411462 kubelet[2438]: E0325 01:28:20.411424 2438 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:28:20.412420 kubelet[2438]: I0325 01:28:20.412397 2438 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:28:20.417184 kubelet[2438]: I0325 01:28:20.417158 2438 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 25 01:28:20.417388 kubelet[2438]: I0325 01:28:20.413623 2438 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:28:20.417624 kubelet[2438]: I0325 01:28:20.417604 2438 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 25 01:28:20.417746 kubelet[2438]: I0325 01:28:20.417735 2438 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:28:20.418734 kubelet[2438]: W0325 01:28:20.418681 2438 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://78.46.211.139:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 78.46.211.139:6443: connect: connection refused Mar 25 01:28:20.418948 kubelet[2438]: E0325 
01:28:20.418928 2438 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://78.46.211.139:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 78.46.211.139:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:28:20.419516 kubelet[2438]: E0325 01:28:20.419474 2438 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-6-22e9b0bb97\" not found" Mar 25 01:28:20.420086 kubelet[2438]: E0325 01:28:20.420006 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.211.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-6-22e9b0bb97?timeout=10s\": dial tcp 78.46.211.139:6443: connect: connection refused" interval="200ms" Mar 25 01:28:20.424256 kubelet[2438]: I0325 01:28:20.424221 2438 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:28:20.424256 kubelet[2438]: I0325 01:28:20.424247 2438 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:28:20.424394 kubelet[2438]: I0325 01:28:20.424335 2438 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:28:20.446136 kubelet[2438]: I0325 01:28:20.445745 2438 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 25 01:28:20.446572 kubelet[2438]: I0325 01:28:20.446479 2438 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:28:20.446572 kubelet[2438]: I0325 01:28:20.446506 2438 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:28:20.446572 kubelet[2438]: I0325 01:28:20.446529 2438 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:28:20.449006 kubelet[2438]: I0325 01:28:20.447986 2438 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 25 01:28:20.449006 kubelet[2438]: I0325 01:28:20.448020 2438 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:28:20.449006 kubelet[2438]: I0325 01:28:20.448055 2438 kubelet.go:2321] "Starting kubelet main sync loop" Mar 25 01:28:20.449006 kubelet[2438]: E0325 01:28:20.448126 2438 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:28:20.450921 kubelet[2438]: I0325 01:28:20.450888 2438 policy_none.go:49] "None policy: Start" Mar 25 01:28:20.453471 kubelet[2438]: W0325 01:28:20.453408 2438 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://78.46.211.139:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 78.46.211.139:6443: connect: connection refused Mar 25 01:28:20.453642 kubelet[2438]: E0325 01:28:20.453618 2438 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://78.46.211.139:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 78.46.211.139:6443: connect: connection refused" logger="UnhandledError" Mar 25 01:28:20.454388 kubelet[2438]: I0325 01:28:20.454363 2438 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:28:20.454632 kubelet[2438]: I0325 
01:28:20.454617 2438 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:28:20.462355 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Mar 25 01:28:20.478242 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Mar 25 01:28:20.484146 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Mar 25 01:28:20.496315 kubelet[2438]: I0325 01:28:20.496265 2438 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:28:20.497140 kubelet[2438]: I0325 01:28:20.497106 2438 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:28:20.498707 kubelet[2438]: I0325 01:28:20.497137 2438 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:28:20.498707 kubelet[2438]: I0325 01:28:20.497481 2438 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:28:20.500508 kubelet[2438]: E0325 01:28:20.500482 2438 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-6-22e9b0bb97\" not found" Mar 25 01:28:20.563294 systemd[1]: Created slice kubepods-burstable-pode3fafa487bd60eaa4b00ee05814ddac8.slice - libcontainer container kubepods-burstable-pode3fafa487bd60eaa4b00ee05814ddac8.slice. Mar 25 01:28:20.575688 systemd[1]: Created slice kubepods-burstable-pod55700de9767ba4988f2925b242cf77e0.slice - libcontainer container kubepods-burstable-pod55700de9767ba4988f2925b242cf77e0.slice. Mar 25 01:28:20.582957 systemd[1]: Created slice kubepods-burstable-podd47669bb28818cbe71f3f62ea49c10d2.slice - libcontainer container kubepods-burstable-podd47669bb28818cbe71f3f62ea49c10d2.slice. 
Mar 25 01:28:20.600107 kubelet[2438]: I0325 01:28:20.599993 2438 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.600878 kubelet[2438]: E0325 01:28:20.600829 2438 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://78.46.211.139:6443/api/v1/nodes\": dial tcp 78.46.211.139:6443: connect: connection refused" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.620717 kubelet[2438]: E0325 01:28:20.620589 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.211.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-6-22e9b0bb97?timeout=10s\": dial tcp 78.46.211.139:6443: connect: connection refused" interval="400ms" Mar 25 01:28:20.719355 kubelet[2438]: I0325 01:28:20.719236 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719355 kubelet[2438]: I0325 01:28:20.719304 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719355 kubelet[2438]: I0325 01:28:20.719326 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/55700de9767ba4988f2925b242cf77e0-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-6-22e9b0bb97\" (UID: 
\"55700de9767ba4988f2925b242cf77e0\") " pod="kube-system/kube-scheduler-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719355 kubelet[2438]: I0325 01:28:20.719358 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d47669bb28818cbe71f3f62ea49c10d2-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" (UID: \"d47669bb28818cbe71f3f62ea49c10d2\") " pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719355 kubelet[2438]: I0325 01:28:20.719378 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d47669bb28818cbe71f3f62ea49c10d2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" (UID: \"d47669bb28818cbe71f3f62ea49c10d2\") " pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719792 kubelet[2438]: I0325 01:28:20.719395 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719792 kubelet[2438]: I0325 01:28:20.719412 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719792 kubelet[2438]: I0325 01:28:20.719444 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" 
(UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.719792 kubelet[2438]: I0325 01:28:20.719466 2438 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d47669bb28818cbe71f3f62ea49c10d2-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" (UID: \"d47669bb28818cbe71f3f62ea49c10d2\") " pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.805364 kubelet[2438]: I0325 01:28:20.804846 2438 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.805364 kubelet[2438]: E0325 01:28:20.805246 2438 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://78.46.211.139:6443/api/v1/nodes\": dial tcp 78.46.211.139:6443: connect: connection refused" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:20.873896 containerd[1497]: time="2025-03-25T01:28:20.873691083Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-6-22e9b0bb97,Uid:e3fafa487bd60eaa4b00ee05814ddac8,Namespace:kube-system,Attempt:0,}" Mar 25 01:28:20.880250 containerd[1497]: time="2025-03-25T01:28:20.880193242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-6-22e9b0bb97,Uid:55700de9767ba4988f2925b242cf77e0,Namespace:kube-system,Attempt:0,}" Mar 25 01:28:20.888124 containerd[1497]: time="2025-03-25T01:28:20.888030614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-6-22e9b0bb97,Uid:d47669bb28818cbe71f3f62ea49c10d2,Namespace:kube-system,Attempt:0,}" Mar 25 01:28:20.914956 containerd[1497]: time="2025-03-25T01:28:20.913563311Z" level=info msg="connecting to shim 
e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024" address="unix:///run/containerd/s/d3586e5775577857858f36607ab0842535a9c69bf2822a3ce89409b2757d354f" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:28:20.934125 containerd[1497]: time="2025-03-25T01:28:20.933946343Z" level=info msg="connecting to shim 7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8" address="unix:///run/containerd/s/21b88439c3ef7269e1364b6af10071e15ad94baab75d2f4d345c00c492d208b0" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:28:20.949865 containerd[1497]: time="2025-03-25T01:28:20.949330502Z" level=info msg="connecting to shim 664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592" address="unix:///run/containerd/s/31706c4e00910f4264babfeedd99cd0e8ec8675108066916a69d10280f7aaf9f" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:28:20.977059 systemd[1]: Started cri-containerd-7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8.scope - libcontainer container 7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8. Mar 25 01:28:20.978307 systemd[1]: Started cri-containerd-e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024.scope - libcontainer container e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024. Mar 25 01:28:20.999910 systemd[1]: Started cri-containerd-664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592.scope - libcontainer container 664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592. 
Mar 25 01:28:21.022149 kubelet[2438]: E0325 01:28:21.021319 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://78.46.211.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-6-22e9b0bb97?timeout=10s\": dial tcp 78.46.211.139:6443: connect: connection refused" interval="800ms" Mar 25 01:28:21.056514 containerd[1497]: time="2025-03-25T01:28:21.056418867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-6-22e9b0bb97,Uid:e3fafa487bd60eaa4b00ee05814ddac8,Namespace:kube-system,Attempt:0,} returns sandbox id \"e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024\"" Mar 25 01:28:21.064150 containerd[1497]: time="2025-03-25T01:28:21.063662098Z" level=info msg="CreateContainer within sandbox \"e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 25 01:28:21.075662 containerd[1497]: time="2025-03-25T01:28:21.075616954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-6-22e9b0bb97,Uid:55700de9767ba4988f2925b242cf77e0,Namespace:kube-system,Attempt:0,} returns sandbox id \"7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8\"" Mar 25 01:28:21.081335 containerd[1497]: time="2025-03-25T01:28:21.081280017Z" level=info msg="CreateContainer within sandbox \"7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 25 01:28:21.082772 containerd[1497]: time="2025-03-25T01:28:21.082738550Z" level=info msg="Container 110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:21.087404 containerd[1497]: time="2025-03-25T01:28:21.087348101Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-6-22e9b0bb97,Uid:d47669bb28818cbe71f3f62ea49c10d2,Namespace:kube-system,Attempt:0,} returns sandbox id \"664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592\"" Mar 25 01:28:21.091397 containerd[1497]: time="2025-03-25T01:28:21.091351519Z" level=info msg="CreateContainer within sandbox \"664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 25 01:28:21.097510 containerd[1497]: time="2025-03-25T01:28:21.097459041Z" level=info msg="CreateContainer within sandbox \"e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006\"" Mar 25 01:28:21.099858 containerd[1497]: time="2025-03-25T01:28:21.098318722Z" level=info msg="StartContainer for \"110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006\"" Mar 25 01:28:21.100220 containerd[1497]: time="2025-03-25T01:28:21.100172477Z" level=info msg="connecting to shim 110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006" address="unix:///run/containerd/s/d3586e5775577857858f36607ab0842535a9c69bf2822a3ce89409b2757d354f" protocol=ttrpc version=3 Mar 25 01:28:21.101385 containerd[1497]: time="2025-03-25T01:28:21.101346464Z" level=info msg="Container f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:21.112847 containerd[1497]: time="2025-03-25T01:28:21.112610832Z" level=info msg="Container bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:21.116933 containerd[1497]: time="2025-03-25T01:28:21.116882117Z" level=info msg="CreateContainer within sandbox \"7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8\" for 
&ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5\"" Mar 25 01:28:21.117536 containerd[1497]: time="2025-03-25T01:28:21.117415373Z" level=info msg="StartContainer for \"f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5\"" Mar 25 01:28:21.119892 containerd[1497]: time="2025-03-25T01:28:21.119850342Z" level=info msg="connecting to shim f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5" address="unix:///run/containerd/s/21b88439c3ef7269e1364b6af10071e15ad94baab75d2f4d345c00c492d208b0" protocol=ttrpc version=3 Mar 25 01:28:21.123153 systemd[1]: Started cri-containerd-110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006.scope - libcontainer container 110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006. Mar 25 01:28:21.134103 containerd[1497]: time="2025-03-25T01:28:21.132863671Z" level=info msg="CreateContainer within sandbox \"664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8\"" Mar 25 01:28:21.134103 containerd[1497]: time="2025-03-25T01:28:21.133501762Z" level=info msg="StartContainer for \"bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8\"" Mar 25 01:28:21.137764 containerd[1497]: time="2025-03-25T01:28:21.137709250Z" level=info msg="connecting to shim bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8" address="unix:///run/containerd/s/31706c4e00910f4264babfeedd99cd0e8ec8675108066916a69d10280f7aaf9f" protocol=ttrpc version=3 Mar 25 01:28:21.153743 systemd[1]: Started cri-containerd-f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5.scope - libcontainer container f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5. 
Mar 25 01:28:21.164087 systemd[1]: Started cri-containerd-bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8.scope - libcontainer container bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8. Mar 25 01:28:21.194567 containerd[1497]: time="2025-03-25T01:28:21.194514227Z" level=info msg="StartContainer for \"110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006\" returns successfully" Mar 25 01:28:21.207845 kubelet[2438]: I0325 01:28:21.207165 2438 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:21.207845 kubelet[2438]: E0325 01:28:21.207524 2438 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://78.46.211.139:6443/api/v1/nodes\": dial tcp 78.46.211.139:6443: connect: connection refused" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:21.255054 containerd[1497]: time="2025-03-25T01:28:21.254491099Z" level=info msg="StartContainer for \"f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5\" returns successfully" Mar 25 01:28:21.255201 containerd[1497]: time="2025-03-25T01:28:21.255003676Z" level=info msg="StartContainer for \"bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8\" returns successfully" Mar 25 01:28:22.010191 kubelet[2438]: I0325 01:28:22.010116 2438 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:23.405135 kubelet[2438]: I0325 01:28:23.404894 2438 apiserver.go:52] "Watching apiserver" Mar 25 01:28:23.429703 kubelet[2438]: E0325 01:28:23.429652 2438 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-6-22e9b0bb97\" not found" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:23.502504 kubelet[2438]: I0325 01:28:23.502462 2438 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:23.517928 kubelet[2438]: I0325 01:28:23.517794 2438 
desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 25 01:28:25.905841 systemd[1]: Reload requested from client PID 2706 ('systemctl') (unit session-7.scope)... Mar 25 01:28:25.906207 systemd[1]: Reloading... Mar 25 01:28:26.017887 zram_generator::config[2754]: No configuration found. Mar 25 01:28:26.125045 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:28:26.230284 systemd[1]: Reloading finished in 323 ms. Mar 25 01:28:26.256728 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:28:26.273030 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:28:26.273342 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:28:26.273411 systemd[1]: kubelet.service: Consumed 1.925s CPU time, 116.6M memory peak. Mar 25 01:28:26.276318 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:28:26.444186 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:28:26.457610 (kubelet)[2795]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:28:26.512515 kubelet[2795]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:28:26.512515 kubelet[2795]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Mar 25 01:28:26.512515 kubelet[2795]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:28:26.512515 kubelet[2795]: I0325 01:28:26.511041 2795 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:28:26.524920 kubelet[2795]: I0325 01:28:26.523647 2795 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Mar 25 01:28:26.525206 kubelet[2795]: I0325 01:28:26.525185 2795 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:28:26.525831 kubelet[2795]: I0325 01:28:26.525778 2795 server.go:929] "Client rotation is on, will bootstrap in background" Mar 25 01:28:26.528863 kubelet[2795]: I0325 01:28:26.528138 2795 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:28:26.532180 kubelet[2795]: I0325 01:28:26.532139 2795 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:28:26.539088 kubelet[2795]: I0325 01:28:26.539057 2795 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:28:26.543554 kubelet[2795]: I0325 01:28:26.543525 2795 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:28:26.543748 kubelet[2795]: I0325 01:28:26.543677 2795 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 25 01:28:26.543843 kubelet[2795]: I0325 01:28:26.543785 2795 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:28:26.544015 kubelet[2795]: I0325 01:28:26.543844 2795 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-6-22e9b0bb97","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:28:26.544107 kubelet[2795]: I0325 01:28:26.544053 2795 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:28:26.544107 kubelet[2795]: I0325 01:28:26.544063 2795 container_manager_linux.go:300] "Creating device plugin manager" Mar 25 01:28:26.544107 kubelet[2795]: I0325 01:28:26.544099 2795 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:28:26.544275 kubelet[2795]: I0325 01:28:26.544259 2795 kubelet.go:408] "Attempting to sync node with API server" Mar 25 01:28:26.544314 kubelet[2795]: I0325 01:28:26.544284 2795 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:28:26.545063 kubelet[2795]: I0325 01:28:26.544892 2795 kubelet.go:314] "Adding apiserver pod source" Mar 25 01:28:26.545063 kubelet[2795]: I0325 01:28:26.545021 2795 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:28:26.549663 kubelet[2795]: I0325 01:28:26.549539 2795 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:28:26.551021 kubelet[2795]: I0325 01:28:26.550992 2795 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:28:26.562790 kubelet[2795]: I0325 01:28:26.562759 2795 server.go:1269] "Started kubelet" Mar 25 01:28:26.565519 kubelet[2795]: I0325 01:28:26.565345 2795 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:28:26.579892 kubelet[2795]: I0325 01:28:26.577115 2795 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:28:26.580480 kubelet[2795]: I0325 01:28:26.580451 2795 server.go:460] "Adding debug handlers to kubelet server" Mar 25 01:28:26.581529 kubelet[2795]: I0325 01:28:26.581471 2795 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:28:26.582061 kubelet[2795]: I0325 01:28:26.582044 2795 
server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:28:26.582654 kubelet[2795]: I0325 01:28:26.582472 2795 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:28:26.586611 kubelet[2795]: I0325 01:28:26.586513 2795 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 25 01:28:26.586611 kubelet[2795]: I0325 01:28:26.586618 2795 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 25 01:28:26.586790 kubelet[2795]: I0325 01:28:26.586771 2795 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:28:26.588902 kubelet[2795]: I0325 01:28:26.588748 2795 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:28:26.588902 kubelet[2795]: I0325 01:28:26.588885 2795 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:28:26.592143 kubelet[2795]: E0325 01:28:26.592116 2795 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:28:26.595840 kubelet[2795]: I0325 01:28:26.595573 2795 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:28:26.596874 kubelet[2795]: I0325 01:28:26.596853 2795 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 25 01:28:26.597128 kubelet[2795]: I0325 01:28:26.597118 2795 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:28:26.597525 kubelet[2795]: I0325 01:28:26.597194 2795 kubelet.go:2321] "Starting kubelet main sync loop" Mar 25 01:28:26.597525 kubelet[2795]: E0325 01:28:26.597248 2795 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:28:26.601946 kubelet[2795]: I0325 01:28:26.601923 2795 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:28:26.663967 kubelet[2795]: I0325 01:28:26.663911 2795 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:28:26.663967 kubelet[2795]: I0325 01:28:26.663949 2795 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:28:26.664214 kubelet[2795]: I0325 01:28:26.663987 2795 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:28:26.664507 kubelet[2795]: I0325 01:28:26.664457 2795 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:28:26.664555 kubelet[2795]: I0325 01:28:26.664497 2795 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:28:26.664555 kubelet[2795]: I0325 01:28:26.664532 2795 policy_none.go:49] "None policy: Start" Mar 25 01:28:26.665913 kubelet[2795]: I0325 01:28:26.665889 2795 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:28:26.666830 kubelet[2795]: I0325 01:28:26.666056 2795 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:28:26.666830 kubelet[2795]: I0325 01:28:26.666286 2795 state_mem.go:75] "Updated machine memory state" Mar 25 01:28:26.674147 kubelet[2795]: I0325 01:28:26.673907 2795 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:28:26.674147 kubelet[2795]: I0325 01:28:26.674133 2795 eviction_manager.go:189] 
"Eviction manager: starting control loop" Mar 25 01:28:26.674361 kubelet[2795]: I0325 01:28:26.674146 2795 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:28:26.675495 kubelet[2795]: I0325 01:28:26.675198 2795 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:28:26.711387 kubelet[2795]: E0325 01:28:26.711291 2795 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.711744 kubelet[2795]: E0325 01:28:26.711566 2795 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" already exists" pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.785066 kubelet[2795]: I0325 01:28:26.784914 2795 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.797818 kubelet[2795]: I0325 01:28:26.797778 2795 kubelet_node_status.go:111] "Node was previously registered" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.798026 kubelet[2795]: I0325 01:28:26.797892 2795 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888369 kubelet[2795]: I0325 01:28:26.888082 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d47669bb28818cbe71f3f62ea49c10d2-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" (UID: \"d47669bb28818cbe71f3f62ea49c10d2\") " pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888369 kubelet[2795]: I0325 01:28:26.888127 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d47669bb28818cbe71f3f62ea49c10d2-k8s-certs\") pod 
\"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" (UID: \"d47669bb28818cbe71f3f62ea49c10d2\") " pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888369 kubelet[2795]: I0325 01:28:26.888152 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d47669bb28818cbe71f3f62ea49c10d2-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" (UID: \"d47669bb28818cbe71f3f62ea49c10d2\") " pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888369 kubelet[2795]: I0325 01:28:26.888171 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888369 kubelet[2795]: I0325 01:28:26.888199 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888623 kubelet[2795]: I0325 01:28:26.888217 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888623 kubelet[2795]: I0325 01:28:26.888239 2795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/55700de9767ba4988f2925b242cf77e0-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-6-22e9b0bb97\" (UID: \"55700de9767ba4988f2925b242cf77e0\") " pod="kube-system/kube-scheduler-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888623 kubelet[2795]: I0325 01:28:26.888257 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:26.888623 kubelet[2795]: I0325 01:28:26.888277 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e3fafa487bd60eaa4b00ee05814ddac8-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-6-22e9b0bb97\" (UID: \"e3fafa487bd60eaa4b00ee05814ddac8\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:27.547931 kubelet[2795]: I0325 01:28:27.547891 2795 apiserver.go:52] "Watching apiserver" Mar 25 01:28:27.587346 kubelet[2795]: I0325 01:28:27.587293 2795 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 25 01:28:27.714849 kubelet[2795]: E0325 01:28:27.713345 2795 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-6-22e9b0bb97\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" Mar 25 01:28:27.771080 kubelet[2795]: I0325 01:28:27.770972 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-6-22e9b0bb97" podStartSLOduration=2.77094655 podStartE2EDuration="2.77094655s" 
podCreationTimestamp="2025-03-25 01:28:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:28:27.749479867 +0000 UTC m=+1.285593603" watchObservedRunningTime="2025-03-25 01:28:27.77094655 +0000 UTC m=+1.307060326" Mar 25 01:28:27.771783 kubelet[2795]: I0325 01:28:27.771717 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-6-22e9b0bb97" podStartSLOduration=1.771700172 podStartE2EDuration="1.771700172s" podCreationTimestamp="2025-03-25 01:28:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:28:27.771439138 +0000 UTC m=+1.307552954" watchObservedRunningTime="2025-03-25 01:28:27.771700172 +0000 UTC m=+1.307813948" Mar 25 01:28:27.822423 kubelet[2795]: I0325 01:28:27.822198 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-6-22e9b0bb97" podStartSLOduration=3.822178957 podStartE2EDuration="3.822178957s" podCreationTimestamp="2025-03-25 01:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:28:27.80001597 +0000 UTC m=+1.336129786" watchObservedRunningTime="2025-03-25 01:28:27.822178957 +0000 UTC m=+1.358292693" Mar 25 01:28:30.900785 kubelet[2795]: I0325 01:28:30.900717 2795 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:28:30.903122 containerd[1497]: time="2025-03-25T01:28:30.901581785Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 25 01:28:30.903476 kubelet[2795]: I0325 01:28:30.901945 2795 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:28:31.183198 systemd[1]: Created slice kubepods-besteffort-pod9449375f_ee97_496b_8cfd_67b096cc8324.slice - libcontainer container kubepods-besteffort-pod9449375f_ee97_496b_8cfd_67b096cc8324.slice. Mar 25 01:28:31.213472 kubelet[2795]: I0325 01:28:31.213209 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9449375f-ee97-496b-8cfd-67b096cc8324-kube-proxy\") pod \"kube-proxy-c8856\" (UID: \"9449375f-ee97-496b-8cfd-67b096cc8324\") " pod="kube-system/kube-proxy-c8856" Mar 25 01:28:31.213472 kubelet[2795]: I0325 01:28:31.213286 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmsgb\" (UniqueName: \"kubernetes.io/projected/9449375f-ee97-496b-8cfd-67b096cc8324-kube-api-access-bmsgb\") pod \"kube-proxy-c8856\" (UID: \"9449375f-ee97-496b-8cfd-67b096cc8324\") " pod="kube-system/kube-proxy-c8856" Mar 25 01:28:31.213472 kubelet[2795]: I0325 01:28:31.213324 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9449375f-ee97-496b-8cfd-67b096cc8324-xtables-lock\") pod \"kube-proxy-c8856\" (UID: \"9449375f-ee97-496b-8cfd-67b096cc8324\") " pod="kube-system/kube-proxy-c8856" Mar 25 01:28:31.213472 kubelet[2795]: I0325 01:28:31.213368 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9449375f-ee97-496b-8cfd-67b096cc8324-lib-modules\") pod \"kube-proxy-c8856\" (UID: \"9449375f-ee97-496b-8cfd-67b096cc8324\") " pod="kube-system/kube-proxy-c8856" Mar 25 01:28:31.326655 kubelet[2795]: E0325 01:28:31.326546 2795 projected.go:288] Couldn't get configMap 
kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 25 01:28:31.326655 kubelet[2795]: E0325 01:28:31.326594 2795 projected.go:194] Error preparing data for projected volume kube-api-access-bmsgb for pod kube-system/kube-proxy-c8856: configmap "kube-root-ca.crt" not found Mar 25 01:28:31.326874 kubelet[2795]: E0325 01:28:31.326675 2795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9449375f-ee97-496b-8cfd-67b096cc8324-kube-api-access-bmsgb podName:9449375f-ee97-496b-8cfd-67b096cc8324 nodeName:}" failed. No retries permitted until 2025-03-25 01:28:31.826647384 +0000 UTC m=+5.362761160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-bmsgb" (UniqueName: "kubernetes.io/projected/9449375f-ee97-496b-8cfd-67b096cc8324-kube-api-access-bmsgb") pod "kube-proxy-c8856" (UID: "9449375f-ee97-496b-8cfd-67b096cc8324") : configmap "kube-root-ca.crt" not found Mar 25 01:28:31.831698 sudo[1897]: pam_unix(sudo:session): session closed for user root Mar 25 01:28:31.991149 sshd[1896]: Connection closed by 139.178.89.65 port 41158 Mar 25 01:28:31.992255 sshd-session[1894]: pam_unix(sshd:session): session closed for user core Mar 25 01:28:31.998566 systemd[1]: sshd@7-78.46.211.139:22-139.178.89.65:41158.service: Deactivated successfully. Mar 25 01:28:32.001771 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:28:32.002332 systemd[1]: session-7.scope: Consumed 6.979s CPU time, 225.9M memory peak. Mar 25 01:28:32.004084 systemd-logind[1475]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:28:32.006953 systemd-logind[1475]: Removed session 7. 
Mar 25 01:28:32.095383 containerd[1497]: time="2025-03-25T01:28:32.095088374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c8856,Uid:9449375f-ee97-496b-8cfd-67b096cc8324,Namespace:kube-system,Attempt:0,}" Mar 25 01:28:32.103019 systemd[1]: Created slice kubepods-besteffort-pod4e6d9e05_8b68_401d_9ba8_4bbb88b6c244.slice - libcontainer container kubepods-besteffort-pod4e6d9e05_8b68_401d_9ba8_4bbb88b6c244.slice. Mar 25 01:28:32.122305 kubelet[2795]: I0325 01:28:32.122258 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4e6d9e05-8b68-401d-9ba8-4bbb88b6c244-var-lib-calico\") pod \"tigera-operator-64ff5465b7-tzft2\" (UID: \"4e6d9e05-8b68-401d-9ba8-4bbb88b6c244\") " pod="tigera-operator/tigera-operator-64ff5465b7-tzft2" Mar 25 01:28:32.122305 kubelet[2795]: I0325 01:28:32.122306 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2nqk\" (UniqueName: \"kubernetes.io/projected/4e6d9e05-8b68-401d-9ba8-4bbb88b6c244-kube-api-access-h2nqk\") pod \"tigera-operator-64ff5465b7-tzft2\" (UID: \"4e6d9e05-8b68-401d-9ba8-4bbb88b6c244\") " pod="tigera-operator/tigera-operator-64ff5465b7-tzft2" Mar 25 01:28:32.139303 containerd[1497]: time="2025-03-25T01:28:32.139225532Z" level=info msg="connecting to shim dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648" address="unix:///run/containerd/s/263c2ee5765bcb3e10add1903a1a3175097b023a600565457d2611ee276ed0e3" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:28:32.182053 systemd[1]: Started cri-containerd-dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648.scope - libcontainer container dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648. 
Mar 25 01:28:32.221159 containerd[1497]: time="2025-03-25T01:28:32.221098907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-c8856,Uid:9449375f-ee97-496b-8cfd-67b096cc8324,Namespace:kube-system,Attempt:0,} returns sandbox id \"dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648\"" Mar 25 01:28:32.227289 containerd[1497]: time="2025-03-25T01:28:32.227200691Z" level=info msg="CreateContainer within sandbox \"dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:28:32.251450 containerd[1497]: time="2025-03-25T01:28:32.251385631Z" level=info msg="Container 9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:32.265441 containerd[1497]: time="2025-03-25T01:28:32.265389944Z" level=info msg="CreateContainer within sandbox \"dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a\"" Mar 25 01:28:32.267348 containerd[1497]: time="2025-03-25T01:28:32.267230847Z" level=info msg="StartContainer for \"9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a\"" Mar 25 01:28:32.269567 containerd[1497]: time="2025-03-25T01:28:32.269444667Z" level=info msg="connecting to shim 9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a" address="unix:///run/containerd/s/263c2ee5765bcb3e10add1903a1a3175097b023a600565457d2611ee276ed0e3" protocol=ttrpc version=3 Mar 25 01:28:32.290158 systemd[1]: Started cri-containerd-9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a.scope - libcontainer container 9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a. 
Mar 25 01:28:32.335964 containerd[1497]: time="2025-03-25T01:28:32.335926101Z" level=info msg="StartContainer for \"9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a\" returns successfully" Mar 25 01:28:32.411701 containerd[1497]: time="2025-03-25T01:28:32.411286415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-tzft2,Uid:4e6d9e05-8b68-401d-9ba8-4bbb88b6c244,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:28:32.438102 containerd[1497]: time="2025-03-25T01:28:32.438052851Z" level=info msg="connecting to shim 33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001" address="unix:///run/containerd/s/4bde231bb79f3650548149cec450d6fc9a59105a3ee6a6fd0793bc9447ca4665" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:28:32.467797 systemd[1]: Started cri-containerd-33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001.scope - libcontainer container 33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001. Mar 25 01:28:32.519375 containerd[1497]: time="2025-03-25T01:28:32.518671837Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-tzft2,Uid:4e6d9e05-8b68-401d-9ba8-4bbb88b6c244,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001\"" Mar 25 01:28:32.523165 containerd[1497]: time="2025-03-25T01:28:32.523124557Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:28:32.673509 kubelet[2795]: I0325 01:28:32.673325 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-c8856" podStartSLOduration=1.673283469 podStartE2EDuration="1.673283469s" podCreationTimestamp="2025-03-25 01:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:28:32.672957232 +0000 UTC m=+6.209071008" watchObservedRunningTime="2025-03-25 
01:28:32.673283469 +0000 UTC m=+6.209397245" Mar 25 01:28:41.499490 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1071646691.mount: Deactivated successfully. Mar 25 01:28:41.820204 containerd[1497]: time="2025-03-25T01:28:41.820003473Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:41.822220 containerd[1497]: time="2025-03-25T01:28:41.821351930Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=19271115" Mar 25 01:28:41.823175 containerd[1497]: time="2025-03-25T01:28:41.823087352Z" level=info msg="ImageCreate event name:\"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:41.831862 containerd[1497]: time="2025-03-25T01:28:41.830960250Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:41.831862 containerd[1497]: time="2025-03-25T01:28:41.831644219Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"19267110\" in 9.308294024s" Mar 25 01:28:41.831862 containerd[1497]: time="2025-03-25T01:28:41.831680419Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:a709184cc04589116e7266cb3575491ae8f2ac1c959975fea966447025f66eaa\"" Mar 25 01:28:41.837607 containerd[1497]: time="2025-03-25T01:28:41.837555093Z" level=info msg="CreateContainer within sandbox \"33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:28:41.847511 containerd[1497]: time="2025-03-25T01:28:41.847449657Z" level=info msg="Container bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:41.860757 containerd[1497]: time="2025-03-25T01:28:41.860704423Z" level=info msg="CreateContainer within sandbox \"33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae\"" Mar 25 01:28:41.861908 containerd[1497]: time="2025-03-25T01:28:41.861863837Z" level=info msg="StartContainer for \"bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae\"" Mar 25 01:28:41.863398 containerd[1497]: time="2025-03-25T01:28:41.863342216Z" level=info msg="connecting to shim bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae" address="unix:///run/containerd/s/4bde231bb79f3650548149cec450d6fc9a59105a3ee6a6fd0793bc9447ca4665" protocol=ttrpc version=3 Mar 25 01:28:41.892064 systemd[1]: Started cri-containerd-bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae.scope - libcontainer container bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae. 
Mar 25 01:28:41.933655 containerd[1497]: time="2025-03-25T01:28:41.932772365Z" level=info msg="StartContainer for \"bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae\" returns successfully" Mar 25 01:28:47.007074 kubelet[2795]: I0325 01:28:47.006998 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-tzft2" podStartSLOduration=5.693916424 podStartE2EDuration="15.00697433s" podCreationTimestamp="2025-03-25 01:28:32 +0000 UTC" firstStartedPulling="2025-03-25 01:28:32.520645059 +0000 UTC m=+6.056758835" lastFinishedPulling="2025-03-25 01:28:41.833702925 +0000 UTC m=+15.369816741" observedRunningTime="2025-03-25 01:28:42.700713531 +0000 UTC m=+16.236827347" watchObservedRunningTime="2025-03-25 01:28:47.00697433 +0000 UTC m=+20.543088106" Mar 25 01:28:47.017298 systemd[1]: Created slice kubepods-besteffort-podb19b44c0_0c50_44ac_9f5d_70c0346115db.slice - libcontainer container kubepods-besteffort-podb19b44c0_0c50_44ac_9f5d_70c0346115db.slice. 
Mar 25 01:28:47.027158 kubelet[2795]: I0325 01:28:47.026985 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b19b44c0-0c50-44ac-9f5d-70c0346115db-typha-certs\") pod \"calico-typha-7b79c6dc96-nfbwc\" (UID: \"b19b44c0-0c50-44ac-9f5d-70c0346115db\") " pod="calico-system/calico-typha-7b79c6dc96-nfbwc" Mar 25 01:28:47.027158 kubelet[2795]: I0325 01:28:47.027037 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b19b44c0-0c50-44ac-9f5d-70c0346115db-tigera-ca-bundle\") pod \"calico-typha-7b79c6dc96-nfbwc\" (UID: \"b19b44c0-0c50-44ac-9f5d-70c0346115db\") " pod="calico-system/calico-typha-7b79c6dc96-nfbwc" Mar 25 01:28:47.027158 kubelet[2795]: I0325 01:28:47.027060 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjzm\" (UniqueName: \"kubernetes.io/projected/b19b44c0-0c50-44ac-9f5d-70c0346115db-kube-api-access-gbjzm\") pod \"calico-typha-7b79c6dc96-nfbwc\" (UID: \"b19b44c0-0c50-44ac-9f5d-70c0346115db\") " pod="calico-system/calico-typha-7b79c6dc96-nfbwc" Mar 25 01:28:47.179189 systemd[1]: Created slice kubepods-besteffort-podbde7fb56_43cc_4fb9_9b15_61aa4c6b5651.slice - libcontainer container kubepods-besteffort-podbde7fb56_43cc_4fb9_9b15_61aa4c6b5651.slice. 
Mar 25 01:28:47.229116 kubelet[2795]: I0325 01:28:47.229056 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-cni-log-dir\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229116 kubelet[2795]: I0325 01:28:47.229109 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-lib-modules\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229549 kubelet[2795]: I0325 01:28:47.229133 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-cni-net-dir\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229549 kubelet[2795]: I0325 01:28:47.229153 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-xtables-lock\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229549 kubelet[2795]: I0325 01:28:47.229173 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-node-certs\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229549 kubelet[2795]: I0325 01:28:47.229191 2795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-flexvol-driver-host\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229549 kubelet[2795]: I0325 01:28:47.229210 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt2tg\" (UniqueName: \"kubernetes.io/projected/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-kube-api-access-bt2tg\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229905 kubelet[2795]: I0325 01:28:47.229230 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-var-run-calico\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229905 kubelet[2795]: I0325 01:28:47.229247 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-var-lib-calico\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229905 kubelet[2795]: I0325 01:28:47.229263 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-tigera-ca-bundle\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229905 kubelet[2795]: I0325 01:28:47.229281 2795 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-policysync\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.229905 kubelet[2795]: I0325 01:28:47.229301 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/bde7fb56-43cc-4fb9-9b15-61aa4c6b5651-cni-bin-dir\") pod \"calico-node-klltq\" (UID: \"bde7fb56-43cc-4fb9-9b15-61aa4c6b5651\") " pod="calico-system/calico-node-klltq" Mar 25 01:28:47.317843 kubelet[2795]: E0325 01:28:47.317074 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-t6zpq" podUID="455e7431-91bf-4680-a141-9fa18af89c18" Mar 25 01:28:47.322839 containerd[1497]: time="2025-03-25T01:28:47.322053168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7b79c6dc96-nfbwc,Uid:b19b44c0-0c50-44ac-9f5d-70c0346115db,Namespace:calico-system,Attempt:0,}" Mar 25 01:28:47.330013 kubelet[2795]: I0325 01:28:47.329954 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/455e7431-91bf-4680-a141-9fa18af89c18-varrun\") pod \"csi-node-driver-t6zpq\" (UID: \"455e7431-91bf-4680-a141-9fa18af89c18\") " pod="calico-system/csi-node-driver-t6zpq" Mar 25 01:28:47.330159 kubelet[2795]: I0325 01:28:47.330032 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqr9\" (UniqueName: \"kubernetes.io/projected/455e7431-91bf-4680-a141-9fa18af89c18-kube-api-access-xlqr9\") pod \"csi-node-driver-t6zpq\" (UID: 
\"455e7431-91bf-4680-a141-9fa18af89c18\") " pod="calico-system/csi-node-driver-t6zpq" Mar 25 01:28:47.330159 kubelet[2795]: I0325 01:28:47.330054 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/455e7431-91bf-4680-a141-9fa18af89c18-socket-dir\") pod \"csi-node-driver-t6zpq\" (UID: \"455e7431-91bf-4680-a141-9fa18af89c18\") " pod="calico-system/csi-node-driver-t6zpq" Mar 25 01:28:47.330159 kubelet[2795]: I0325 01:28:47.330069 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/455e7431-91bf-4680-a141-9fa18af89c18-registration-dir\") pod \"csi-node-driver-t6zpq\" (UID: \"455e7431-91bf-4680-a141-9fa18af89c18\") " pod="calico-system/csi-node-driver-t6zpq" Mar 25 01:28:47.330159 kubelet[2795]: I0325 01:28:47.330096 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/455e7431-91bf-4680-a141-9fa18af89c18-kubelet-dir\") pod \"csi-node-driver-t6zpq\" (UID: \"455e7431-91bf-4680-a141-9fa18af89c18\") " pod="calico-system/csi-node-driver-t6zpq" Mar 25 01:28:47.340864 kubelet[2795]: E0325 01:28:47.337188 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.340864 kubelet[2795]: W0325 01:28:47.337216 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.340864 kubelet[2795]: E0325 01:28:47.337481 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.340864 kubelet[2795]: E0325 01:28:47.338846 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.340864 kubelet[2795]: W0325 01:28:47.338868 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.340864 kubelet[2795]: E0325 01:28:47.338982 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.340864 kubelet[2795]: E0325 01:28:47.339350 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.340864 kubelet[2795]: W0325 01:28:47.339360 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.340864 kubelet[2795]: E0325 01:28:47.339531 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.340864 kubelet[2795]: E0325 01:28:47.339586 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.341681 kubelet[2795]: W0325 01:28:47.339594 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.341681 kubelet[2795]: E0325 01:28:47.339605 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.341681 kubelet[2795]: E0325 01:28:47.340123 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.341681 kubelet[2795]: W0325 01:28:47.340137 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.341681 kubelet[2795]: E0325 01:28:47.340154 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.341681 kubelet[2795]: E0325 01:28:47.340973 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.341681 kubelet[2795]: W0325 01:28:47.340989 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.341681 kubelet[2795]: E0325 01:28:47.341012 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.370931 containerd[1497]: time="2025-03-25T01:28:47.370682809Z" level=info msg="connecting to shim ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0" address="unix:///run/containerd/s/ed411b5943090a836cd051d55600416b1a0977bd3bb63d7334a8e8951b62d312" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:28:47.377859 kubelet[2795]: E0325 01:28:47.372671 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.377859 kubelet[2795]: W0325 01:28:47.372713 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.377859 kubelet[2795]: E0325 01:28:47.372736 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.410081 systemd[1]: Started cri-containerd-ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0.scope - libcontainer container ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0. 
Mar 25 01:28:47.431237 kubelet[2795]: E0325 01:28:47.431186 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.431237 kubelet[2795]: W0325 01:28:47.431226 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.431460 kubelet[2795]: E0325 01:28:47.431250 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.431698 kubelet[2795]: E0325 01:28:47.431661 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.431698 kubelet[2795]: W0325 01:28:47.431683 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.431793 kubelet[2795]: E0325 01:28:47.431722 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.432017 kubelet[2795]: E0325 01:28:47.431987 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.432017 kubelet[2795]: W0325 01:28:47.432008 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.432093 kubelet[2795]: E0325 01:28:47.432035 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.432861 kubelet[2795]: E0325 01:28:47.432320 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.432861 kubelet[2795]: W0325 01:28:47.432339 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.432861 kubelet[2795]: E0325 01:28:47.432522 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.432861 kubelet[2795]: E0325 01:28:47.432607 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.432861 kubelet[2795]: W0325 01:28:47.432615 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.432861 kubelet[2795]: E0325 01:28:47.432634 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.432861 kubelet[2795]: E0325 01:28:47.432803 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.432861 kubelet[2795]: W0325 01:28:47.432836 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.432861 kubelet[2795]: E0325 01:28:47.432851 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.433118 kubelet[2795]: E0325 01:28:47.433016 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.433118 kubelet[2795]: W0325 01:28:47.433026 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.433118 kubelet[2795]: E0325 01:28:47.433073 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.433835 kubelet[2795]: E0325 01:28:47.433237 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.433835 kubelet[2795]: W0325 01:28:47.433252 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.433835 kubelet[2795]: E0325 01:28:47.433331 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.433835 kubelet[2795]: E0325 01:28:47.433456 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.433835 kubelet[2795]: W0325 01:28:47.433463 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.433835 kubelet[2795]: E0325 01:28:47.433541 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.433835 kubelet[2795]: E0325 01:28:47.433642 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.433835 kubelet[2795]: W0325 01:28:47.433650 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.433835 kubelet[2795]: E0325 01:28:47.433664 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.434087 kubelet[2795]: E0325 01:28:47.433976 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.434087 kubelet[2795]: W0325 01:28:47.433987 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.434087 kubelet[2795]: E0325 01:28:47.434032 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.434844 kubelet[2795]: E0325 01:28:47.434167 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.434844 kubelet[2795]: W0325 01:28:47.434189 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.434844 kubelet[2795]: E0325 01:28:47.434264 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.434844 kubelet[2795]: E0325 01:28:47.434494 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.434844 kubelet[2795]: W0325 01:28:47.434506 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.434844 kubelet[2795]: E0325 01:28:47.434635 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.434844 kubelet[2795]: E0325 01:28:47.434830 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.434844 kubelet[2795]: W0325 01:28:47.434841 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.435059 kubelet[2795]: E0325 01:28:47.434898 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.435131 kubelet[2795]: E0325 01:28:47.435069 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.435131 kubelet[2795]: W0325 01:28:47.435080 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.435171 kubelet[2795]: E0325 01:28:47.435143 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.436075 kubelet[2795]: E0325 01:28:47.436046 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.436075 kubelet[2795]: W0325 01:28:47.436066 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.436869 kubelet[2795]: E0325 01:28:47.436173 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.436869 kubelet[2795]: E0325 01:28:47.436283 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.436869 kubelet[2795]: W0325 01:28:47.436291 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.436869 kubelet[2795]: E0325 01:28:47.436372 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.436869 kubelet[2795]: E0325 01:28:47.436500 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.436869 kubelet[2795]: W0325 01:28:47.436507 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.436869 kubelet[2795]: E0325 01:28:47.436575 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.436869 kubelet[2795]: E0325 01:28:47.436794 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.436869 kubelet[2795]: W0325 01:28:47.436802 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.436869 kubelet[2795]: E0325 01:28:47.436880 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.437122 kubelet[2795]: E0325 01:28:47.437003 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.437122 kubelet[2795]: W0325 01:28:47.437010 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.437122 kubelet[2795]: E0325 01:28:47.437021 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.437240 kubelet[2795]: E0325 01:28:47.437206 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.437240 kubelet[2795]: W0325 01:28:47.437217 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.437240 kubelet[2795]: E0325 01:28:47.437232 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.438834 kubelet[2795]: E0325 01:28:47.437713 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.438834 kubelet[2795]: W0325 01:28:47.437786 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.438834 kubelet[2795]: E0325 01:28:47.437863 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.438834 kubelet[2795]: E0325 01:28:47.438090 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.438834 kubelet[2795]: W0325 01:28:47.438101 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.438834 kubelet[2795]: E0325 01:28:47.438176 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.438834 kubelet[2795]: E0325 01:28:47.438443 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.438834 kubelet[2795]: W0325 01:28:47.438455 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.438834 kubelet[2795]: E0325 01:28:47.438551 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.438834 kubelet[2795]: E0325 01:28:47.438684 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.439175 kubelet[2795]: W0325 01:28:47.438691 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.439175 kubelet[2795]: E0325 01:28:47.438700 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:47.472333 kubelet[2795]: E0325 01:28:47.472292 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:47.472333 kubelet[2795]: W0325 01:28:47.472324 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:47.472788 kubelet[2795]: E0325 01:28:47.472363 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:47.493600 containerd[1497]: time="2025-03-25T01:28:47.493462018Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-klltq,Uid:bde7fb56-43cc-4fb9-9b15-61aa4c6b5651,Namespace:calico-system,Attempt:0,}" Mar 25 01:28:47.535782 containerd[1497]: time="2025-03-25T01:28:47.534982649Z" level=info msg="connecting to shim 0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7" address="unix:///run/containerd/s/e6ea8904e1571de8a31af665a33ec794fa32e8548fd78a48e28aa4df0ecade64" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:28:47.537769 containerd[1497]: time="2025-03-25T01:28:47.536967937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7b79c6dc96-nfbwc,Uid:b19b44c0-0c50-44ac-9f5d-70c0346115db,Namespace:calico-system,Attempt:0,} returns sandbox id \"ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0\"" Mar 25 01:28:47.541834 containerd[1497]: time="2025-03-25T01:28:47.541668329Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\"" Mar 25 01:28:47.580139 systemd[1]: Started cri-containerd-0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7.scope - libcontainer container 0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7. 
Mar 25 01:28:47.631086 containerd[1497]: time="2025-03-25T01:28:47.631025621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-klltq,Uid:bde7fb56-43cc-4fb9-9b15-61aa4c6b5651,Namespace:calico-system,Attempt:0,} returns sandbox id \"0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7\"" Mar 25 01:28:49.313859 containerd[1497]: time="2025-03-25T01:28:49.313602859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:49.314709 containerd[1497]: time="2025-03-25T01:28:49.314395001Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=28363957" Mar 25 01:28:49.315659 containerd[1497]: time="2025-03-25T01:28:49.315617874Z" level=info msg="ImageCreate event name:\"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:49.318518 containerd[1497]: time="2025-03-25T01:28:49.318471071Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:49.319469 containerd[1497]: time="2025-03-25T01:28:49.319315534Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"29733706\" in 1.777592204s" Mar 25 01:28:49.319469 containerd[1497]: time="2025-03-25T01:28:49.319352175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:38a4e8457549414848315eae0d5ab8ecd6c51f4baaea849fe5edce714d81a999\"" Mar 25 01:28:49.322840 
containerd[1497]: time="2025-03-25T01:28:49.322690626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 01:28:49.339611 containerd[1497]: time="2025-03-25T01:28:49.339507563Z" level=info msg="CreateContainer within sandbox \"ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:28:49.355329 containerd[1497]: time="2025-03-25T01:28:49.355183710Z" level=info msg="Container e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:49.364133 containerd[1497]: time="2025-03-25T01:28:49.364021670Z" level=info msg="CreateContainer within sandbox \"ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f\"" Mar 25 01:28:49.364698 containerd[1497]: time="2025-03-25T01:28:49.364650447Z" level=info msg="StartContainer for \"e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f\"" Mar 25 01:28:49.366608 containerd[1497]: time="2025-03-25T01:28:49.366513298Z" level=info msg="connecting to shim e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f" address="unix:///run/containerd/s/ed411b5943090a836cd051d55600416b1a0977bd3bb63d7334a8e8951b62d312" protocol=ttrpc version=3 Mar 25 01:28:49.399378 systemd[1]: Started cri-containerd-e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f.scope - libcontainer container e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f. 
Mar 25 01:28:49.460157 containerd[1497]: time="2025-03-25T01:28:49.460024400Z" level=info msg="StartContainer for \"e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f\" returns successfully" Mar 25 01:28:49.597727 kubelet[2795]: E0325 01:28:49.597655 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-t6zpq" podUID="455e7431-91bf-4680-a141-9fa18af89c18" Mar 25 01:28:49.737259 kubelet[2795]: E0325 01:28:49.737209 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:49.737259 kubelet[2795]: W0325 01:28:49.737253 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:49.737518 kubelet[2795]: E0325 01:28:49.737285 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:49.737778 kubelet[2795]: E0325 01:28:49.737730 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:49.737903 kubelet[2795]: W0325 01:28:49.737779 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:49.737903 kubelet[2795]: E0325 01:28:49.737804 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:50.610049 containerd[1497]: time="2025-03-25T01:28:50.608711592Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:50.611829 containerd[1497]: time="2025-03-25T01:28:50.611747240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5120152" Mar 25 01:28:50.613630 containerd[1497]: time="2025-03-25T01:28:50.613584132Z" level=info msg="ImageCreate event name:\"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:50.617739 containerd[1497]: time="2025-03-25T01:28:50.617692691Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:50.619247 containerd[1497]: time="2025-03-25T01:28:50.619093891Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6489869\" in 1.296361224s" Mar 25 01:28:50.619247 containerd[1497]: time="2025-03-25T01:28:50.619142852Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:bf0e51f0111c4e6f7bc448c15934e73123805f3c5e66e455c7eb7392854e0921\"" Mar 25 01:28:50.625128 containerd[1497]: time="2025-03-25T01:28:50.625059263Z" level=info msg="CreateContainer within sandbox \"0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:28:50.645209 containerd[1497]: time="2025-03-25T01:28:50.642982378Z" level=info msg="Container 51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:50.656841 containerd[1497]: time="2025-03-25T01:28:50.656773855Z" level=info msg="CreateContainer within sandbox \"0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e\"" Mar 25 01:28:50.658225 containerd[1497]: time="2025-03-25T01:28:50.658162815Z" level=info msg="StartContainer for \"51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e\"" Mar 25 01:28:50.662431 containerd[1497]: time="2025-03-25T01:28:50.662371736Z" level=info msg="connecting to shim 51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e" address="unix:///run/containerd/s/e6ea8904e1571de8a31af665a33ec794fa32e8548fd78a48e28aa4df0ecade64" protocol=ttrpc version=3 Mar 25 01:28:50.692065 systemd[1]: Started cri-containerd-51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e.scope - libcontainer container 51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e. 
Mar 25 01:28:50.716577 kubelet[2795]: I0325 01:28:50.716533 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:28:50.747234 containerd[1497]: time="2025-03-25T01:28:50.747145775Z" level=info msg="StartContainer for \"51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e\" returns successfully" Mar 25 01:28:50.751842 kubelet[2795]: E0325 01:28:50.751769 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:50.751842 kubelet[2795]: W0325 01:28:50.751796 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:50.752161 kubelet[2795]: E0325 01:28:50.751860 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:50.752823 kubelet[2795]: E0325 01:28:50.752779 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:50.752897 kubelet[2795]: W0325 01:28:50.752814 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:50.752897 kubelet[2795]: E0325 01:28:50.752848 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:50.753881 kubelet[2795]: E0325 01:28:50.753838 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:50.753881 kubelet[2795]: W0325 01:28:50.753865 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:50.753881 kubelet[2795]: E0325 01:28:50.753885 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:50.755004 kubelet[2795]: E0325 01:28:50.754967 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:50.755004 kubelet[2795]: W0325 01:28:50.754995 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:50.755138 kubelet[2795]: E0325 01:28:50.755016 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:50.772396 kubelet[2795]: E0325 01:28:50.771989 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:50.772544 kubelet[2795]: W0325 01:28:50.772399 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:50.772544 kubelet[2795]: E0325 01:28:50.772455 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:50.774303 kubelet[2795]: E0325 01:28:50.774039 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:50.774303 kubelet[2795]: W0325 01:28:50.774059 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:50.774868 kubelet[2795]: E0325 01:28:50.774840 2795 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:28:50.774868 kubelet[2795]: W0325 01:28:50.774863 2795 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:28:50.777365 systemd[1]: cri-containerd-51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e.scope: Deactivated successfully. Mar 25 01:28:50.778962 kubelet[2795]: E0325 01:28:50.778486 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:28:50.778962 kubelet[2795]: E0325 01:28:50.778515 2795 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:28:50.786824 containerd[1497]: time="2025-03-25T01:28:50.786646992Z" level=info msg="TaskExit event in podsandbox handler container_id:\"51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e\" id:\"51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e\" pid:3403 exited_at:{seconds:1742866130 nanos:780472894}" Mar 25 01:28:50.787239 containerd[1497]: time="2025-03-25T01:28:50.787074164Z" level=info msg="received exit event container_id:\"51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e\" id:\"51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e\" pid:3403 exited_at:{seconds:1742866130 nanos:780472894}" Mar 25 01:28:50.815005 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e-rootfs.mount: Deactivated successfully. 
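The repeated kubelet errors above come from FlexVolume plugin probing: the kubelet finds the directory `nodeagent~uds` under its plugin path, executes the driver binary with the `init` argument, and expects a JSON status object on stdout; since the binary is absent, the output is empty and JSON unmarshalling fails. A minimal check one might run on the node (paths taken from the log itself; this is an illustrative sketch, not official tooling) looks like:

```shell
# The kubelet probes each directory under the FlexVolume plugin dir by
# executing "<driver> init" and parsing stdout as JSON; a missing or
# non-executable binary yields the "unexpected end of JSON input" errors above.
PLUGIN=/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds
if [ -x "$PLUGIN" ]; then
  "$PLUGIN" init    # a working driver prints a JSON status object here
else
  echo "driver missing or not executable: $PLUGIN"
fi
```

On this node the `else` branch fires, which is consistent with the `executable file not found in $PATH` message logged by driver-call.go.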
Mar 25 01:28:51.598119 kubelet[2795]: E0325 01:28:51.598053 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-t6zpq" podUID="455e7431-91bf-4680-a141-9fa18af89c18" Mar 25 01:28:51.725780 containerd[1497]: time="2025-03-25T01:28:51.725595840Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:28:51.752995 kubelet[2795]: I0325 01:28:51.752649 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7b79c6dc96-nfbwc" podStartSLOduration=3.971563488 podStartE2EDuration="5.752627339s" podCreationTimestamp="2025-03-25 01:28:46 +0000 UTC" firstStartedPulling="2025-03-25 01:28:47.539439956 +0000 UTC m=+21.075553732" lastFinishedPulling="2025-03-25 01:28:49.320503807 +0000 UTC m=+22.856617583" observedRunningTime="2025-03-25 01:28:49.726173676 +0000 UTC m=+23.262287452" watchObservedRunningTime="2025-03-25 01:28:51.752627339 +0000 UTC m=+25.288741115" Mar 25 01:28:53.597876 kubelet[2795]: E0325 01:28:53.597595 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-t6zpq" podUID="455e7431-91bf-4680-a141-9fa18af89c18" Mar 25 01:28:54.452801 containerd[1497]: time="2025-03-25T01:28:54.452727101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:54.456165 containerd[1497]: time="2025-03-25T01:28:54.455940732Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=91227396" Mar 25 01:28:54.459205 containerd[1497]: 
time="2025-03-25T01:28:54.457892600Z" level=info msg="ImageCreate event name:\"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:54.461829 containerd[1497]: time="2025-03-25T01:28:54.461718773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:54.462593 containerd[1497]: time="2025-03-25T01:28:54.462558762Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"92597153\" in 2.736907s" Mar 25 01:28:54.462720 containerd[1497]: time="2025-03-25T01:28:54.462704327Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:57c2b1dcdc0045be5220c7237f900bce5f47c006714073859cf102b0eaa65290\"" Mar 25 01:28:54.466221 containerd[1497]: time="2025-03-25T01:28:54.466173927Z" level=info msg="CreateContainer within sandbox \"0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:28:54.478098 containerd[1497]: time="2025-03-25T01:28:54.478037338Z" level=info msg="Container 8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:54.494845 containerd[1497]: time="2025-03-25T01:28:54.493755442Z" level=info msg="CreateContainer within sandbox \"0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4\"" Mar 25 
01:28:54.496186 containerd[1497]: time="2025-03-25T01:28:54.496138285Z" level=info msg="StartContainer for \"8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4\"" Mar 25 01:28:54.497804 containerd[1497]: time="2025-03-25T01:28:54.497758501Z" level=info msg="connecting to shim 8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4" address="unix:///run/containerd/s/e6ea8904e1571de8a31af665a33ec794fa32e8548fd78a48e28aa4df0ecade64" protocol=ttrpc version=3 Mar 25 01:28:54.531105 systemd[1]: Started cri-containerd-8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4.scope - libcontainer container 8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4. Mar 25 01:28:54.617372 containerd[1497]: time="2025-03-25T01:28:54.616891426Z" level=info msg="StartContainer for \"8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4\" returns successfully" Mar 25 01:28:55.229979 systemd[1]: cri-containerd-8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4.scope: Deactivated successfully. Mar 25 01:28:55.231880 systemd[1]: cri-containerd-8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4.scope: Consumed 506ms CPU time, 170.3M memory peak, 150.3M written to disk. 
Mar 25 01:28:55.232468 containerd[1497]: time="2025-03-25T01:28:55.232386175Z" level=info msg="received exit event container_id:\"8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4\" id:\"8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4\" pid:3484 exited_at:{seconds:1742866135 nanos:231793714}" Mar 25 01:28:55.233232 containerd[1497]: time="2025-03-25T01:28:55.232766429Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4\" id:\"8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4\" pid:3484 exited_at:{seconds:1742866135 nanos:231793714}" Mar 25 01:28:55.256689 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4-rootfs.mount: Deactivated successfully. Mar 25 01:28:55.280646 kubelet[2795]: I0325 01:28:55.280601 2795 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Mar 25 01:28:55.338262 systemd[1]: Created slice kubepods-burstable-pod2c508311_23b5_48ed_a7d6_4e74589ff007.slice - libcontainer container kubepods-burstable-pod2c508311_23b5_48ed_a7d6_4e74589ff007.slice. 
Mar 25 01:28:55.348829 kubelet[2795]: W0325 01:28:55.348706 2795 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4284-0-0-6-22e9b0bb97" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4284-0-0-6-22e9b0bb97' and this object Mar 25 01:28:55.348829 kubelet[2795]: E0325 01:28:55.348764 2795 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4284-0-0-6-22e9b0bb97\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4284-0-0-6-22e9b0bb97' and this object" logger="UnhandledError" Mar 25 01:28:55.353834 kubelet[2795]: W0325 01:28:55.349024 2795 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-6-22e9b0bb97" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4284-0-0-6-22e9b0bb97' and this object Mar 25 01:28:55.353834 kubelet[2795]: E0325 01:28:55.349044 2795 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4284-0-0-6-22e9b0bb97\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4284-0-0-6-22e9b0bb97' and this object" logger="UnhandledError" Mar 25 01:28:55.349993 systemd[1]: Created slice kubepods-burstable-pode1099ad4_bb29_4692_8b74_1f06f414103c.slice - libcontainer container 
kubepods-burstable-pode1099ad4_bb29_4692_8b74_1f06f414103c.slice. Mar 25 01:28:55.368124 systemd[1]: Created slice kubepods-besteffort-pod35d02384_32ff_4cc9_8558_540e7f668e10.slice - libcontainer container kubepods-besteffort-pod35d02384_32ff_4cc9_8558_540e7f668e10.slice. Mar 25 01:28:55.380437 systemd[1]: Created slice kubepods-besteffort-podc0b0d32b_56b5_4b1a_93b8_d4235f47aaa6.slice - libcontainer container kubepods-besteffort-podc0b0d32b_56b5_4b1a_93b8_d4235f47aaa6.slice. Mar 25 01:28:55.386992 systemd[1]: Created slice kubepods-besteffort-pod48367b39_f9db_43f8_985c_12f3e16d795e.slice - libcontainer container kubepods-besteffort-pod48367b39_f9db_43f8_985c_12f3e16d795e.slice. Mar 25 01:28:55.424520 kubelet[2795]: I0325 01:28:55.424086 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1099ad4-bb29-4692-8b74-1f06f414103c-config-volume\") pod \"coredns-6f6b679f8f-d6g5n\" (UID: \"e1099ad4-bb29-4692-8b74-1f06f414103c\") " pod="kube-system/coredns-6f6b679f8f-d6g5n" Mar 25 01:28:55.424520 kubelet[2795]: I0325 01:28:55.424146 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmbpn\" (UniqueName: \"kubernetes.io/projected/48367b39-f9db-43f8-985c-12f3e16d795e-kube-api-access-jmbpn\") pod \"calico-apiserver-77b65b67d6-rf4c2\" (UID: \"48367b39-f9db-43f8-985c-12f3e16d795e\") " pod="calico-apiserver/calico-apiserver-77b65b67d6-rf4c2" Mar 25 01:28:55.424520 kubelet[2795]: I0325 01:28:55.424175 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wmsk\" (UniqueName: \"kubernetes.io/projected/e1099ad4-bb29-4692-8b74-1f06f414103c-kube-api-access-7wmsk\") pod \"coredns-6f6b679f8f-d6g5n\" (UID: \"e1099ad4-bb29-4692-8b74-1f06f414103c\") " pod="kube-system/coredns-6f6b679f8f-d6g5n" Mar 25 01:28:55.424520 kubelet[2795]: I0325 01:28:55.424203 2795 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/48367b39-f9db-43f8-985c-12f3e16d795e-calico-apiserver-certs\") pod \"calico-apiserver-77b65b67d6-rf4c2\" (UID: \"48367b39-f9db-43f8-985c-12f3e16d795e\") " pod="calico-apiserver/calico-apiserver-77b65b67d6-rf4c2" Mar 25 01:28:55.424520 kubelet[2795]: I0325 01:28:55.424235 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqjcc\" (UniqueName: \"kubernetes.io/projected/2c508311-23b5-48ed-a7d6-4e74589ff007-kube-api-access-lqjcc\") pod \"coredns-6f6b679f8f-qqqdg\" (UID: \"2c508311-23b5-48ed-a7d6-4e74589ff007\") " pod="kube-system/coredns-6f6b679f8f-qqqdg" Mar 25 01:28:55.424965 kubelet[2795]: I0325 01:28:55.424263 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/35d02384-32ff-4cc9-8558-540e7f668e10-calico-apiserver-certs\") pod \"calico-apiserver-77b65b67d6-57pch\" (UID: \"35d02384-32ff-4cc9-8558-540e7f668e10\") " pod="calico-apiserver/calico-apiserver-77b65b67d6-57pch" Mar 25 01:28:55.424965 kubelet[2795]: I0325 01:28:55.424297 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nn7d\" (UniqueName: \"kubernetes.io/projected/35d02384-32ff-4cc9-8558-540e7f668e10-kube-api-access-9nn7d\") pod \"calico-apiserver-77b65b67d6-57pch\" (UID: \"35d02384-32ff-4cc9-8558-540e7f668e10\") " pod="calico-apiserver/calico-apiserver-77b65b67d6-57pch" Mar 25 01:28:55.424965 kubelet[2795]: I0325 01:28:55.424328 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6-tigera-ca-bundle\") pod \"calico-kube-controllers-74b96585fb-rvlz4\" (UID: 
\"c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6\") " pod="calico-system/calico-kube-controllers-74b96585fb-rvlz4" Mar 25 01:28:55.424965 kubelet[2795]: I0325 01:28:55.424354 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvfk7\" (UniqueName: \"kubernetes.io/projected/c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6-kube-api-access-tvfk7\") pod \"calico-kube-controllers-74b96585fb-rvlz4\" (UID: \"c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6\") " pod="calico-system/calico-kube-controllers-74b96585fb-rvlz4" Mar 25 01:28:55.424965 kubelet[2795]: I0325 01:28:55.424386 2795 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2c508311-23b5-48ed-a7d6-4e74589ff007-config-volume\") pod \"coredns-6f6b679f8f-qqqdg\" (UID: \"2c508311-23b5-48ed-a7d6-4e74589ff007\") " pod="kube-system/coredns-6f6b679f8f-qqqdg" Mar 25 01:28:55.605664 systemd[1]: Created slice kubepods-besteffort-pod455e7431_91bf_4680_a141_9fa18af89c18.slice - libcontainer container kubepods-besteffort-pod455e7431_91bf_4680_a141_9fa18af89c18.slice. 
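The `NetworkReady=false ... cni plugin not initialized` errors earlier in the log persist until a CNI config is written, and the calico plugin additionally stats `/var/lib/calico/nodename` (the file named in the sandbox failures that follow). A hedged diagnostic sketch, assuming the standard CNI config directory and the path quoted in the log:

```shell
# Illustrative check (not official calico tooling): the kubelet reports
# NetworkReady=false until a CNI config appears, and calico's plugin
# requires /var/lib/calico/nodename, written by the calico-node container.
STATUS=ok
for f in /etc/cni/net.d /var/lib/calico/nodename; do
  if [ -e "$f" ]; then
    echo "present: $f"
  else
    echo "missing: $f"    # matches the "no such file or directory" stat errors
    STATUS=degraded
  fi
done
echo "network readiness: $STATUS"
```

Both paths are created once the calico-node pod (pulled below as ghcr.io/flatcar/calico/node:v3.29.2) starts successfully, after which the pending sandboxes can be retried.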
Mar 25 01:28:55.608576 containerd[1497]: time="2025-03-25T01:28:55.608526990Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-t6zpq,Uid:455e7431-91bf-4680-a141-9fa18af89c18,Namespace:calico-system,Attempt:0,}" Mar 25 01:28:55.666530 containerd[1497]: time="2025-03-25T01:28:55.665867813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qqqdg,Uid:2c508311-23b5-48ed-a7d6-4e74589ff007,Namespace:kube-system,Attempt:0,}" Mar 25 01:28:55.666765 containerd[1497]: time="2025-03-25T01:28:55.666692243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d6g5n,Uid:e1099ad4-bb29-4692-8b74-1f06f414103c,Namespace:kube-system,Attempt:0,}" Mar 25 01:28:55.687726 containerd[1497]: time="2025-03-25T01:28:55.687579955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74b96585fb-rvlz4,Uid:c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6,Namespace:calico-system,Attempt:0,}" Mar 25 01:28:55.743227 containerd[1497]: time="2025-03-25T01:28:55.743153874Z" level=error msg="Failed to destroy network for sandbox \"c4645c0d7ca204a57e90b5b02953df2a38e5fc93d31a8fe82a89c015a402bb82\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.745542 containerd[1497]: time="2025-03-25T01:28:55.745434996Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-t6zpq,Uid:455e7431-91bf-4680-a141-9fa18af89c18,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4645c0d7ca204a57e90b5b02953df2a38e5fc93d31a8fe82a89c015a402bb82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.746359 kubelet[2795]: E0325 01:28:55.746303 
2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4645c0d7ca204a57e90b5b02953df2a38e5fc93d31a8fe82a89c015a402bb82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.746573 kubelet[2795]: E0325 01:28:55.746545 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4645c0d7ca204a57e90b5b02953df2a38e5fc93d31a8fe82a89c015a402bb82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-t6zpq" Mar 25 01:28:55.747363 kubelet[2795]: E0325 01:28:55.746657 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4645c0d7ca204a57e90b5b02953df2a38e5fc93d31a8fe82a89c015a402bb82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-t6zpq" Mar 25 01:28:55.747363 kubelet[2795]: E0325 01:28:55.746713 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-t6zpq_calico-system(455e7431-91bf-4680-a141-9fa18af89c18)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-t6zpq_calico-system(455e7431-91bf-4680-a141-9fa18af89c18)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4645c0d7ca204a57e90b5b02953df2a38e5fc93d31a8fe82a89c015a402bb82\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-t6zpq" podUID="455e7431-91bf-4680-a141-9fa18af89c18" Mar 25 01:28:55.762612 containerd[1497]: time="2025-03-25T01:28:55.762573373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:28:55.788407 containerd[1497]: time="2025-03-25T01:28:55.787973647Z" level=error msg="Failed to destroy network for sandbox \"dae73f2d7447bcd2f83ae3ad4be5cdebf2d2646e47a3442f1ea20ae231c58ffa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.797705 containerd[1497]: time="2025-03-25T01:28:55.796924409Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qqqdg,Uid:2c508311-23b5-48ed-a7d6-4e74589ff007,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dae73f2d7447bcd2f83ae3ad4be5cdebf2d2646e47a3442f1ea20ae231c58ffa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.800212 kubelet[2795]: E0325 01:28:55.798083 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dae73f2d7447bcd2f83ae3ad4be5cdebf2d2646e47a3442f1ea20ae231c58ffa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.800212 kubelet[2795]: E0325 01:28:55.798163 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dae73f2d7447bcd2f83ae3ad4be5cdebf2d2646e47a3442f1ea20ae231c58ffa\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-qqqdg" Mar 25 01:28:55.800212 kubelet[2795]: E0325 01:28:55.798190 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dae73f2d7447bcd2f83ae3ad4be5cdebf2d2646e47a3442f1ea20ae231c58ffa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-qqqdg" Mar 25 01:28:55.800919 kubelet[2795]: E0325 01:28:55.798265 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-qqqdg_kube-system(2c508311-23b5-48ed-a7d6-4e74589ff007)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-qqqdg_kube-system(2c508311-23b5-48ed-a7d6-4e74589ff007)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dae73f2d7447bcd2f83ae3ad4be5cdebf2d2646e47a3442f1ea20ae231c58ffa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-qqqdg" podUID="2c508311-23b5-48ed-a7d6-4e74589ff007" Mar 25 01:28:55.826229 containerd[1497]: time="2025-03-25T01:28:55.826046377Z" level=error msg="Failed to destroy network for sandbox \"26bb7ab19d274faa239173cc0f6c3626c570806e5a2aac990fd7f53ab12ac899\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.830112 containerd[1497]: time="2025-03-25T01:28:55.828925921Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74b96585fb-rvlz4,Uid:c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"26bb7ab19d274faa239173cc0f6c3626c570806e5a2aac990fd7f53ab12ac899\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.830260 kubelet[2795]: E0325 01:28:55.829627 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"26bb7ab19d274faa239173cc0f6c3626c570806e5a2aac990fd7f53ab12ac899\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.830260 kubelet[2795]: E0325 01:28:55.829678 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"26bb7ab19d274faa239173cc0f6c3626c570806e5a2aac990fd7f53ab12ac899\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-74b96585fb-rvlz4" Mar 25 01:28:55.830260 kubelet[2795]: E0325 01:28:55.829702 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"26bb7ab19d274faa239173cc0f6c3626c570806e5a2aac990fd7f53ab12ac899\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-74b96585fb-rvlz4" Mar 25 01:28:55.830377 
kubelet[2795]: E0325 01:28:55.829755 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-74b96585fb-rvlz4_calico-system(c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-74b96585fb-rvlz4_calico-system(c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"26bb7ab19d274faa239173cc0f6c3626c570806e5a2aac990fd7f53ab12ac899\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-74b96585fb-rvlz4" podUID="c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6" Mar 25 01:28:55.831069 containerd[1497]: time="2025-03-25T01:28:55.831019476Z" level=error msg="Failed to destroy network for sandbox \"2bc55163d531a102349abb67e402bbc137daf174776d2c8994cc24f2172e8c30\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.832453 containerd[1497]: time="2025-03-25T01:28:55.832354964Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d6g5n,Uid:e1099ad4-bb29-4692-8b74-1f06f414103c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2bc55163d531a102349abb67e402bbc137daf174776d2c8994cc24f2172e8c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.833240 kubelet[2795]: E0325 01:28:55.832727 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2bc55163d531a102349abb67e402bbc137daf174776d2c8994cc24f2172e8c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:55.833240 kubelet[2795]: E0325 01:28:55.832781 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2bc55163d531a102349abb67e402bbc137daf174776d2c8994cc24f2172e8c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d6g5n" Mar 25 01:28:55.833240 kubelet[2795]: E0325 01:28:55.832799 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2bc55163d531a102349abb67e402bbc137daf174776d2c8994cc24f2172e8c30\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d6g5n" Mar 25 01:28:55.833469 kubelet[2795]: E0325 01:28:55.833416 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d6g5n_kube-system(e1099ad4-bb29-4692-8b74-1f06f414103c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d6g5n_kube-system(e1099ad4-bb29-4692-8b74-1f06f414103c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2bc55163d531a102349abb67e402bbc137daf174776d2c8994cc24f2172e8c30\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d6g5n" 
podUID="e1099ad4-bb29-4692-8b74-1f06f414103c" Mar 25 01:28:56.537296 kubelet[2795]: E0325 01:28:56.537196 2795 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:28:56.537296 kubelet[2795]: E0325 01:28:56.537278 2795 projected.go:194] Error preparing data for projected volume kube-api-access-jmbpn for pod calico-apiserver/calico-apiserver-77b65b67d6-rf4c2: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:28:56.538232 kubelet[2795]: E0325 01:28:56.537359 2795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48367b39-f9db-43f8-985c-12f3e16d795e-kube-api-access-jmbpn podName:48367b39-f9db-43f8-985c-12f3e16d795e nodeName:}" failed. No retries permitted until 2025-03-25 01:28:57.037335674 +0000 UTC m=+30.573449450 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jmbpn" (UniqueName: "kubernetes.io/projected/48367b39-f9db-43f8-985c-12f3e16d795e-kube-api-access-jmbpn") pod "calico-apiserver-77b65b67d6-rf4c2" (UID: "48367b39-f9db-43f8-985c-12f3e16d795e") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:28:56.547691 systemd[1]: run-netns-cni\x2d6c5ab2ac\x2d975b\x2dc1a7\x2d9f4b\x2d1691df978c4b.mount: Deactivated successfully. Mar 25 01:28:56.548040 systemd[1]: run-netns-cni\x2dc9ed6fc5\x2dfdf1\x2d8f9e\x2d2822\x2de66eabe60f55.mount: Deactivated successfully. 
Mar 25 01:28:56.550929 kubelet[2795]: E0325 01:28:56.550735 2795 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:28:56.550929 kubelet[2795]: E0325 01:28:56.550785 2795 projected.go:194] Error preparing data for projected volume kube-api-access-9nn7d for pod calico-apiserver/calico-apiserver-77b65b67d6-57pch: failed to sync configmap cache: timed out waiting for the condition Mar 25 01:28:56.551324 kubelet[2795]: E0325 01:28:56.551137 2795 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35d02384-32ff-4cc9-8558-540e7f668e10-kube-api-access-9nn7d podName:35d02384-32ff-4cc9-8558-540e7f668e10 nodeName:}" failed. No retries permitted until 2025-03-25 01:28:57.051107188 +0000 UTC m=+30.587220964 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9nn7d" (UniqueName: "kubernetes.io/projected/35d02384-32ff-4cc9-8558-540e7f668e10-kube-api-access-9nn7d") pod "calico-apiserver-77b65b67d6-57pch" (UID: "35d02384-32ff-4cc9-8558-540e7f668e10") : failed to sync configmap cache: timed out waiting for the condition Mar 25 01:28:57.037723 kubelet[2795]: I0325 01:28:57.037052 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:28:57.172394 containerd[1497]: time="2025-03-25T01:28:57.172352333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-57pch,Uid:35d02384-32ff-4cc9-8558-540e7f668e10,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:28:57.195218 containerd[1497]: time="2025-03-25T01:28:57.195176133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-rf4c2,Uid:48367b39-f9db-43f8-985c-12f3e16d795e,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:28:57.281211 containerd[1497]: time="2025-03-25T01:28:57.280947281Z" level=error msg="Failed to destroy network for sandbox 
\"aa21e01d1be3223016b1d6f1f765fd79d108bd391d268130d1308fba7f224de5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:57.283456 containerd[1497]: time="2025-03-25T01:28:57.283388495Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-57pch,Uid:35d02384-32ff-4cc9-8558-540e7f668e10,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa21e01d1be3223016b1d6f1f765fd79d108bd391d268130d1308fba7f224de5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:57.284269 kubelet[2795]: E0325 01:28:57.283759 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa21e01d1be3223016b1d6f1f765fd79d108bd391d268130d1308fba7f224de5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:57.284269 kubelet[2795]: E0325 01:28:57.283846 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aa21e01d1be3223016b1d6f1f765fd79d108bd391d268130d1308fba7f224de5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-77b65b67d6-57pch" Mar 25 01:28:57.284269 kubelet[2795]: E0325 01:28:57.283866 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"aa21e01d1be3223016b1d6f1f765fd79d108bd391d268130d1308fba7f224de5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-77b65b67d6-57pch" Mar 25 01:28:57.284424 kubelet[2795]: E0325 01:28:57.283907 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-77b65b67d6-57pch_calico-apiserver(35d02384-32ff-4cc9-8558-540e7f668e10)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-77b65b67d6-57pch_calico-apiserver(35d02384-32ff-4cc9-8558-540e7f668e10)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aa21e01d1be3223016b1d6f1f765fd79d108bd391d268130d1308fba7f224de5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-77b65b67d6-57pch" podUID="35d02384-32ff-4cc9-8558-540e7f668e10" Mar 25 01:28:57.309315 containerd[1497]: time="2025-03-25T01:28:57.308937680Z" level=error msg="Failed to destroy network for sandbox \"8caded9e6fabb44c2bcbd569e08da5e99c330df1f8ae2a31654bc58de0ac6342\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:57.310962 containerd[1497]: time="2025-03-25T01:28:57.310827873Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-rf4c2,Uid:48367b39-f9db-43f8-985c-12f3e16d795e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8caded9e6fabb44c2bcbd569e08da5e99c330df1f8ae2a31654bc58de0ac6342\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:57.311324 kubelet[2795]: E0325 01:28:57.311129 2795 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8caded9e6fabb44c2bcbd569e08da5e99c330df1f8ae2a31654bc58de0ac6342\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:28:57.311924 kubelet[2795]: E0325 01:28:57.311499 2795 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8caded9e6fabb44c2bcbd569e08da5e99c330df1f8ae2a31654bc58de0ac6342\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-77b65b67d6-rf4c2" Mar 25 01:28:57.311924 kubelet[2795]: E0325 01:28:57.311596 2795 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8caded9e6fabb44c2bcbd569e08da5e99c330df1f8ae2a31654bc58de0ac6342\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-77b65b67d6-rf4c2" Mar 25 01:28:57.311924 kubelet[2795]: E0325 01:28:57.311655 2795 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-77b65b67d6-rf4c2_calico-apiserver(48367b39-f9db-43f8-985c-12f3e16d795e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-77b65b67d6-rf4c2_calico-apiserver(48367b39-f9db-43f8-985c-12f3e16d795e)\\\": 
rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8caded9e6fabb44c2bcbd569e08da5e99c330df1f8ae2a31654bc58de0ac6342\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-77b65b67d6-rf4c2" podUID="48367b39-f9db-43f8-985c-12f3e16d795e" Mar 25 01:28:57.544305 systemd[1]: run-netns-cni\x2d802b663d\x2d3e99\x2d2ce5\x2d4df9\x2d3294174080c1.mount: Deactivated successfully. Mar 25 01:28:57.544657 systemd[1]: run-netns-cni\x2dd7aa16e0\x2d97af\x2d8c11\x2dadad\x2d96e30fe93ea7.mount: Deactivated successfully. Mar 25 01:28:59.569689 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3824247622.mount: Deactivated successfully. Mar 25 01:28:59.604603 containerd[1497]: time="2025-03-25T01:28:59.603947109Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:59.605141 containerd[1497]: time="2025-03-25T01:28:59.605087476Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=137086024" Mar 25 01:28:59.606513 containerd[1497]: time="2025-03-25T01:28:59.606458492Z" level=info msg="ImageCreate event name:\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:59.609386 containerd[1497]: time="2025-03-25T01:28:59.609343650Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:28:59.610048 containerd[1497]: time="2025-03-25T01:28:59.609992637Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id 
\"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"137085886\" in 3.846988688s" Mar 25 01:28:59.610048 containerd[1497]: time="2025-03-25T01:28:59.610040239Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:8fd1983cc851d15f05a37eb3ff85b0cde86869beec7630d2940c86fc7b98d0c1\"" Mar 25 01:28:59.628305 containerd[1497]: time="2025-03-25T01:28:59.628257865Z" level=info msg="CreateContainer within sandbox \"0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:28:59.640704 containerd[1497]: time="2025-03-25T01:28:59.640653373Z" level=info msg="Container 48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:28:59.652083 containerd[1497]: time="2025-03-25T01:28:59.651942316Z" level=info msg="CreateContainer within sandbox \"0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\"" Mar 25 01:28:59.653335 containerd[1497]: time="2025-03-25T01:28:59.653149485Z" level=info msg="StartContainer for \"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\"" Mar 25 01:28:59.655784 containerd[1497]: time="2025-03-25T01:28:59.655671589Z" level=info msg="connecting to shim 48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440" address="unix:///run/containerd/s/e6ea8904e1571de8a31af665a33ec794fa32e8548fd78a48e28aa4df0ecade64" protocol=ttrpc version=3 Mar 25 01:28:59.682322 systemd[1]: Started cri-containerd-48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440.scope - libcontainer container 
48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440. Mar 25 01:28:59.730865 containerd[1497]: time="2025-03-25T01:28:59.730351089Z" level=info msg="StartContainer for \"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" returns successfully" Mar 25 01:28:59.878305 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:28:59.878419 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 25 01:28:59.884759 containerd[1497]: time="2025-03-25T01:28:59.884371562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"1faceb20ecfda337234a437a425d1140392122095f58ac4042138e203a9f426d\" pid:3748 exit_status:1 exited_at:{seconds:1742866139 nanos:883956545}" Mar 25 01:29:00.859852 containerd[1497]: time="2025-03-25T01:29:00.859707404Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"44d93a59775a4a71a644d64924157366bc9d20bc65c636314bddae21450af160\" pid:3804 exit_status:1 exited_at:{seconds:1742866140 nanos:859316827}" Mar 25 01:29:01.749888 kernel: bpftool[3935]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:29:01.884905 containerd[1497]: time="2025-03-25T01:29:01.884848749Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"32d8960bd52bba3dff2b115ad7a1b970e4392c0a414ea266ad04220cb7d0c69a\" pid:3949 exit_status:1 exited_at:{seconds:1742866141 nanos:868949861}" Mar 25 01:29:01.974277 systemd-networkd[1395]: vxlan.calico: Link UP Mar 25 01:29:01.974286 systemd-networkd[1395]: vxlan.calico: Gained carrier Mar 25 01:29:03.522197 systemd-networkd[1395]: vxlan.calico: Gained IPv6LL Mar 25 01:29:10.600388 containerd[1497]: time="2025-03-25T01:29:10.599803211Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-rf4c2,Uid:48367b39-f9db-43f8-985c-12f3e16d795e,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:29:10.600388 containerd[1497]: time="2025-03-25T01:29:10.600086346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74b96585fb-rvlz4,Uid:c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6,Namespace:calico-system,Attempt:0,}" Mar 25 01:29:10.601707 containerd[1497]: time="2025-03-25T01:29:10.601387294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qqqdg,Uid:2c508311-23b5-48ed-a7d6-4e74589ff007,Namespace:kube-system,Attempt:0,}" Mar 25 01:29:10.868026 systemd-networkd[1395]: calib331ea41d10: Link UP Mar 25 01:29:10.868292 systemd-networkd[1395]: calib331ea41d10: Gained carrier Mar 25 01:29:10.895278 kubelet[2795]: I0325 01:29:10.894021 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-klltq" podStartSLOduration=11.915689117 podStartE2EDuration="23.893768378s" podCreationTimestamp="2025-03-25 01:28:47 +0000 UTC" firstStartedPulling="2025-03-25 01:28:47.633456319 +0000 UTC m=+21.169570095" lastFinishedPulling="2025-03-25 01:28:59.61153558 +0000 UTC m=+33.147649356" observedRunningTime="2025-03-25 01:28:59.803860342 +0000 UTC m=+33.339974158" watchObservedRunningTime="2025-03-25 01:29:10.893768378 +0000 UTC m=+44.429882154" Mar 25 01:29:10.900071 containerd[1497]: 2025-03-25 01:29:10.716 [INFO][4049] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0 calico-apiserver-77b65b67d6- calico-apiserver 48367b39-f9db-43f8-985c-12f3e16d795e 709 0 2025-03-25 01:28:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:77b65b67d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-6-22e9b0bb97 calico-apiserver-77b65b67d6-rf4c2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib331ea41d10 [] []}} ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-" Mar 25 01:29:10.900071 containerd[1497]: 2025-03-25 01:29:10.717 [INFO][4049] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" Mar 25 01:29:10.900071 containerd[1497]: 2025-03-25 01:29:10.786 [INFO][4081] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" HandleID="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.814 [INFO][4081] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" HandleID="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000398c40), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-6-22e9b0bb97", "pod":"calico-apiserver-77b65b67d6-rf4c2", "timestamp":"2025-03-25 01:29:10.786888115 +0000 UTC"}, Hostname:"ci-4284-0-0-6-22e9b0bb97", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.814 [INFO][4081] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.814 [INFO][4081] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.814 [INFO][4081] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-6-22e9b0bb97' Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.821 [INFO][4081] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.827 [INFO][4081] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.833 [INFO][4081] ipam/ipam.go 489: Trying affinity for 192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.836 [INFO][4081] ipam/ipam.go 155: Attempting to load block cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900347 containerd[1497]: 2025-03-25 01:29:10.839 [INFO][4081] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900536 containerd[1497]: 2025-03-25 01:29:10.839 [INFO][4081] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900536 containerd[1497]: 2025-03-25 01:29:10.842 [INFO][4081] ipam/ipam.go 1685: Creating new handle: 
k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2 Mar 25 01:29:10.900536 containerd[1497]: 2025-03-25 01:29:10.848 [INFO][4081] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900536 containerd[1497]: 2025-03-25 01:29:10.856 [INFO][4081] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.98.193/26] block=192.168.98.192/26 handle="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900536 containerd[1497]: 2025-03-25 01:29:10.856 [INFO][4081] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.98.193/26] handle="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:10.900536 containerd[1497]: 2025-03-25 01:29:10.857 [INFO][4081] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:29:10.900536 containerd[1497]: 2025-03-25 01:29:10.857 [INFO][4081] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.193/26] IPv6=[] ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" HandleID="k8s-pod-network.b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" Mar 25 01:29:10.900666 containerd[1497]: 2025-03-25 01:29:10.861 [INFO][4049] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0", GenerateName:"calico-apiserver-77b65b67d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"48367b39-f9db-43f8-985c-12f3e16d795e", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77b65b67d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"", Pod:"calico-apiserver-77b65b67d6-rf4c2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.98.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib331ea41d10", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:10.900721 containerd[1497]: 2025-03-25 01:29:10.862 [INFO][4049] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.98.193/32] ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" Mar 25 01:29:10.900721 containerd[1497]: 2025-03-25 01:29:10.862 [INFO][4049] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib331ea41d10 ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" Mar 25 01:29:10.900721 containerd[1497]: 2025-03-25 01:29:10.868 [INFO][4049] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" Mar 25 01:29:10.900784 containerd[1497]: 2025-03-25 01:29:10.870 [INFO][4049] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0", GenerateName:"calico-apiserver-77b65b67d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"48367b39-f9db-43f8-985c-12f3e16d795e", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77b65b67d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2", Pod:"calico-apiserver-77b65b67d6-rf4c2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib331ea41d10", MAC:"6e:f7:00:32:21:c8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:10.901549 containerd[1497]: 2025-03-25 01:29:10.896 [INFO][4049] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-rf4c2" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--rf4c2-eth0" Mar 25 01:29:10.982188 containerd[1497]: time="2025-03-25T01:29:10.981901709Z" level=info msg="connecting to shim 
b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2" address="unix:///run/containerd/s/0ebb8d116f693c0cc026cc95132c9e30f69485dc1c3a413acc82b95513837200" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:29:10.992314 systemd-networkd[1395]: calia4e92c28674: Link UP Mar 25 01:29:10.993777 systemd-networkd[1395]: calia4e92c28674: Gained carrier Mar 25 01:29:11.020139 containerd[1497]: 2025-03-25 01:29:10.717 [INFO][4042] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0 calico-kube-controllers-74b96585fb- calico-system c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6 706 0 2025-03-25 01:28:47 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:74b96585fb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-6-22e9b0bb97 calico-kube-controllers-74b96585fb-rvlz4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calia4e92c28674 [] []}} ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-" Mar 25 01:29:11.020139 containerd[1497]: 2025-03-25 01:29:10.718 [INFO][4042] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" Mar 25 01:29:11.020139 containerd[1497]: 2025-03-25 01:29:10.797 [INFO][4088] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" HandleID="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.819 [INFO][4088] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" HandleID="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400011bb50), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-6-22e9b0bb97", "pod":"calico-kube-controllers-74b96585fb-rvlz4", "timestamp":"2025-03-25 01:29:10.796997759 +0000 UTC"}, Hostname:"ci-4284-0-0-6-22e9b0bb97", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.819 [INFO][4088] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.859 [INFO][4088] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.859 [INFO][4088] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-6-22e9b0bb97' Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.922 [INFO][4088] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.933 [INFO][4088] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.949 [INFO][4088] ipam/ipam.go 489: Trying affinity for 192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.952 [INFO][4088] ipam/ipam.go 155: Attempting to load block cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.020343 containerd[1497]: 2025-03-25 01:29:10.955 [INFO][4088] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.021678 containerd[1497]: 2025-03-25 01:29:10.955 [INFO][4088] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.021678 containerd[1497]: 2025-03-25 01:29:10.958 [INFO][4088] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24 Mar 25 01:29:11.021678 containerd[1497]: 2025-03-25 01:29:10.965 [INFO][4088] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.021678 containerd[1497]: 2025-03-25 01:29:10.981 [INFO][4088] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.98.194/26] block=192.168.98.192/26 handle="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.021678 containerd[1497]: 2025-03-25 01:29:10.981 [INFO][4088] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.98.194/26] handle="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.021678 containerd[1497]: 2025-03-25 01:29:10.981 [INFO][4088] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:29:11.021678 containerd[1497]: 2025-03-25 01:29:10.981 [INFO][4088] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.194/26] IPv6=[] ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" HandleID="k8s-pod-network.be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" Mar 25 01:29:11.022505 containerd[1497]: 2025-03-25 01:29:10.989 [INFO][4042] cni-plugin/k8s.go 386: Populated endpoint ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0", GenerateName:"calico-kube-controllers-74b96585fb-", Namespace:"calico-system", SelfLink:"", UID:"c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"74b96585fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"", Pod:"calico-kube-controllers-74b96585fb-rvlz4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia4e92c28674", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:11.022607 containerd[1497]: 2025-03-25 01:29:10.989 [INFO][4042] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.98.194/32] ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" Mar 25 01:29:11.022607 containerd[1497]: 2025-03-25 01:29:10.989 [INFO][4042] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia4e92c28674 ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" Mar 25 01:29:11.022607 containerd[1497]: 2025-03-25 01:29:10.992 [INFO][4042] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" Mar 25 01:29:11.022727 containerd[1497]: 2025-03-25 01:29:10.993 [INFO][4042] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0", GenerateName:"calico-kube-controllers-74b96585fb-", Namespace:"calico-system", SelfLink:"", UID:"c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6", ResourceVersion:"706", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"74b96585fb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24", Pod:"calico-kube-controllers-74b96585fb-rvlz4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.98.194/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calia4e92c28674", MAC:"92:20:ef:03:fc:c7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:11.022788 containerd[1497]: 2025-03-25 01:29:11.015 [INFO][4042] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" Namespace="calico-system" Pod="calico-kube-controllers-74b96585fb-rvlz4" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--kube--controllers--74b96585fb--rvlz4-eth0" Mar 25 01:29:11.080533 containerd[1497]: time="2025-03-25T01:29:11.079932458Z" level=info msg="connecting to shim be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24" address="unix:///run/containerd/s/f415ad45dfc441d64da23912278c8352d7bd8f826b025aae97291c4f275f7397" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:29:11.081740 systemd[1]: Started cri-containerd-b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2.scope - libcontainer container b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2. Mar 25 01:29:11.103716 systemd-networkd[1395]: cali2343d5c8e51: Link UP Mar 25 01:29:11.109762 systemd-networkd[1395]: cali2343d5c8e51: Gained carrier Mar 25 01:29:11.114065 systemd[1]: Started cri-containerd-be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24.scope - libcontainer container be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24. 
Mar 25 01:29:11.151803 containerd[1497]: 2025-03-25 01:29:10.716 [INFO][4061] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0 coredns-6f6b679f8f- kube-system 2c508311-23b5-48ed-a7d6-4e74589ff007 702 0 2025-03-25 01:28:32 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-6-22e9b0bb97 coredns-6f6b679f8f-qqqdg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2343d5c8e51 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-" Mar 25 01:29:11.151803 containerd[1497]: 2025-03-25 01:29:10.716 [INFO][4061] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" Mar 25 01:29:11.151803 containerd[1497]: 2025-03-25 01:29:10.786 [INFO][4086] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" HandleID="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:10.821 [INFO][4086] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" HandleID="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" 
Workload="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003e9a90), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-6-22e9b0bb97", "pod":"coredns-6f6b679f8f-qqqdg", "timestamp":"2025-03-25 01:29:10.786283483 +0000 UTC"}, Hostname:"ci-4284-0-0-6-22e9b0bb97", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:10.821 [INFO][4086] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:10.983 [INFO][4086] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:10.983 [INFO][4086] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-6-22e9b0bb97' Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:11.024 [INFO][4086] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:11.038 [INFO][4086] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:11.050 [INFO][4086] ipam/ipam.go 489: Trying affinity for 192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:11.053 [INFO][4086] ipam/ipam.go 155: Attempting to load block cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152144 containerd[1497]: 2025-03-25 01:29:11.061 [INFO][4086] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152527 
containerd[1497]: 2025-03-25 01:29:11.061 [INFO][4086] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152527 containerd[1497]: 2025-03-25 01:29:11.064 [INFO][4086] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b Mar 25 01:29:11.152527 containerd[1497]: 2025-03-25 01:29:11.076 [INFO][4086] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152527 containerd[1497]: 2025-03-25 01:29:11.091 [INFO][4086] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.98.195/26] block=192.168.98.192/26 handle="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152527 containerd[1497]: 2025-03-25 01:29:11.091 [INFO][4086] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.98.195/26] handle="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.152527 containerd[1497]: 2025-03-25 01:29:11.091 [INFO][4086] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:29:11.152527 containerd[1497]: 2025-03-25 01:29:11.091 [INFO][4086] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.195/26] IPv6=[] ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" HandleID="k8s-pod-network.ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" Mar 25 01:29:11.152826 containerd[1497]: 2025-03-25 01:29:11.097 [INFO][4061] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"2c508311-23b5-48ed-a7d6-4e74589ff007", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"", Pod:"coredns-6f6b679f8f-qqqdg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali2343d5c8e51", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:11.152826 containerd[1497]: 2025-03-25 01:29:11.098 [INFO][4061] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.98.195/32] ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" Mar 25 01:29:11.152826 containerd[1497]: 2025-03-25 01:29:11.098 [INFO][4061] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2343d5c8e51 ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" Mar 25 01:29:11.152826 containerd[1497]: 2025-03-25 01:29:11.105 [INFO][4061] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" Mar 25 01:29:11.152826 containerd[1497]: 2025-03-25 01:29:11.106 [INFO][4061] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" 
WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"2c508311-23b5-48ed-a7d6-4e74589ff007", ResourceVersion:"702", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b", Pod:"coredns-6f6b679f8f-qqqdg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2343d5c8e51", MAC:"5a:dd:48:d3:38:82", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:11.152826 containerd[1497]: 2025-03-25 01:29:11.146 
[INFO][4061] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" Namespace="kube-system" Pod="coredns-6f6b679f8f-qqqdg" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--qqqdg-eth0" Mar 25 01:29:11.203159 containerd[1497]: time="2025-03-25T01:29:11.203086506Z" level=info msg="connecting to shim ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b" address="unix:///run/containerd/s/f80f0ba975201a95dc940a2b24f585207758faa1ed0c7fc319f7b648211a6080" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:29:11.206838 containerd[1497]: time="2025-03-25T01:29:11.204510181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-74b96585fb-rvlz4,Uid:c0b0d32b-56b5-4b1a-93b8-d4235f47aaa6,Namespace:calico-system,Attempt:0,} returns sandbox id \"be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24\"" Mar 25 01:29:11.215261 containerd[1497]: time="2025-03-25T01:29:11.215206025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:29:11.265218 containerd[1497]: time="2025-03-25T01:29:11.265159936Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-rf4c2,Uid:48367b39-f9db-43f8-985c-12f3e16d795e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2\"" Mar 25 01:29:11.269067 systemd[1]: Started cri-containerd-ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b.scope - libcontainer container ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b. 
Mar 25 01:29:11.325060 containerd[1497]: time="2025-03-25T01:29:11.324879002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qqqdg,Uid:2c508311-23b5-48ed-a7d6-4e74589ff007,Namespace:kube-system,Attempt:0,} returns sandbox id \"ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b\"" Mar 25 01:29:11.328892 containerd[1497]: time="2025-03-25T01:29:11.328844531Z" level=info msg="CreateContainer within sandbox \"ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:29:11.346981 containerd[1497]: time="2025-03-25T01:29:11.346893522Z" level=info msg="Container cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:29:11.361560 containerd[1497]: time="2025-03-25T01:29:11.361403486Z" level=info msg="CreateContainer within sandbox \"ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c\"" Mar 25 01:29:11.362643 containerd[1497]: time="2025-03-25T01:29:11.362569308Z" level=info msg="StartContainer for \"cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c\"" Mar 25 01:29:11.363845 containerd[1497]: time="2025-03-25T01:29:11.363768331Z" level=info msg="connecting to shim cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c" address="unix:///run/containerd/s/f80f0ba975201a95dc940a2b24f585207758faa1ed0c7fc319f7b648211a6080" protocol=ttrpc version=3 Mar 25 01:29:11.386096 systemd[1]: Started cri-containerd-cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c.scope - libcontainer container cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c. 
Mar 25 01:29:11.427787 containerd[1497]: time="2025-03-25T01:29:11.427214353Z" level=info msg="StartContainer for \"cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c\" returns successfully" Mar 25 01:29:11.599727 containerd[1497]: time="2025-03-25T01:29:11.599166212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-t6zpq,Uid:455e7431-91bf-4680-a141-9fa18af89c18,Namespace:calico-system,Attempt:0,}" Mar 25 01:29:11.600480 containerd[1497]: time="2025-03-25T01:29:11.599780804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d6g5n,Uid:e1099ad4-bb29-4692-8b74-1f06f414103c,Namespace:kube-system,Attempt:0,}" Mar 25 01:29:11.601613 containerd[1497]: time="2025-03-25T01:29:11.601427411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-57pch,Uid:35d02384-32ff-4cc9-8558-540e7f668e10,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:29:11.848685 kubelet[2795]: I0325 01:29:11.848592 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-qqqdg" podStartSLOduration=39.84856907 podStartE2EDuration="39.84856907s" podCreationTimestamp="2025-03-25 01:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:29:11.846630408 +0000 UTC m=+45.382744144" watchObservedRunningTime="2025-03-25 01:29:11.84856907 +0000 UTC m=+45.384682806" Mar 25 01:29:11.900063 systemd-networkd[1395]: cali66c0a249a04: Link UP Mar 25 01:29:11.900327 systemd-networkd[1395]: cali66c0a249a04: Gained carrier Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.717 [INFO][4311] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0 coredns-6f6b679f8f- kube-system e1099ad4-bb29-4692-8b74-1f06f414103c 710 0 2025-03-25 01:28:32 +0000 UTC 
map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-6-22e9b0bb97 coredns-6f6b679f8f-d6g5n eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali66c0a249a04 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.718 [INFO][4311] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.769 [INFO][4351] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" HandleID="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.789 [INFO][4351] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" HandleID="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003192a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-6-22e9b0bb97", "pod":"coredns-6f6b679f8f-d6g5n", "timestamp":"2025-03-25 01:29:11.769397419 +0000 UTC"}, 
Hostname:"ci-4284-0-0-6-22e9b0bb97", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.790 [INFO][4351] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.790 [INFO][4351] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.790 [INFO][4351] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-6-22e9b0bb97' Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.796 [INFO][4351] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.808 [INFO][4351] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.823 [INFO][4351] ipam/ipam.go 489: Trying affinity for 192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.829 [INFO][4351] ipam/ipam.go 155: Attempting to load block cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.836 [INFO][4351] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.836 [INFO][4351] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.841 
[INFO][4351] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362 Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.862 [INFO][4351] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.887 [INFO][4351] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.98.196/26] block=192.168.98.192/26 handle="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.887 [INFO][4351] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.98.196/26] handle="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.887 [INFO][4351] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:29:11.945294 containerd[1497]: 2025-03-25 01:29:11.887 [INFO][4351] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.196/26] IPv6=[] ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" HandleID="k8s-pod-network.17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" Mar 25 01:29:11.946681 containerd[1497]: 2025-03-25 01:29:11.893 [INFO][4311] cni-plugin/k8s.go 386: Populated endpoint ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"e1099ad4-bb29-4692-8b74-1f06f414103c", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"", Pod:"coredns-6f6b679f8f-d6g5n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali66c0a249a04", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:11.946681 containerd[1497]: 2025-03-25 01:29:11.893 [INFO][4311] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.98.196/32] ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" Mar 25 01:29:11.946681 containerd[1497]: 2025-03-25 01:29:11.893 [INFO][4311] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66c0a249a04 ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" Mar 25 01:29:11.946681 containerd[1497]: 2025-03-25 01:29:11.896 [INFO][4311] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" Mar 25 01:29:11.946681 containerd[1497]: 2025-03-25 01:29:11.902 [INFO][4311] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" 
WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"e1099ad4-bb29-4692-8b74-1f06f414103c", ResourceVersion:"710", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362", Pod:"coredns-6f6b679f8f-d6g5n", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.98.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali66c0a249a04", MAC:"b6:2c:21:6b:8a:89", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:11.946681 containerd[1497]: 2025-03-25 01:29:11.942 
[INFO][4311] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" Namespace="kube-system" Pod="coredns-6f6b679f8f-d6g5n" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-coredns--6f6b679f8f--d6g5n-eth0" Mar 25 01:29:11.992830 containerd[1497]: time="2025-03-25T01:29:11.992685702Z" level=info msg="connecting to shim 17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362" address="unix:///run/containerd/s/03a74a858e0686d9b1c6784314e1e7d97441d20e3e831b6db155a0a6608fdbcb" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:29:12.046764 systemd[1]: Started cri-containerd-17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362.scope - libcontainer container 17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362. Mar 25 01:29:12.055645 systemd-networkd[1395]: cali7cc5c46091c: Link UP Mar 25 01:29:12.055868 systemd-networkd[1395]: cali7cc5c46091c: Gained carrier Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.706 [INFO][4310] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0 csi-node-driver- calico-system 455e7431-91bf-4680-a141-9fa18af89c18 631 0 2025-03-25 01:28:47 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-6-22e9b0bb97 csi-node-driver-t6zpq eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7cc5c46091c [] []}} ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-" 
Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.707 [INFO][4310] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.780 [INFO][4346] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" HandleID="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.804 [INFO][4346] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" HandleID="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004e2dc0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-6-22e9b0bb97", "pod":"csi-node-driver-t6zpq", "timestamp":"2025-03-25 01:29:11.780701895 +0000 UTC"}, Hostname:"ci-4284-0-0-6-22e9b0bb97", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.805 [INFO][4346] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.887 [INFO][4346] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.887 [INFO][4346] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-6-22e9b0bb97' Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.899 [INFO][4346] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:11.986 [INFO][4346] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.000 [INFO][4346] ipam/ipam.go 489: Trying affinity for 192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.004 [INFO][4346] ipam/ipam.go 155: Attempting to load block cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.011 [INFO][4346] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.011 [INFO][4346] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.015 [INFO][4346] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.026 [INFO][4346] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.041 [INFO][4346] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.98.197/26] block=192.168.98.192/26 handle="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.041 [INFO][4346] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.98.197/26] handle="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.041 [INFO][4346] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:29:12.092563 containerd[1497]: 2025-03-25 01:29:12.041 [INFO][4346] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.197/26] IPv6=[] ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" HandleID="k8s-pod-network.717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" Mar 25 01:29:12.093234 containerd[1497]: 2025-03-25 01:29:12.047 [INFO][4310] cni-plugin/k8s.go 386: Populated endpoint ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"455e7431-91bf-4680-a141-9fa18af89c18", ResourceVersion:"631", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"", Pod:"csi-node-driver-t6zpq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7cc5c46091c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:12.093234 containerd[1497]: 2025-03-25 01:29:12.048 [INFO][4310] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.98.197/32] ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" Mar 25 01:29:12.093234 containerd[1497]: 2025-03-25 01:29:12.048 [INFO][4310] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7cc5c46091c ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" Mar 25 01:29:12.093234 containerd[1497]: 2025-03-25 01:29:12.057 [INFO][4310] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" Mar 25 01:29:12.093234 containerd[1497]: 2025-03-25 01:29:12.058 
[INFO][4310] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"455e7431-91bf-4680-a141-9fa18af89c18", ResourceVersion:"631", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e", Pod:"csi-node-driver-t6zpq", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.98.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7cc5c46091c", MAC:"c2:78:70:de:61:24", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:12.093234 containerd[1497]: 2025-03-25 01:29:12.086 [INFO][4310] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" Namespace="calico-system" Pod="csi-node-driver-t6zpq" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-csi--node--driver--t6zpq-eth0" Mar 25 01:29:12.143875 containerd[1497]: time="2025-03-25T01:29:12.142799522Z" level=info msg="connecting to shim 717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e" address="unix:///run/containerd/s/874bd27b5487c818d06fe5291a9c87d116993619c1c47e4b6982d8c56885e812" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:29:12.201383 systemd[1]: Started cri-containerd-717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e.scope - libcontainer container 717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e. Mar 25 01:29:12.203252 systemd-networkd[1395]: calic6c23cf1e0f: Link UP Mar 25 01:29:12.205690 systemd-networkd[1395]: calic6c23cf1e0f: Gained carrier Mar 25 01:29:12.227435 containerd[1497]: time="2025-03-25T01:29:12.227352963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d6g5n,Uid:e1099ad4-bb29-4692-8b74-1f06f414103c,Namespace:kube-system,Attempt:0,} returns sandbox id \"17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362\"" Mar 25 01:29:12.235804 containerd[1497]: time="2025-03-25T01:29:12.235733651Z" level=info msg="CreateContainer within sandbox \"17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:11.806 [INFO][4341] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0 calico-apiserver-77b65b67d6- calico-apiserver 35d02384-32ff-4cc9-8558-540e7f668e10 707 0 2025-03-25 01:28:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:77b65b67d6 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-6-22e9b0bb97 calico-apiserver-77b65b67d6-57pch eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic6c23cf1e0f [] []}} ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:11.807 [INFO][4341] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:11.858 [INFO][4363] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" HandleID="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:11.985 [INFO][4363] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" HandleID="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000318760), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-6-22e9b0bb97", "pod":"calico-apiserver-77b65b67d6-57pch", "timestamp":"2025-03-25 
01:29:11.858735846 +0000 UTC"}, Hostname:"ci-4284-0-0-6-22e9b0bb97", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:11.985 [INFO][4363] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.044 [INFO][4363] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.044 [INFO][4363] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-6-22e9b0bb97' Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.053 [INFO][4363] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.089 [INFO][4363] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.108 [INFO][4363] ipam/ipam.go 489: Trying affinity for 192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.115 [INFO][4363] ipam/ipam.go 155: Attempting to load block cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.124 [INFO][4363] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.98.192/26 host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.125 [INFO][4363] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.98.192/26 handle="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 
containerd[1497]: 2025-03-25 01:29:12.131 [INFO][4363] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204 Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.148 [INFO][4363] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.98.192/26 handle="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.179 [INFO][4363] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.98.198/26] block=192.168.98.192/26 handle="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.179 [INFO][4363] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.98.198/26] handle="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" host="ci-4284-0-0-6-22e9b0bb97" Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.181 [INFO][4363] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:29:12.238380 containerd[1497]: 2025-03-25 01:29:12.182 [INFO][4363] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.98.198/26] IPv6=[] ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" HandleID="k8s-pod-network.ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Workload="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" Mar 25 01:29:12.239121 containerd[1497]: 2025-03-25 01:29:12.190 [INFO][4341] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0", GenerateName:"calico-apiserver-77b65b67d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"35d02384-32ff-4cc9-8558-540e7f668e10", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77b65b67d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"", Pod:"calico-apiserver-77b65b67d6-57pch", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.98.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic6c23cf1e0f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:12.239121 containerd[1497]: 2025-03-25 01:29:12.190 [INFO][4341] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.98.198/32] ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" Mar 25 01:29:12.239121 containerd[1497]: 2025-03-25 01:29:12.191 [INFO][4341] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic6c23cf1e0f ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" Mar 25 01:29:12.239121 containerd[1497]: 2025-03-25 01:29:12.207 [INFO][4341] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" Mar 25 01:29:12.239121 containerd[1497]: 2025-03-25 01:29:12.214 [INFO][4341] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0", GenerateName:"calico-apiserver-77b65b67d6-", Namespace:"calico-apiserver", SelfLink:"", UID:"35d02384-32ff-4cc9-8558-540e7f668e10", ResourceVersion:"707", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 28, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"77b65b67d6", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-6-22e9b0bb97", ContainerID:"ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204", Pod:"calico-apiserver-77b65b67d6-57pch", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.98.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic6c23cf1e0f", MAC:"06:eb:3e:27:6f:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:29:12.239121 containerd[1497]: 2025-03-25 01:29:12.233 [INFO][4341] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" Namespace="calico-apiserver" Pod="calico-apiserver-77b65b67d6-57pch" WorkloadEndpoint="ci--4284--0--0--6--22e9b0bb97-k8s-calico--apiserver--77b65b67d6--57pch-eth0" Mar 25 01:29:12.253620 containerd[1497]: time="2025-03-25T01:29:12.253063818Z" level=info msg="Container 
bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:29:12.272628 containerd[1497]: time="2025-03-25T01:29:12.272572021Z" level=info msg="CreateContainer within sandbox \"17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793\"" Mar 25 01:29:12.275182 containerd[1497]: time="2025-03-25T01:29:12.275139358Z" level=info msg="StartContainer for \"bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793\"" Mar 25 01:29:12.277903 containerd[1497]: time="2025-03-25T01:29:12.277855983Z" level=info msg="connecting to shim bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793" address="unix:///run/containerd/s/03a74a858e0686d9b1c6784314e1e7d97441d20e3e831b6db155a0a6608fdbcb" protocol=ttrpc version=3 Mar 25 01:29:12.290916 systemd-networkd[1395]: cali2343d5c8e51: Gained IPv6LL Mar 25 01:29:12.337494 containerd[1497]: time="2025-03-25T01:29:12.334892353Z" level=info msg="connecting to shim ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204" address="unix:///run/containerd/s/fac41be0d8299493ec5c872288df0b19e49b611573c16b4eb56cc7dac524e981" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:29:12.350577 systemd[1]: Started cri-containerd-bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793.scope - libcontainer container bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793. 
Mar 25 01:29:12.383325 containerd[1497]: time="2025-03-25T01:29:12.383153253Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-t6zpq,Uid:455e7431-91bf-4680-a141-9fa18af89c18,Namespace:calico-system,Attempt:0,} returns sandbox id \"717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e\"" Mar 25 01:29:12.412883 systemd[1]: Started cri-containerd-ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204.scope - libcontainer container ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204. Mar 25 01:29:12.427516 containerd[1497]: time="2025-03-25T01:29:12.426602337Z" level=info msg="StartContainer for \"bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793\" returns successfully" Mar 25 01:29:12.501199 containerd[1497]: time="2025-03-25T01:29:12.501087879Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-77b65b67d6-57pch,Uid:35d02384-32ff-4cc9-8558-540e7f668e10,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204\"" Mar 25 01:29:12.738098 systemd-networkd[1395]: calib331ea41d10: Gained IPv6LL Mar 25 01:29:12.738726 systemd-networkd[1395]: calia4e92c28674: Gained IPv6LL Mar 25 01:29:12.856425 kubelet[2795]: I0325 01:29:12.856337 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-d6g5n" podStartSLOduration=40.856316273 podStartE2EDuration="40.856316273s" podCreationTimestamp="2025-03-25 01:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:29:12.854353648 +0000 UTC m=+46.390467384" watchObservedRunningTime="2025-03-25 01:29:12.856316273 +0000 UTC m=+46.392430049" Mar 25 01:29:13.250086 systemd-networkd[1395]: cali66c0a249a04: Gained IPv6LL Mar 25 01:29:13.506297 systemd-networkd[1395]: calic6c23cf1e0f: Gained IPv6LL Mar 25 01:29:13.699369 
systemd-networkd[1395]: cali7cc5c46091c: Gained IPv6LL Mar 25 01:29:14.181769 containerd[1497]: time="2025-03-25T01:29:14.181687413Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:14.183431 containerd[1497]: time="2025-03-25T01:29:14.183086170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=32560257" Mar 25 01:29:14.185866 containerd[1497]: time="2025-03-25T01:29:14.184363441Z" level=info msg="ImageCreate event name:\"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:14.188014 containerd[1497]: time="2025-03-25T01:29:14.187974159Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:14.188712 containerd[1497]: time="2025-03-25T01:29:14.188674238Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"33929982\" in 2.97321356s" Mar 25 01:29:14.188835 containerd[1497]: time="2025-03-25T01:29:14.188714360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:39a6e91a11a792441d34dccf5e11416a0fd297782f169fdb871a5558ad50b229\"" Mar 25 01:29:14.197674 containerd[1497]: time="2025-03-25T01:29:14.197540005Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:29:14.219626 containerd[1497]: time="2025-03-25T01:29:14.219526254Z" 
level=info msg="CreateContainer within sandbox \"be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:29:14.235686 containerd[1497]: time="2025-03-25T01:29:14.234500557Z" level=info msg="Container a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:29:14.246594 containerd[1497]: time="2025-03-25T01:29:14.246537418Z" level=info msg="CreateContainer within sandbox \"be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\"" Mar 25 01:29:14.251469 containerd[1497]: time="2025-03-25T01:29:14.251416527Z" level=info msg="StartContainer for \"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\"" Mar 25 01:29:14.252789 containerd[1497]: time="2025-03-25T01:29:14.252651554Z" level=info msg="connecting to shim a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3" address="unix:///run/containerd/s/f415ad45dfc441d64da23912278c8352d7bd8f826b025aae97291c4f275f7397" protocol=ttrpc version=3 Mar 25 01:29:14.280040 systemd[1]: Started cri-containerd-a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3.scope - libcontainer container a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3. 
Mar 25 01:29:14.331590 containerd[1497]: time="2025-03-25T01:29:14.331551172Z" level=info msg="StartContainer for \"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" returns successfully" Mar 25 01:29:14.889480 kubelet[2795]: I0325 01:29:14.889124 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-74b96585fb-rvlz4" podStartSLOduration=24.911623032 podStartE2EDuration="27.8890787s" podCreationTimestamp="2025-03-25 01:28:47 +0000 UTC" firstStartedPulling="2025-03-25 01:29:11.213616901 +0000 UTC m=+44.749730677" lastFinishedPulling="2025-03-25 01:29:14.191072569 +0000 UTC m=+47.727186345" observedRunningTime="2025-03-25 01:29:14.886929142 +0000 UTC m=+48.423042958" watchObservedRunningTime="2025-03-25 01:29:14.8890787 +0000 UTC m=+48.425192476" Mar 25 01:29:15.892564 containerd[1497]: time="2025-03-25T01:29:15.892442735Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"4fa2b76d266b71fe9c877ec579a62b4d2bb8d3b4e3fb360e3a7f08761206e55c\" pid:4639 exited_at:{seconds:1742866155 nanos:891355435}" Mar 25 01:29:17.060435 containerd[1497]: time="2025-03-25T01:29:17.059976287Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:17.061852 containerd[1497]: time="2025-03-25T01:29:17.061756109Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=40253267" Mar 25 01:29:17.063477 containerd[1497]: time="2025-03-25T01:29:17.063042822Z" level=info msg="ImageCreate event name:\"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:17.067710 containerd[1497]: time="2025-03-25T01:29:17.067647365Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:17.069405 containerd[1497]: time="2025-03-25T01:29:17.069269378Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 2.871398314s" Mar 25 01:29:17.070229 containerd[1497]: time="2025-03-25T01:29:17.069758806Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:29:17.084972 containerd[1497]: time="2025-03-25T01:29:17.084462685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:29:17.089935 containerd[1497]: time="2025-03-25T01:29:17.089761467Z" level=info msg="CreateContainer within sandbox \"b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:29:17.114975 containerd[1497]: time="2025-03-25T01:29:17.113897284Z" level=info msg="Container 023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:29:17.116288 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3132422469.mount: Deactivated successfully. 
Mar 25 01:29:17.151486 containerd[1497]: time="2025-03-25T01:29:17.151338660Z" level=info msg="CreateContainer within sandbox \"b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607\"" Mar 25 01:29:17.152365 containerd[1497]: time="2025-03-25T01:29:17.152329277Z" level=info msg="StartContainer for \"023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607\"" Mar 25 01:29:17.154385 containerd[1497]: time="2025-03-25T01:29:17.154307590Z" level=info msg="connecting to shim 023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607" address="unix:///run/containerd/s/0ebb8d116f693c0cc026cc95132c9e30f69485dc1c3a413acc82b95513837200" protocol=ttrpc version=3 Mar 25 01:29:17.193027 systemd[1]: Started cri-containerd-023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607.scope - libcontainer container 023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607. 
Mar 25 01:29:17.257354 containerd[1497]: time="2025-03-25T01:29:17.257288585Z" level=info msg="StartContainer for \"023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607\" returns successfully" Mar 25 01:29:18.416843 containerd[1497]: time="2025-03-25T01:29:18.416324264Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:18.418173 containerd[1497]: time="2025-03-25T01:29:18.418107767Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7473801" Mar 25 01:29:18.420698 containerd[1497]: time="2025-03-25T01:29:18.419418202Z" level=info msg="ImageCreate event name:\"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:18.422196 containerd[1497]: time="2025-03-25T01:29:18.422157121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:18.423851 containerd[1497]: time="2025-03-25T01:29:18.423719131Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"8843558\" in 1.339161761s" Mar 25 01:29:18.424015 containerd[1497]: time="2025-03-25T01:29:18.423997747Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:f39063099e467ddd9d84500bfd4d97c404bb5f706a2161afc8979f4a94b8ad0b\"" Mar 25 01:29:18.426340 containerd[1497]: time="2025-03-25T01:29:18.426171632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 
01:29:18.429023 containerd[1497]: time="2025-03-25T01:29:18.428965033Z" level=info msg="CreateContainer within sandbox \"717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:29:18.457899 containerd[1497]: time="2025-03-25T01:29:18.457846140Z" level=info msg="Container 37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:29:18.478700 containerd[1497]: time="2025-03-25T01:29:18.476902120Z" level=info msg="CreateContainer within sandbox \"717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817\"" Mar 25 01:29:18.480675 containerd[1497]: time="2025-03-25T01:29:18.479719442Z" level=info msg="StartContainer for \"37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817\"" Mar 25 01:29:18.482150 containerd[1497]: time="2025-03-25T01:29:18.482115501Z" level=info msg="connecting to shim 37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817" address="unix:///run/containerd/s/874bd27b5487c818d06fe5291a9c87d116993619c1c47e4b6982d8c56885e812" protocol=ttrpc version=3 Mar 25 01:29:18.514925 systemd[1]: Started cri-containerd-37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817.scope - libcontainer container 37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817. 
Mar 25 01:29:18.567087 containerd[1497]: time="2025-03-25T01:29:18.566863991Z" level=info msg="StartContainer for \"37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817\" returns successfully" Mar 25 01:29:18.789083 containerd[1497]: time="2025-03-25T01:29:18.788962568Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:18.790350 containerd[1497]: time="2025-03-25T01:29:18.790038870Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 25 01:29:18.793117 containerd[1497]: time="2025-03-25T01:29:18.793076765Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"41623040\" in 366.439706ms" Mar 25 01:29:18.793117 containerd[1497]: time="2025-03-25T01:29:18.793116407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:15defb01cf01d9d97dc594b25d63dee89192c67a6c991b6a78d49fa834325f4e\"" Mar 25 01:29:18.795950 containerd[1497]: time="2025-03-25T01:29:18.795569669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:29:18.797660 containerd[1497]: time="2025-03-25T01:29:18.797619347Z" level=info msg="CreateContainer within sandbox \"ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:29:18.809141 containerd[1497]: time="2025-03-25T01:29:18.809092049Z" level=info msg="Container 80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:29:18.819446 
systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount835109773.mount: Deactivated successfully. Mar 25 01:29:18.826930 containerd[1497]: time="2025-03-25T01:29:18.826786470Z" level=info msg="CreateContainer within sandbox \"ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93\"" Mar 25 01:29:18.828153 containerd[1497]: time="2025-03-25T01:29:18.828045023Z" level=info msg="StartContainer for \"80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93\"" Mar 25 01:29:18.834746 containerd[1497]: time="2025-03-25T01:29:18.833177919Z" level=info msg="connecting to shim 80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93" address="unix:///run/containerd/s/fac41be0d8299493ec5c872288df0b19e49b611573c16b4eb56cc7dac524e981" protocol=ttrpc version=3 Mar 25 01:29:18.863382 systemd[1]: Started cri-containerd-80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93.scope - libcontainer container 80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93. 
Mar 25 01:29:18.880876 kubelet[2795]: I0325 01:29:18.880797 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:29:18.922662 containerd[1497]: time="2025-03-25T01:29:18.922445590Z" level=info msg="StartContainer for \"80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93\" returns successfully" Mar 25 01:29:19.902732 kubelet[2795]: I0325 01:29:19.902581 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-77b65b67d6-rf4c2" podStartSLOduration=28.086850405 podStartE2EDuration="33.902557398s" podCreationTimestamp="2025-03-25 01:28:46 +0000 UTC" firstStartedPulling="2025-03-25 01:29:11.267366052 +0000 UTC m=+44.803479828" lastFinishedPulling="2025-03-25 01:29:17.083073085 +0000 UTC m=+50.619186821" observedRunningTime="2025-03-25 01:29:17.884685381 +0000 UTC m=+51.420799157" watchObservedRunningTime="2025-03-25 01:29:19.902557398 +0000 UTC m=+53.438671254" Mar 25 01:29:20.442690 containerd[1497]: time="2025-03-25T01:29:20.442624573Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:20.446644 containerd[1497]: time="2025-03-25T01:29:20.445952570Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13121717" Mar 25 01:29:20.448264 containerd[1497]: time="2025-03-25T01:29:20.448174741Z" level=info msg="ImageCreate event name:\"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:20.451872 containerd[1497]: time="2025-03-25T01:29:20.451556740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:29:20.453855 
containerd[1497]: time="2025-03-25T01:29:20.453221358Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"14491426\" in 1.657610607s" Mar 25 01:29:20.453855 containerd[1497]: time="2025-03-25T01:29:20.453269761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:5b766f5f5d1b2ccc7c16f12d59c6c17c490ae33a8973c1fa7b2bcf3b8aa5098a\"" Mar 25 01:29:20.459524 containerd[1497]: time="2025-03-25T01:29:20.459481687Z" level=info msg="CreateContainer within sandbox \"717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:29:20.473834 containerd[1497]: time="2025-03-25T01:29:20.473658443Z" level=info msg="Container 26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:29:20.486552 containerd[1497]: time="2025-03-25T01:29:20.486411675Z" level=info msg="CreateContainer within sandbox \"717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0\"" Mar 25 01:29:20.487451 containerd[1497]: time="2025-03-25T01:29:20.487222082Z" level=info msg="StartContainer for \"26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0\"" Mar 25 01:29:20.490836 containerd[1497]: time="2025-03-25T01:29:20.490497876Z" level=info msg="connecting to shim 26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0" 
address="unix:///run/containerd/s/874bd27b5487c818d06fe5291a9c87d116993619c1c47e4b6982d8c56885e812" protocol=ttrpc version=3 Mar 25 01:29:20.523065 systemd[1]: Started cri-containerd-26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0.scope - libcontainer container 26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0. Mar 25 01:29:20.617352 containerd[1497]: time="2025-03-25T01:29:20.617279189Z" level=info msg="StartContainer for \"26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0\" returns successfully" Mar 25 01:29:20.732309 kubelet[2795]: I0325 01:29:20.732184 2795 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 25 01:29:20.732309 kubelet[2795]: I0325 01:29:20.732241 2795 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 25 01:29:20.912942 kubelet[2795]: I0325 01:29:20.912861 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-t6zpq" podStartSLOduration=25.841272768 podStartE2EDuration="33.911976081s" podCreationTimestamp="2025-03-25 01:28:47 +0000 UTC" firstStartedPulling="2025-03-25 01:29:12.385035434 +0000 UTC m=+45.921149210" lastFinishedPulling="2025-03-25 01:29:20.455738787 +0000 UTC m=+53.991852523" observedRunningTime="2025-03-25 01:29:20.90906651 +0000 UTC m=+54.445180326" watchObservedRunningTime="2025-03-25 01:29:20.911976081 +0000 UTC m=+54.448089857" Mar 25 01:29:20.914048 kubelet[2795]: I0325 01:29:20.913025 2795 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-77b65b67d6-57pch" podStartSLOduration=28.621982341 podStartE2EDuration="34.913020263s" podCreationTimestamp="2025-03-25 01:28:46 +0000 UTC" firstStartedPulling="2025-03-25 01:29:12.503685258 +0000 UTC m=+46.039799034" 
lastFinishedPulling="2025-03-25 01:29:18.79472318 +0000 UTC m=+52.330836956" observedRunningTime="2025-03-25 01:29:19.9015703 +0000 UTC m=+53.437684076" watchObservedRunningTime="2025-03-25 01:29:20.913020263 +0000 UTC m=+54.449134039" Mar 25 01:29:21.896076 kubelet[2795]: I0325 01:29:21.895722 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:29:22.121162 containerd[1497]: time="2025-03-25T01:29:22.121110291Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"6c314ad5790762e06e8194c85d09945379a81c146ab41fa2e554d3080bfddba5\" pid:4813 exited_at:{seconds:1742866162 nanos:120127432}" Mar 25 01:29:37.283201 containerd[1497]: time="2025-03-25T01:29:37.283148895Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"51ef3ed12fb2b5b5d0cada685ab67cbdb8d7a06a0d0e585355027a2d84289495\" pid:4859 exited_at:{seconds:1742866177 nanos:282762829}" Mar 25 01:29:41.564786 systemd[1]: Started sshd@8-78.46.211.139:22-207.154.232.101:6116.service - OpenSSH per-connection server daemon (207.154.232.101:6116). 
Mar 25 01:29:42.252631 containerd[1497]: time="2025-03-25T01:29:42.252577958Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"ae4b76851f66ecb5e1692eba9c3a2727b9fbc66af161b64b41d4fea0a6750a45\" pid:4885 exited_at:{seconds:1742866182 nanos:251060975}" Mar 25 01:29:42.320735 sshd[4870]: kex_protocol_error: type 20 seq 2 [preauth] Mar 25 01:29:42.320735 sshd[4870]: kex_protocol_error: type 30 seq 3 [preauth] Mar 25 01:29:43.535630 sshd[4870]: kex_protocol_error: type 20 seq 4 [preauth] Mar 25 01:29:43.535630 sshd[4870]: kex_protocol_error: type 30 seq 5 [preauth] Mar 25 01:29:45.556493 sshd[4870]: kex_protocol_error: type 20 seq 6 [preauth] Mar 25 01:29:45.556493 sshd[4870]: kex_protocol_error: type 30 seq 7 [preauth] Mar 25 01:29:47.730185 kubelet[2795]: I0325 01:29:47.729528 2795 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:29:52.081156 containerd[1497]: time="2025-03-25T01:29:52.081115435Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"f5493ed7fecb07a695c2d408d8ff1eb6ec3f83d0c9ef2595bf2eba2678ee620d\" pid:4915 exited_at:{seconds:1742866192 nanos:80459748}" Mar 25 01:30:12.238174 containerd[1497]: time="2025-03-25T01:30:12.238094788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"168a7a31384f5c95a7ec3b3f4ae90db7dd083e91521cbb9df66facd3220047d2\" pid:4943 exited_at:{seconds:1742866212 nanos:237628833}" Mar 25 01:30:13.538952 sshd[4870]: Connection reset by 207.154.232.101 port 6116 [preauth] Mar 25 01:30:13.540281 systemd[1]: sshd@8-78.46.211.139:22-207.154.232.101:6116.service: Deactivated successfully. 
Mar 25 01:30:22.091796 containerd[1497]: time="2025-03-25T01:30:22.091711198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"a93d663c00ca394513616d4c321987ba8bf971326cf1f808840b272623ac5cb9\" pid:4968 exited_at:{seconds:1742866222 nanos:91202305}" Mar 25 01:30:37.302490 containerd[1497]: time="2025-03-25T01:30:37.302396274Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"c206ee642045abb6012825f600b0250ed0081d1a7ce7dbc5a8fa1efe36d0e26a\" pid:5023 exited_at:{seconds:1742866237 nanos:301960127}" Mar 25 01:30:42.240800 containerd[1497]: time="2025-03-25T01:30:42.240743436Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"b90d00a551b0ce069b88430ac2672c2ab740b2c86fa0c1af5955a50f76cea197\" pid:5042 exited_at:{seconds:1742866242 nanos:240360525}" Mar 25 01:30:52.107630 containerd[1497]: time="2025-03-25T01:30:52.106453564Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"4f24a2709683153d590ea74143ec2d4fc0b7951fde87aa451cecac6c948d04d4\" pid:5064 exited_at:{seconds:1742866252 nanos:105704091}" Mar 25 01:31:12.238443 containerd[1497]: time="2025-03-25T01:31:12.238381807Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"c6021d249dcdd4b1f486d2aaa7bd3a45242ad842749e2f2c0ac1b0076851e29c\" pid:5094 exited_at:{seconds:1742866272 nanos:237960883}" Mar 25 01:31:22.072758 containerd[1497]: time="2025-03-25T01:31:22.072544641Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"70a9a5e93f624c9089e15e1228c6faf8674ec8486794b5a322cd822cb441fa72\" 
pid:5121 exited_at:{seconds:1742866282 nanos:72170074}" Mar 25 01:31:37.329617 containerd[1497]: time="2025-03-25T01:31:37.329521862Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"0407588cd2dccd76088d1f52b9a3a5658bd8c6f5500ce14768ae53db5187d05c\" pid:5149 exited_at:{seconds:1742866297 nanos:328969327}" Mar 25 01:31:42.247329 containerd[1497]: time="2025-03-25T01:31:42.247268675Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"66fb4e5efe50d095a86a0e29f5c41cdeaac35bb1bd4050261b0830e1989881c5\" pid:5171 exited_at:{seconds:1742866302 nanos:246947906}" Mar 25 01:31:52.084190 containerd[1497]: time="2025-03-25T01:31:52.084143285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"3caeef19ff2aa2c4d656ab786038723fad5322e1ba0d950cae521724df769dfb\" pid:5198 exited_at:{seconds:1742866312 nanos:83636267}" Mar 25 01:32:12.234819 containerd[1497]: time="2025-03-25T01:32:12.234654363Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"0a9c60be4ae958cff1232d19c2cc74ca9b2f4256fb2800e8dff11bbab00d3385\" pid:5244 exited_at:{seconds:1742866332 nanos:233783966}" Mar 25 01:32:22.086214 containerd[1497]: time="2025-03-25T01:32:22.086164237Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"5faf16c1b0cf54502f32834b57e23c12d30a009d3318c489f08696c38f58f116\" pid:5266 exited_at:{seconds:1742866342 nanos:85526048}" Mar 25 01:32:37.283186 containerd[1497]: time="2025-03-25T01:32:37.283087600Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" 
id:\"b08a9b3212a0465e9acbbf5cb13b68191d62328e6f84840a7b0664b6a1ed590c\" pid:5293 exited_at:{seconds:1742866357 nanos:282858469}" Mar 25 01:32:42.233999 containerd[1497]: time="2025-03-25T01:32:42.233619483Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"177867de02c18738f10aa7e69d02077df5eee2c63227fd842a08212c5e828fb8\" pid:5315 exited_at:{seconds:1742866362 nanos:233137259}" Mar 25 01:32:52.069562 containerd[1497]: time="2025-03-25T01:32:52.069460943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"2611b22c7dcaa758344920aa9eb2478845d123bd23480e94cff85c44be083a78\" pid:5336 exited_at:{seconds:1742866372 nanos:68858712}" Mar 25 01:33:08.030069 systemd[1]: Started sshd@9-78.46.211.139:22-139.178.89.65:37944.service - OpenSSH per-connection server daemon (139.178.89.65:37944). Mar 25 01:33:09.042660 sshd[5354]: Accepted publickey for core from 139.178.89.65 port 37944 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:09.045744 sshd-session[5354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:09.054727 systemd-logind[1475]: New session 8 of user core. Mar 25 01:33:09.065101 systemd[1]: Started session-8.scope - Session 8 of User core. Mar 25 01:33:09.851351 sshd[5356]: Connection closed by 139.178.89.65 port 37944 Mar 25 01:33:09.852242 sshd-session[5354]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:09.861932 systemd[1]: sshd@9-78.46.211.139:22-139.178.89.65:37944.service: Deactivated successfully. Mar 25 01:33:09.865206 systemd[1]: session-8.scope: Deactivated successfully. Mar 25 01:33:09.866928 systemd-logind[1475]: Session 8 logged out. Waiting for processes to exit. Mar 25 01:33:09.869312 systemd-logind[1475]: Removed session 8. 
Mar 25 01:33:12.238214 containerd[1497]: time="2025-03-25T01:33:12.238166870Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"4366e0da9f6fe69ccb2bc34f8c0eb86b77814b1a1bb01de69c2abfc152002c22\" pid:5384 exited_at:{seconds:1742866392 nanos:237624041}" Mar 25 01:33:15.028650 systemd[1]: Started sshd@10-78.46.211.139:22-139.178.89.65:37960.service - OpenSSH per-connection server daemon (139.178.89.65:37960). Mar 25 01:33:16.073477 sshd[5393]: Accepted publickey for core from 139.178.89.65 port 37960 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:16.076666 sshd-session[5393]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:16.085903 systemd-logind[1475]: New session 9 of user core. Mar 25 01:33:16.093374 systemd[1]: Started session-9.scope - Session 9 of User core. Mar 25 01:33:16.863957 sshd[5395]: Connection closed by 139.178.89.65 port 37960 Mar 25 01:33:16.864608 sshd-session[5393]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:16.871365 systemd[1]: sshd@10-78.46.211.139:22-139.178.89.65:37960.service: Deactivated successfully. Mar 25 01:33:16.874237 systemd[1]: session-9.scope: Deactivated successfully. Mar 25 01:33:16.875350 systemd-logind[1475]: Session 9 logged out. Waiting for processes to exit. Mar 25 01:33:16.877039 systemd-logind[1475]: Removed session 9. Mar 25 01:33:17.041244 systemd[1]: Started sshd@11-78.46.211.139:22-139.178.89.65:37968.service - OpenSSH per-connection server daemon (139.178.89.65:37968). Mar 25 01:33:18.042826 sshd[5407]: Accepted publickey for core from 139.178.89.65 port 37968 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:18.045471 sshd-session[5407]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:18.051279 systemd-logind[1475]: New session 10 of user core. 
Mar 25 01:33:18.059117 systemd[1]: Started session-10.scope - Session 10 of User core. Mar 25 01:33:18.856906 sshd[5409]: Connection closed by 139.178.89.65 port 37968 Mar 25 01:33:18.857973 sshd-session[5407]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:18.865484 systemd[1]: sshd@11-78.46.211.139:22-139.178.89.65:37968.service: Deactivated successfully. Mar 25 01:33:18.868757 systemd[1]: session-10.scope: Deactivated successfully. Mar 25 01:33:18.870345 systemd-logind[1475]: Session 10 logged out. Waiting for processes to exit. Mar 25 01:33:18.871664 systemd-logind[1475]: Removed session 10. Mar 25 01:33:19.028804 systemd[1]: Started sshd@12-78.46.211.139:22-139.178.89.65:34006.service - OpenSSH per-connection server daemon (139.178.89.65:34006). Mar 25 01:33:20.030852 sshd[5419]: Accepted publickey for core from 139.178.89.65 port 34006 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:20.032934 sshd-session[5419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:20.038722 systemd-logind[1475]: New session 11 of user core. Mar 25 01:33:20.047050 systemd[1]: Started session-11.scope - Session 11 of User core. Mar 25 01:33:20.853855 sshd[5421]: Connection closed by 139.178.89.65 port 34006 Mar 25 01:33:20.855956 sshd-session[5419]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:20.864439 systemd[1]: sshd@12-78.46.211.139:22-139.178.89.65:34006.service: Deactivated successfully. Mar 25 01:33:20.870190 systemd[1]: session-11.scope: Deactivated successfully. Mar 25 01:33:20.875060 systemd-logind[1475]: Session 11 logged out. Waiting for processes to exit. Mar 25 01:33:20.876841 systemd-logind[1475]: Removed session 11. 
Mar 25 01:33:21.057065 containerd[1497]: time="2025-03-25T01:33:21.056950388Z" level=warning msg="container event discarded" container=e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024 type=CONTAINER_CREATED_EVENT Mar 25 01:33:21.057710 containerd[1497]: time="2025-03-25T01:33:21.057357130Z" level=warning msg="container event discarded" container=e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024 type=CONTAINER_STARTED_EVENT Mar 25 01:33:21.086848 containerd[1497]: time="2025-03-25T01:33:21.086650115Z" level=warning msg="container event discarded" container=7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8 type=CONTAINER_CREATED_EVENT Mar 25 01:33:21.086848 containerd[1497]: time="2025-03-25T01:33:21.086734559Z" level=warning msg="container event discarded" container=7b30b9f3e9896b96ac5743838a8fc19bd2a47ff77c67e20175a44b58c3c5d4e8 type=CONTAINER_STARTED_EVENT Mar 25 01:33:21.098684 containerd[1497]: time="2025-03-25T01:33:21.098465570Z" level=warning msg="container event discarded" container=664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592 type=CONTAINER_CREATED_EVENT Mar 25 01:33:21.098684 containerd[1497]: time="2025-03-25T01:33:21.098541094Z" level=warning msg="container event discarded" container=664e31135accd76096b1e83f82181f116de9ac169c6ef41c37a1693c4386d592 type=CONTAINER_STARTED_EVENT Mar 25 01:33:21.098684 containerd[1497]: time="2025-03-25T01:33:21.098636739Z" level=warning msg="container event discarded" container=110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006 type=CONTAINER_CREATED_EVENT Mar 25 01:33:21.125296 containerd[1497]: time="2025-03-25T01:33:21.125121448Z" level=warning msg="container event discarded" container=f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5 type=CONTAINER_CREATED_EVENT Mar 25 01:33:21.136575 containerd[1497]: time="2025-03-25T01:33:21.136481438Z" level=warning msg="container event discarded" 
container=bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8 type=CONTAINER_CREATED_EVENT Mar 25 01:33:21.203907 containerd[1497]: time="2025-03-25T01:33:21.203779009Z" level=warning msg="container event discarded" container=110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006 type=CONTAINER_STARTED_EVENT Mar 25 01:33:21.262232 containerd[1497]: time="2025-03-25T01:33:21.262125764Z" level=warning msg="container event discarded" container=f43d9153ea7d829f5362a6d6df47a4a9daae7e9ee2e3be5ac92185a84b3948f5 type=CONTAINER_STARTED_EVENT Mar 25 01:33:21.262232 containerd[1497]: time="2025-03-25T01:33:21.262195608Z" level=warning msg="container event discarded" container=bc38fa05279ef37ba0c197cbc6fd4ce4e6370046a1fcfda9198fd8d7bb597ee8 type=CONTAINER_STARTED_EVENT Mar 25 01:33:22.125867 containerd[1497]: time="2025-03-25T01:33:22.125153032Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"7f6ec22fb14c63811cb70a9110e2f2a838918816cd50bd68486dca11b6304727\" pid:5448 exit_status:1 exited_at:{seconds:1742866402 nanos:124641883}" Mar 25 01:33:26.029674 systemd[1]: Started sshd@13-78.46.211.139:22-139.178.89.65:34016.service - OpenSSH per-connection server daemon (139.178.89.65:34016). Mar 25 01:33:27.039501 sshd[5461]: Accepted publickey for core from 139.178.89.65 port 34016 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:27.041493 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:27.048914 systemd-logind[1475]: New session 12 of user core. Mar 25 01:33:27.058145 systemd[1]: Started session-12.scope - Session 12 of User core. 
Mar 25 01:33:27.804834 sshd[5465]: Connection closed by 139.178.89.65 port 34016 Mar 25 01:33:27.803910 sshd-session[5461]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:27.809257 systemd[1]: sshd@13-78.46.211.139:22-139.178.89.65:34016.service: Deactivated successfully. Mar 25 01:33:27.812117 systemd[1]: session-12.scope: Deactivated successfully. Mar 25 01:33:27.813476 systemd-logind[1475]: Session 12 logged out. Waiting for processes to exit. Mar 25 01:33:27.815441 systemd-logind[1475]: Removed session 12. Mar 25 01:33:32.231412 containerd[1497]: time="2025-03-25T01:33:32.231316926Z" level=warning msg="container event discarded" container=dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648 type=CONTAINER_CREATED_EVENT Mar 25 01:33:32.231412 containerd[1497]: time="2025-03-25T01:33:32.231371930Z" level=warning msg="container event discarded" container=dfc11a2936847fb67fca376cd64cbcaf7e3026d775f141b0a086a6a9cac67648 type=CONTAINER_STARTED_EVENT Mar 25 01:33:32.275311 containerd[1497]: time="2025-03-25T01:33:32.275204248Z" level=warning msg="container event discarded" container=9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a type=CONTAINER_CREATED_EVENT Mar 25 01:33:32.344643 containerd[1497]: time="2025-03-25T01:33:32.344553049Z" level=warning msg="container event discarded" container=9b63e7835edc03a839fcf663fb03587ec8b1704e7d0dd2391f213fa90dc56f6a type=CONTAINER_STARTED_EVENT Mar 25 01:33:32.529397 containerd[1497]: time="2025-03-25T01:33:32.529188809Z" level=warning msg="container event discarded" container=33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001 type=CONTAINER_CREATED_EVENT Mar 25 01:33:32.529397 containerd[1497]: time="2025-03-25T01:33:32.529259453Z" level=warning msg="container event discarded" container=33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001 type=CONTAINER_STARTED_EVENT Mar 25 01:33:32.975497 systemd[1]: Started 
sshd@14-78.46.211.139:22-139.178.89.65:52628.service - OpenSSH per-connection server daemon (139.178.89.65:52628). Mar 25 01:33:33.964747 sshd[5480]: Accepted publickey for core from 139.178.89.65 port 52628 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:33.967006 sshd-session[5480]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:33.974948 systemd-logind[1475]: New session 13 of user core. Mar 25 01:33:33.981056 systemd[1]: Started session-13.scope - Session 13 of User core. Mar 25 01:33:34.723457 sshd[5482]: Connection closed by 139.178.89.65 port 52628 Mar 25 01:33:34.724087 sshd-session[5480]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:34.729594 systemd[1]: sshd@14-78.46.211.139:22-139.178.89.65:52628.service: Deactivated successfully. Mar 25 01:33:34.734429 systemd[1]: session-13.scope: Deactivated successfully. Mar 25 01:33:34.737621 systemd-logind[1475]: Session 13 logged out. Waiting for processes to exit. Mar 25 01:33:34.739258 systemd-logind[1475]: Removed session 13. Mar 25 01:33:37.285034 containerd[1497]: time="2025-03-25T01:33:37.284978766Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"2d2ea0ea2dd897f86e84f33c41d1401cabe3d7b08d1832233babdfbb685ef7e2\" pid:5504 exited_at:{seconds:1742866417 nanos:284045913}" Mar 25 01:33:39.902721 systemd[1]: Started sshd@15-78.46.211.139:22-139.178.89.65:33380.service - OpenSSH per-connection server daemon (139.178.89.65:33380). Mar 25 01:33:40.914857 sshd[5519]: Accepted publickey for core from 139.178.89.65 port 33380 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:40.919700 sshd-session[5519]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:40.931609 systemd-logind[1475]: New session 14 of user core. 
Mar 25 01:33:40.941356 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:33:41.690865 sshd[5521]: Connection closed by 139.178.89.65 port 33380 Mar 25 01:33:41.690657 sshd-session[5519]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:41.696505 systemd[1]: sshd@15-78.46.211.139:22-139.178.89.65:33380.service: Deactivated successfully. Mar 25 01:33:41.699867 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:33:41.700836 systemd-logind[1475]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:33:41.702993 systemd-logind[1475]: Removed session 14. Mar 25 01:33:41.861677 systemd[1]: Started sshd@16-78.46.211.139:22-139.178.89.65:33390.service - OpenSSH per-connection server daemon (139.178.89.65:33390). Mar 25 01:33:41.870967 containerd[1497]: time="2025-03-25T01:33:41.870858289Z" level=warning msg="container event discarded" container=bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae type=CONTAINER_CREATED_EVENT Mar 25 01:33:41.942775 containerd[1497]: time="2025-03-25T01:33:41.942544277Z" level=warning msg="container event discarded" container=bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae type=CONTAINER_STARTED_EVENT Mar 25 01:33:42.240208 containerd[1497]: time="2025-03-25T01:33:42.239752288Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"0a23379e096ae7d0a4ded0268e4739f5f17723c1250a6b2ac78e47be68e9c58b\" pid:5548 exited_at:{seconds:1742866422 nanos:238178438}" Mar 25 01:33:42.857977 sshd[5533]: Accepted publickey for core from 139.178.89.65 port 33390 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:42.860531 sshd-session[5533]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:42.874491 systemd-logind[1475]: New session 15 of user core. 
Mar 25 01:33:42.883356 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:33:43.740234 sshd[5557]: Connection closed by 139.178.89.65 port 33390 Mar 25 01:33:43.741062 sshd-session[5533]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:43.749411 systemd-logind[1475]: Session 15 logged out. Waiting for processes to exit. Mar 25 01:33:43.750154 systemd[1]: sshd@16-78.46.211.139:22-139.178.89.65:33390.service: Deactivated successfully. Mar 25 01:33:43.754117 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:33:43.755597 systemd-logind[1475]: Removed session 15. Mar 25 01:33:43.914047 systemd[1]: Started sshd@17-78.46.211.139:22-139.178.89.65:33394.service - OpenSSH per-connection server daemon (139.178.89.65:33394). Mar 25 01:33:44.907865 sshd[5567]: Accepted publickey for core from 139.178.89.65 port 33394 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:44.910258 sshd-session[5567]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:44.915990 systemd-logind[1475]: New session 16 of user core. Mar 25 01:33:44.925205 systemd[1]: Started session-16.scope - Session 16 of User core. 
Mar 25 01:33:47.547839 containerd[1497]: time="2025-03-25T01:33:47.547607282Z" level=warning msg="container event discarded" container=ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0 type=CONTAINER_CREATED_EVENT Mar 25 01:33:47.547839 containerd[1497]: time="2025-03-25T01:33:47.547697527Z" level=warning msg="container event discarded" container=ac0794f373e92eb195b28e158d34495584efcf77aaca8094ff5125835e3fa4a0 type=CONTAINER_STARTED_EVENT Mar 25 01:33:47.641692 containerd[1497]: time="2025-03-25T01:33:47.641597071Z" level=warning msg="container event discarded" container=0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7 type=CONTAINER_CREATED_EVENT Mar 25 01:33:47.641692 containerd[1497]: time="2025-03-25T01:33:47.641668875Z" level=warning msg="container event discarded" container=0981bc70b8474e6e76c8796196ff1c1242759024bd7c9e0fcc729e041c8869e7 type=CONTAINER_STARTED_EVENT Mar 25 01:33:47.750022 sshd[5569]: Connection closed by 139.178.89.65 port 33394 Mar 25 01:33:47.753177 sshd-session[5567]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:47.762117 systemd[1]: sshd@17-78.46.211.139:22-139.178.89.65:33394.service: Deactivated successfully. Mar 25 01:33:47.765975 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:33:47.766499 systemd[1]: session-16.scope: Consumed 613ms CPU time, 73.2M memory peak. Mar 25 01:33:47.767760 systemd-logind[1475]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:33:47.771162 systemd-logind[1475]: Removed session 16. Mar 25 01:33:47.928643 systemd[1]: Started sshd@18-78.46.211.139:22-139.178.89.65:33406.service - OpenSSH per-connection server daemon (139.178.89.65:33406). 
Mar 25 01:33:48.972527 sshd[5587]: Accepted publickey for core from 139.178.89.65 port 33406 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:48.974984 sshd-session[5587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:48.982013 systemd-logind[1475]: New session 17 of user core. Mar 25 01:33:48.992162 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 25 01:33:49.374035 containerd[1497]: time="2025-03-25T01:33:49.373896130Z" level=warning msg="container event discarded" container=e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f type=CONTAINER_CREATED_EVENT Mar 25 01:33:49.468411 containerd[1497]: time="2025-03-25T01:33:49.468306916Z" level=warning msg="container event discarded" container=e97d8c1dd69e4cc99f5b30f297631915063bee21f44268a779b8289f9790b96f type=CONTAINER_STARTED_EVENT Mar 25 01:33:49.883527 sshd[5601]: Connection closed by 139.178.89.65 port 33406 Mar 25 01:33:49.885132 sshd-session[5587]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:49.889713 systemd[1]: sshd@18-78.46.211.139:22-139.178.89.65:33406.service: Deactivated successfully. Mar 25 01:33:49.893143 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:33:49.895634 systemd-logind[1475]: Session 17 logged out. Waiting for processes to exit. Mar 25 01:33:49.897000 systemd-logind[1475]: Removed session 17. Mar 25 01:33:50.062881 systemd[1]: Started sshd@19-78.46.211.139:22-139.178.89.65:37108.service - OpenSSH per-connection server daemon (139.178.89.65:37108). 
Mar 25 01:33:50.665433 containerd[1497]: time="2025-03-25T01:33:50.665352831Z" level=warning msg="container event discarded" container=51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e type=CONTAINER_CREATED_EVENT Mar 25 01:33:50.755880 containerd[1497]: time="2025-03-25T01:33:50.755710909Z" level=warning msg="container event discarded" container=51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e type=CONTAINER_STARTED_EVENT Mar 25 01:33:50.933349 containerd[1497]: time="2025-03-25T01:33:50.933172677Z" level=warning msg="container event discarded" container=51c4ab11e09c6fc53f65967737c38df1cd2adfae24ff83f0071512e09f0a7f2e type=CONTAINER_STOPPED_EVENT Mar 25 01:33:51.064692 sshd[5611]: Accepted publickey for core from 139.178.89.65 port 37108 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:51.067471 sshd-session[5611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:51.074012 systemd-logind[1475]: New session 18 of user core. Mar 25 01:33:51.084409 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 25 01:33:51.834579 sshd[5613]: Connection closed by 139.178.89.65 port 37108 Mar 25 01:33:51.834096 sshd-session[5611]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:51.839889 systemd-logind[1475]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:33:51.840486 systemd[1]: sshd@19-78.46.211.139:22-139.178.89.65:37108.service: Deactivated successfully. Mar 25 01:33:51.843490 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:33:51.847155 systemd-logind[1475]: Removed session 18. 
Mar 25 01:33:52.079360 containerd[1497]: time="2025-03-25T01:33:52.079311236Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"452c12c457cb3d7b4b2ac0992b726e6bec34b88b7984f7d22a0fcdaf07d1929d\" pid:5637 exited_at:{seconds:1742866432 nanos:78838888}" Mar 25 01:33:54.502513 containerd[1497]: time="2025-03-25T01:33:54.502441681Z" level=warning msg="container event discarded" container=8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4 type=CONTAINER_CREATED_EVENT Mar 25 01:33:54.621630 containerd[1497]: time="2025-03-25T01:33:54.621440531Z" level=warning msg="container event discarded" container=8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4 type=CONTAINER_STARTED_EVENT Mar 25 01:33:55.409411 containerd[1497]: time="2025-03-25T01:33:55.409297438Z" level=warning msg="container event discarded" container=8b11635d2d9737186dca7a7e09e1fa5dd349aa7f842803f3cc6416e1b8814cd4 type=CONTAINER_STOPPED_EVENT Mar 25 01:33:57.006593 systemd[1]: Started sshd@20-78.46.211.139:22-139.178.89.65:37124.service - OpenSSH per-connection server daemon (139.178.89.65:37124). Mar 25 01:33:58.011862 sshd[5653]: Accepted publickey for core from 139.178.89.65 port 37124 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:33:58.015958 sshd-session[5653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:33:58.022670 systemd-logind[1475]: New session 19 of user core. Mar 25 01:33:58.030138 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 25 01:33:58.766876 sshd[5655]: Connection closed by 139.178.89.65 port 37124 Mar 25 01:33:58.766662 sshd-session[5653]: pam_unix(sshd:session): session closed for user core Mar 25 01:33:58.771756 systemd[1]: sshd@20-78.46.211.139:22-139.178.89.65:37124.service: Deactivated successfully. Mar 25 01:33:58.777072 systemd[1]: session-19.scope: Deactivated successfully. 
Mar 25 01:33:58.779483 systemd-logind[1475]: Session 19 logged out. Waiting for processes to exit. Mar 25 01:33:58.781339 systemd-logind[1475]: Removed session 19. Mar 25 01:33:59.661042 containerd[1497]: time="2025-03-25T01:33:59.660938840Z" level=warning msg="container event discarded" container=48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440 type=CONTAINER_CREATED_EVENT Mar 25 01:33:59.740119 containerd[1497]: time="2025-03-25T01:33:59.740002188Z" level=warning msg="container event discarded" container=48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440 type=CONTAINER_STARTED_EVENT Mar 25 01:34:03.943417 systemd[1]: Started sshd@21-78.46.211.139:22-139.178.89.65:60016.service - OpenSSH per-connection server daemon (139.178.89.65:60016). Mar 25 01:34:04.955832 sshd[5668]: Accepted publickey for core from 139.178.89.65 port 60016 ssh2: RSA SHA256:Xy1qy6Im1XRHylsMcGES+WKq7CDbUddw+Bozhds0vS4 Mar 25 01:34:04.958925 sshd-session[5668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:34:04.966897 systemd-logind[1475]: New session 20 of user core. Mar 25 01:34:04.973288 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 25 01:34:05.725219 sshd[5670]: Connection closed by 139.178.89.65 port 60016 Mar 25 01:34:05.726097 sshd-session[5668]: pam_unix(sshd:session): session closed for user core Mar 25 01:34:05.732318 systemd[1]: sshd@21-78.46.211.139:22-139.178.89.65:60016.service: Deactivated successfully. Mar 25 01:34:05.737663 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:34:05.738958 systemd-logind[1475]: Session 20 logged out. Waiting for processes to exit. Mar 25 01:34:05.740276 systemd-logind[1475]: Removed session 20. 
Mar 25 01:34:11.215018 containerd[1497]: time="2025-03-25T01:34:11.214870460Z" level=warning msg="container event discarded" container=be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24 type=CONTAINER_CREATED_EVENT Mar 25 01:34:11.215018 containerd[1497]: time="2025-03-25T01:34:11.214980747Z" level=warning msg="container event discarded" container=be23acdcca48310e71c118420cfd6621cb31294deb345b3330303ae69fd53b24 type=CONTAINER_STARTED_EVENT Mar 25 01:34:11.275372 containerd[1497]: time="2025-03-25T01:34:11.275235633Z" level=warning msg="container event discarded" container=b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2 type=CONTAINER_CREATED_EVENT Mar 25 01:34:11.275372 containerd[1497]: time="2025-03-25T01:34:11.275307518Z" level=warning msg="container event discarded" container=b648c3727878e7b766fb88b629e7c27d9fec934aa8571b7800d5a49f639269a2 type=CONTAINER_STARTED_EVENT Mar 25 01:34:11.335926 containerd[1497]: time="2025-03-25T01:34:11.335840741Z" level=warning msg="container event discarded" container=ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b type=CONTAINER_CREATED_EVENT Mar 25 01:34:11.335926 containerd[1497]: time="2025-03-25T01:34:11.335894944Z" level=warning msg="container event discarded" container=ea217660112ff83e4046dea54fcfec20ae5a97d58b4e7446bc1868fcf301d48b type=CONTAINER_STARTED_EVENT Mar 25 01:34:11.371400 containerd[1497]: time="2025-03-25T01:34:11.371260158Z" level=warning msg="container event discarded" container=cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c type=CONTAINER_CREATED_EVENT Mar 25 01:34:11.434761 containerd[1497]: time="2025-03-25T01:34:11.434660791Z" level=warning msg="container event discarded" container=cb5e444bcb8729282f529feb42cb7793a9cdd6fd961d4a2b799e7745405fdf1c type=CONTAINER_STARTED_EVENT Mar 25 01:34:12.237559 containerd[1497]: time="2025-03-25T01:34:12.237364998Z" level=warning msg="container event discarded" 
container=17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362 type=CONTAINER_CREATED_EVENT Mar 25 01:34:12.237559 containerd[1497]: time="2025-03-25T01:34:12.237456844Z" level=warning msg="container event discarded" container=17c97c7f0112ed0bca36364c443376d0de749dd1f86eed287f8021bb8330a362 type=CONTAINER_STARTED_EVENT Mar 25 01:34:12.243475 containerd[1497]: time="2025-03-25T01:34:12.243054736Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3\" id:\"2f8e0e56075ac228ee7a1aeb108952ab1302cca6b60cda6a393bcec0d6fbb2f3\" pid:5693 exited_at:{seconds:1742866452 nanos:242446380}" Mar 25 01:34:12.277305 containerd[1497]: time="2025-03-25T01:34:12.277187958Z" level=warning msg="container event discarded" container=bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793 type=CONTAINER_CREATED_EVENT Mar 25 01:34:12.393785 containerd[1497]: time="2025-03-25T01:34:12.393670139Z" level=warning msg="container event discarded" container=717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e type=CONTAINER_CREATED_EVENT Mar 25 01:34:12.393785 containerd[1497]: time="2025-03-25T01:34:12.393739463Z" level=warning msg="container event discarded" container=717aa05a4394c67bf9e2b92678f03d1ac27d7675178169acdddfd572c4d29f9e type=CONTAINER_STARTED_EVENT Mar 25 01:34:12.434145 containerd[1497]: time="2025-03-25T01:34:12.434009609Z" level=warning msg="container event discarded" container=bb6b024cdc098adbe4bd15536304dc57fdc1cd68bbf58e1abb8b2ca2af843793 type=CONTAINER_STARTED_EVENT Mar 25 01:34:12.511700 containerd[1497]: time="2025-03-25T01:34:12.511330389Z" level=warning msg="container event discarded" container=ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204 type=CONTAINER_CREATED_EVENT Mar 25 01:34:12.511700 containerd[1497]: time="2025-03-25T01:34:12.511389553Z" level=warning msg="container event discarded" 
container=ddbc94aff68853abb6ace3841353595e0f60dfcc96094f7a63b7ee29d68f1204 type=CONTAINER_STARTED_EVENT Mar 25 01:34:14.255249 containerd[1497]: time="2025-03-25T01:34:14.255141212Z" level=warning msg="container event discarded" container=a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3 type=CONTAINER_CREATED_EVENT Mar 25 01:34:14.339286 containerd[1497]: time="2025-03-25T01:34:14.339152837Z" level=warning msg="container event discarded" container=a7f6153ab1b9b3f024b6ada473ce83040fb82f3c1b15c44758fa2fa4983d5fc3 type=CONTAINER_STARTED_EVENT Mar 25 01:34:17.143692 containerd[1497]: time="2025-03-25T01:34:17.143578369Z" level=warning msg="container event discarded" container=023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607 type=CONTAINER_CREATED_EVENT Mar 25 01:34:17.266325 containerd[1497]: time="2025-03-25T01:34:17.266231104Z" level=warning msg="container event discarded" container=023590956b5111ec5c10e07befc7b0393f7d787811e66a60cf115be27f48f607 type=CONTAINER_STARTED_EVENT Mar 25 01:34:18.477316 containerd[1497]: time="2025-03-25T01:34:18.477233473Z" level=warning msg="container event discarded" container=37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817 type=CONTAINER_CREATED_EVENT Mar 25 01:34:18.575946 containerd[1497]: time="2025-03-25T01:34:18.575795619Z" level=warning msg="container event discarded" container=37ab32382f4b7313f81558445da9f663351fdf43f5affdd751357706be3cb817 type=CONTAINER_STARTED_EVENT Mar 25 01:34:18.836791 containerd[1497]: time="2025-03-25T01:34:18.836577782Z" level=warning msg="container event discarded" container=80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93 type=CONTAINER_CREATED_EVENT Mar 25 01:34:18.930970 containerd[1497]: time="2025-03-25T01:34:18.930879915Z" level=warning msg="container event discarded" container=80e59368f96c678b4328ae3933a104cbd007de4911d670f4fd3de4a8437b6b93 type=CONTAINER_STARTED_EVENT Mar 25 01:34:19.863894 update_engine[1476]: I20250325 
01:34:19.862663 1476 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Mar 25 01:34:19.863894 update_engine[1476]: I20250325 01:34:19.862766 1476 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Mar 25 01:34:19.863894 update_engine[1476]: I20250325 01:34:19.863209 1476 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Mar 25 01:34:19.865573 update_engine[1476]: I20250325 01:34:19.865495 1476 omaha_request_params.cc:62] Current group set to alpha
Mar 25 01:34:19.868171 update_engine[1476]: I20250325 01:34:19.867632 1476 update_attempter.cc:499] Already updated boot flags. Skipping.
Mar 25 01:34:19.868171 update_engine[1476]: I20250325 01:34:19.868080 1476 update_attempter.cc:643] Scheduling an action processor start.
Mar 25 01:34:19.868171 update_engine[1476]: I20250325 01:34:19.868111 1476 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 25 01:34:19.873374 update_engine[1476]: I20250325 01:34:19.869755 1476 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Mar 25 01:34:19.873374 update_engine[1476]: I20250325 01:34:19.869889 1476 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 25 01:34:19.873374 update_engine[1476]: I20250325 01:34:19.869901 1476 omaha_request_action.cc:272] Request:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]:
Mar 25 01:34:19.873374 update_engine[1476]: I20250325 01:34:19.869907 1476 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:34:19.876000 update_engine[1476]: I20250325 01:34:19.875933 1476 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:34:19.876774 update_engine[1476]: I20250325 01:34:19.876644 1476 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:34:19.876887 locksmithd[1511]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Mar 25 01:34:19.879035 update_engine[1476]: E20250325 01:34:19.878789 1476 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:34:19.879035 update_engine[1476]: I20250325 01:34:19.878949 1476 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Mar 25 01:34:20.494598 containerd[1497]: time="2025-03-25T01:34:20.494515231Z" level=warning msg="container event discarded" container=26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0 type=CONTAINER_CREATED_EVENT
Mar 25 01:34:20.625446 containerd[1497]: time="2025-03-25T01:34:20.625294667Z" level=warning msg="container event discarded" container=26358f388d2025361f047f7e2ee3488b5d6fc5d65ae8083cb92aded8f81b4cd0 type=CONTAINER_STARTED_EVENT
Mar 25 01:34:21.570606 systemd[1]: cri-containerd-110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006.scope: Deactivated successfully.
Mar 25 01:34:21.570982 systemd[1]: cri-containerd-110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006.scope: Consumed 6.473s CPU time, 58M memory peak, 3.3M read from disk.
Mar 25 01:34:21.579354 containerd[1497]: time="2025-03-25T01:34:21.578050164Z" level=info msg="received exit event container_id:\"110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006\" id:\"110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006\" pid:2626 exit_status:1 exited_at:{seconds:1742866461 nanos:576918576}"
Mar 25 01:34:21.580935 containerd[1497]: time="2025-03-25T01:34:21.580378463Z" level=info msg="TaskExit event in podsandbox handler container_id:\"110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006\" id:\"110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006\" pid:2626 exit_status:1 exited_at:{seconds:1742866461 nanos:576918576}"
Mar 25 01:34:21.610230 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006-rootfs.mount: Deactivated successfully.
Mar 25 01:34:21.863622 kubelet[2795]: I0325 01:34:21.863564 2795 scope.go:117] "RemoveContainer" containerID="110a62049e4d4db95ee2ef19440dab5f292ebf3b319ad9e9dbe7e11d30306006"
Mar 25 01:34:21.867053 containerd[1497]: time="2025-03-25T01:34:21.866566894Z" level=info msg="CreateContainer within sandbox \"e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 25 01:34:21.878943 containerd[1497]: time="2025-03-25T01:34:21.878899470Z" level=info msg="Container ed9b7cc1988961376741cd14776a67ef9cf33d324387e97e56b374524ab039af: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:34:21.892598 containerd[1497]: time="2025-03-25T01:34:21.892545724Z" level=info msg="CreateContainer within sandbox \"e8ac010563523da5fb45689be3d9910c2d613c18d836b8a03afd62bb034b3024\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"ed9b7cc1988961376741cd14776a67ef9cf33d324387e97e56b374524ab039af\""
Mar 25 01:34:21.894901 containerd[1497]: time="2025-03-25T01:34:21.893547104Z" level=info msg="StartContainer for \"ed9b7cc1988961376741cd14776a67ef9cf33d324387e97e56b374524ab039af\""
Mar 25 01:34:21.895049 containerd[1497]: time="2025-03-25T01:34:21.895025472Z" level=info msg="connecting to shim ed9b7cc1988961376741cd14776a67ef9cf33d324387e97e56b374524ab039af" address="unix:///run/containerd/s/d3586e5775577857858f36607ab0842535a9c69bf2822a3ce89409b2757d354f" protocol=ttrpc version=3
Mar 25 01:34:21.926038 systemd[1]: Started cri-containerd-ed9b7cc1988961376741cd14776a67ef9cf33d324387e97e56b374524ab039af.scope - libcontainer container ed9b7cc1988961376741cd14776a67ef9cf33d324387e97e56b374524ab039af.
Mar 25 01:34:21.981565 containerd[1497]: time="2025-03-25T01:34:21.981509191Z" level=info msg="StartContainer for \"ed9b7cc1988961376741cd14776a67ef9cf33d324387e97e56b374524ab039af\" returns successfully"
Mar 25 01:34:21.989300 kubelet[2795]: E0325 01:34:21.989207 2795 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40974->10.0.0.2:2379: read: connection timed out"
Mar 25 01:34:22.060240 systemd[1]: cri-containerd-bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae.scope: Deactivated successfully.
Mar 25 01:34:22.063092 systemd[1]: cri-containerd-bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae.scope: Consumed 7.016s CPU time, 41.4M memory peak.
Mar 25 01:34:22.068043 containerd[1497]: time="2025-03-25T01:34:22.067663253Z" level=info msg="received exit event container_id:\"bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae\" id:\"bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae\" pid:3148 exit_status:1 exited_at:{seconds:1742866462 nanos:67322632}"
Mar 25 01:34:22.068043 containerd[1497]: time="2025-03-25T01:34:22.067751538Z" level=info msg="TaskExit event in podsandbox handler container_id:\"bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae\" id:\"bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae\" pid:3148 exit_status:1 exited_at:{seconds:1742866462 nanos:67322632}"
Mar 25 01:34:22.107867 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae-rootfs.mount: Deactivated successfully.
Mar 25 01:34:22.150081 containerd[1497]: time="2025-03-25T01:34:22.149756113Z" level=info msg="TaskExit event in podsandbox handler container_id:\"48216c06ec2fcc708460762897ab7fddaecd1e84bee19354646d8525b9cb5440\" id:\"9e8bb9fa831eab75c148086106fcab06baaf61087cfc5e5b7ec893d3fdb0e900\" pid:5758 exited_at:{seconds:1742866462 nanos:149072712}"
Mar 25 01:34:22.866373 kubelet[2795]: I0325 01:34:22.865147 2795 scope.go:117] "RemoveContainer" containerID="bb4266f414617d23abda7a0741560d58b9b4a155d5b9027173e68d09e4144bae"
Mar 25 01:34:22.867211 containerd[1497]: time="2025-03-25T01:34:22.867172978Z" level=info msg="CreateContainer within sandbox \"33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 25 01:34:22.880845 containerd[1497]: time="2025-03-25T01:34:22.877118211Z" level=info msg="Container c11e6b0e0af0ce835c2f1b12b2b276143b68c610aae1ccc2eb5395b7d196167a: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:34:22.887317 containerd[1497]: time="2025-03-25T01:34:22.887268497Z" level=info msg="CreateContainer within sandbox \"33821242d6c0ed7bf73316718acd331853bb05d9f203fabee8de7989f05a4001\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c11e6b0e0af0ce835c2f1b12b2b276143b68c610aae1ccc2eb5395b7d196167a\""
Mar 25 01:34:22.888091 containerd[1497]: time="2025-03-25T01:34:22.888033223Z" level=info msg="StartContainer for \"c11e6b0e0af0ce835c2f1b12b2b276143b68c610aae1ccc2eb5395b7d196167a\""
Mar 25 01:34:22.889245 containerd[1497]: time="2025-03-25T01:34:22.889197052Z" level=info msg="connecting to shim c11e6b0e0af0ce835c2f1b12b2b276143b68c610aae1ccc2eb5395b7d196167a" address="unix:///run/containerd/s/4bde231bb79f3650548149cec450d6fc9a59105a3ee6a6fd0793bc9447ca4665" protocol=ttrpc version=3
Mar 25 01:34:22.914066 systemd[1]: Started cri-containerd-c11e6b0e0af0ce835c2f1b12b2b276143b68c610aae1ccc2eb5395b7d196167a.scope - libcontainer container c11e6b0e0af0ce835c2f1b12b2b276143b68c610aae1ccc2eb5395b7d196167a.
Mar 25 01:34:22.953441 containerd[1497]: time="2025-03-25T01:34:22.953396765Z" level=info msg="StartContainer for \"c11e6b0e0af0ce835c2f1b12b2b276143b68c610aae1ccc2eb5395b7d196167a\" returns successfully"
Mar 25 01:34:25.578747 kubelet[2795]: E0325 01:34:25.576897 2795 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:40782->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4284-0-0-6-22e9b0bb97.182fe7ce8cd9f8a8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4284-0-0-6-22e9b0bb97,UID:d47669bb28818cbe71f3f62ea49c10d2,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-6-22e9b0bb97,},FirstTimestamp:2025-03-25 01:34:15.128045736 +0000 UTC m=+348.664159552,LastTimestamp:2025-03-25 01:34:15.128045736 +0000 UTC m=+348.664159552,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-6-22e9b0bb97,}"