Mar 25 01:49:53.824101 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 24 23:38:35 -00 2025
Mar 25 01:49:53.824132 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:49:53.824145 kernel: BIOS-provided physical RAM map:
Mar 25 01:49:53.824151 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 25 01:49:53.824156 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 25 01:49:53.824161 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 25 01:49:53.824167 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Mar 25 01:49:53.824172 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Mar 25 01:49:53.824179 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 25 01:49:53.824184 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 25 01:49:53.824189 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 25 01:49:53.824194 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 25 01:49:53.824199 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 25 01:49:53.824204 kernel: NX (Execute Disable) protection: active
Mar 25 01:49:53.824210 kernel: APIC: Static calls initialized
Mar 25 01:49:53.824217 kernel: SMBIOS 3.0.0 present.
Mar 25 01:49:53.824223 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 25 01:49:53.824228 kernel: Hypervisor detected: KVM
Mar 25 01:49:53.824233 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 25 01:49:53.824239 kernel: kvm-clock: using sched offset of 3233357365 cycles
Mar 25 01:49:53.824244 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 25 01:49:53.824250 kernel: tsc: Detected 2445.406 MHz processor
Mar 25 01:49:53.824256 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 25 01:49:53.824267 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 25 01:49:53.824280 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Mar 25 01:49:53.824290 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 25 01:49:53.824300 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 25 01:49:53.824310 kernel: Using GB pages for direct mapping
Mar 25 01:49:53.824316 kernel: ACPI: Early table checksum verification disabled
Mar 25 01:49:53.824359 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Mar 25 01:49:53.824366 kernel: ACPI: RSDT 0x000000007CFE265D 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:49:53.824372 kernel: ACPI: FACP 0x000000007CFE244D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:49:53.824377 kernel: ACPI: DSDT 0x000000007CFE0040 00240D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:49:53.824385 kernel: ACPI: FACS 0x000000007CFE0000 000040
Mar 25 01:49:53.824391 kernel: ACPI: APIC 0x000000007CFE2541 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:49:53.824397 kernel: ACPI: HPET 0x000000007CFE25C1 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:49:53.824402 kernel: ACPI: MCFG 0x000000007CFE25F9 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:49:53.824408 kernel: ACPI: WAET 0x000000007CFE2635 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:49:53.824413 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe244d-0x7cfe2540]
Mar 25 01:49:53.824419 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe244c]
Mar 25 01:49:53.824427 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Mar 25 01:49:53.824434 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2541-0x7cfe25c0]
Mar 25 01:49:53.824439 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25c1-0x7cfe25f8]
Mar 25 01:49:53.824445 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe25f9-0x7cfe2634]
Mar 25 01:49:53.824451 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe2635-0x7cfe265c]
Mar 25 01:49:53.824457 kernel: No NUMA configuration found
Mar 25 01:49:53.824462 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Mar 25 01:49:53.824469 kernel: NODE_DATA(0) allocated [mem 0x7cfd6000-0x7cfdbfff]
Mar 25 01:49:53.824475 kernel: Zone ranges:
Mar 25 01:49:53.824481 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 25 01:49:53.824487 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Mar 25 01:49:53.824492 kernel: Normal empty
Mar 25 01:49:53.824498 kernel: Movable zone start for each node
Mar 25 01:49:53.824504 kernel: Early memory node ranges
Mar 25 01:49:53.824509 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 25 01:49:53.824515 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Mar 25 01:49:53.824524 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Mar 25 01:49:53.824536 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 25 01:49:53.824547 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 25 01:49:53.824557 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Mar 25 01:49:53.824580 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 25 01:49:53.824590 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 25 01:49:53.824596 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 25 01:49:53.824601 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 25 01:49:53.824607 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 25 01:49:53.824613 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 25 01:49:53.824624 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 25 01:49:53.824634 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 25 01:49:53.824645 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 25 01:49:53.824655 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 25 01:49:53.824665 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 25 01:49:53.824674 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 25 01:49:53.824680 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 25 01:49:53.824685 kernel: Booting paravirtualized kernel on KVM
Mar 25 01:49:53.824691 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 25 01:49:53.824700 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 25 01:49:53.824705 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 25 01:49:53.824711 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 25 01:49:53.824719 kernel: pcpu-alloc: [0] 0 1
Mar 25 01:49:53.824729 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 25 01:49:53.824742 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:49:53.824752 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 25 01:49:53.824762 kernel: random: crng init done
Mar 25 01:49:53.824770 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 25 01:49:53.824776 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 25 01:49:53.824782 kernel: Fallback order for Node 0: 0
Mar 25 01:49:53.824787 kernel: Built 1 zonelists, mobility grouping on. Total pages: 503708
Mar 25 01:49:53.824793 kernel: Policy zone: DMA32
Mar 25 01:49:53.824799 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 25 01:49:53.824808 kernel: Memory: 1917956K/2047464K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 129248K reserved, 0K cma-reserved)
Mar 25 01:49:53.824819 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 25 01:49:53.824829 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 25 01:49:53.824842 kernel: ftrace: allocated 149 pages with 4 groups
Mar 25 01:49:53.824851 kernel: Dynamic Preempt: voluntary
Mar 25 01:49:53.824860 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 25 01:49:53.824869 kernel: rcu: RCU event tracing is enabled.
Mar 25 01:49:53.824879 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 25 01:49:53.824888 kernel: Trampoline variant of Tasks RCU enabled.
Mar 25 01:49:53.824898 kernel: Rude variant of Tasks RCU enabled.
Mar 25 01:49:53.824907 kernel: Tracing variant of Tasks RCU enabled.
Mar 25 01:49:53.824916 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 25 01:49:53.824928 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 25 01:49:53.824937 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 25 01:49:53.824946 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 25 01:49:53.824956 kernel: Console: colour VGA+ 80x25
Mar 25 01:49:53.824967 kernel: printk: console [tty0] enabled
Mar 25 01:49:53.824977 kernel: printk: console [ttyS0] enabled
Mar 25 01:49:53.824987 kernel: ACPI: Core revision 20230628
Mar 25 01:49:53.824996 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 25 01:49:53.825006 kernel: APIC: Switch to symmetric I/O mode setup
Mar 25 01:49:53.825017 kernel: x2apic enabled
Mar 25 01:49:53.825033 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 25 01:49:53.825043 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 25 01:49:53.825054 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 25 01:49:53.825061 kernel: Calibrating delay loop (skipped) preset value.. 4890.81 BogoMIPS (lpj=2445406)
Mar 25 01:49:53.825067 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 25 01:49:53.825073 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 25 01:49:53.825079 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 25 01:49:53.825090 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 25 01:49:53.825096 kernel: Spectre V2 : Mitigation: Retpolines
Mar 25 01:49:53.825102 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 25 01:49:53.825108 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 25 01:49:53.825115 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Mar 25 01:49:53.825121 kernel: RETBleed: Mitigation: untrained return thunk
Mar 25 01:49:53.825128 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 25 01:49:53.825134 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 25 01:49:53.825140 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 25 01:49:53.825147 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 25 01:49:53.825153 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 25 01:49:53.825159 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 25 01:49:53.825165 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 25 01:49:53.825171 kernel: Freeing SMP alternatives memory: 32K
Mar 25 01:49:53.825177 kernel: pid_max: default: 32768 minimum: 301
Mar 25 01:49:53.825183 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 25 01:49:53.825189 kernel: landlock: Up and running.
Mar 25 01:49:53.825195 kernel: SELinux: Initializing.
Mar 25 01:49:53.825202 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 25 01:49:53.825209 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 25 01:49:53.825215 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Mar 25 01:49:53.825221 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:49:53.825227 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:49:53.825233 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:49:53.825239 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 25 01:49:53.825245 kernel: ... version: 0
Mar 25 01:49:53.825251 kernel: ... bit width: 48
Mar 25 01:49:53.825258 kernel: ... generic registers: 6
Mar 25 01:49:53.825264 kernel: ... value mask: 0000ffffffffffff
Mar 25 01:49:53.825270 kernel: ... max period: 00007fffffffffff
Mar 25 01:49:53.825276 kernel: ... fixed-purpose events: 0
Mar 25 01:49:53.825281 kernel: ... event mask: 000000000000003f
Mar 25 01:49:53.825287 kernel: signal: max sigframe size: 1776
Mar 25 01:49:53.825293 kernel: rcu: Hierarchical SRCU implementation.
Mar 25 01:49:53.825300 kernel: rcu: Max phase no-delay instances is 400.
Mar 25 01:49:53.825306 kernel: smp: Bringing up secondary CPUs ...
Mar 25 01:49:53.825313 kernel: smpboot: x86: Booting SMP configuration:
Mar 25 01:49:53.825319 kernel: .... node #0, CPUs: #1
Mar 25 01:49:53.826018 kernel: smp: Brought up 1 node, 2 CPUs
Mar 25 01:49:53.826029 kernel: smpboot: Max logical packages: 1
Mar 25 01:49:53.826035 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS)
Mar 25 01:49:53.826042 kernel: devtmpfs: initialized
Mar 25 01:49:53.826048 kernel: x86/mm: Memory block size: 128MB
Mar 25 01:49:53.826054 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 25 01:49:53.826060 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 25 01:49:53.826070 kernel: pinctrl core: initialized pinctrl subsystem
Mar 25 01:49:53.826076 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 25 01:49:53.826082 kernel: audit: initializing netlink subsys (disabled)
Mar 25 01:49:53.826088 kernel: audit: type=2000 audit(1742867392.347:1): state=initialized audit_enabled=0 res=1
Mar 25 01:49:53.826094 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 25 01:49:53.826100 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 25 01:49:53.826106 kernel: cpuidle: using governor menu
Mar 25 01:49:53.826112 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 25 01:49:53.826118 kernel: dca service started, version 1.12.1
Mar 25 01:49:53.826126 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 25 01:49:53.826132 kernel: PCI: Using configuration type 1 for base access
Mar 25 01:49:53.826138 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 25 01:49:53.826144 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 25 01:49:53.826150 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 25 01:49:53.826156 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 25 01:49:53.826162 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 25 01:49:53.826168 kernel: ACPI: Added _OSI(Module Device)
Mar 25 01:49:53.826174 kernel: ACPI: Added _OSI(Processor Device)
Mar 25 01:49:53.826181 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 25 01:49:53.826187 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 25 01:49:53.826193 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 25 01:49:53.826199 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 25 01:49:53.826205 kernel: ACPI: Interpreter enabled
Mar 25 01:49:53.826211 kernel: ACPI: PM: (supports S0 S5)
Mar 25 01:49:53.826217 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 25 01:49:53.826223 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 25 01:49:53.826229 kernel: PCI: Using E820 reservations for host bridge windows
Mar 25 01:49:53.826236 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 25 01:49:53.826242 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 25 01:49:53.827709 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 25 01:49:53.827790 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 25 01:49:53.827856 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 25 01:49:53.827865 kernel: PCI host bridge to bus 0000:00
Mar 25 01:49:53.827934 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 25 01:49:53.827998 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 25 01:49:53.828055 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 25 01:49:53.828112 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Mar 25 01:49:53.828172 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 25 01:49:53.828229 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Mar 25 01:49:53.828286 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 25 01:49:53.828411 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 25 01:49:53.828497 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Mar 25 01:49:53.828578 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfb800000-0xfbffffff pref]
Mar 25 01:49:53.828647 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfd200000-0xfd203fff 64bit pref]
Mar 25 01:49:53.828710 kernel: pci 0000:00:01.0: reg 0x20: [mem 0xfea10000-0xfea10fff]
Mar 25 01:49:53.828774 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea00000-0xfea0ffff pref]
Mar 25 01:49:53.828838 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 25 01:49:53.828912 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.828977 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea11000-0xfea11fff]
Mar 25 01:49:53.829047 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.829111 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea12000-0xfea12fff]
Mar 25 01:49:53.829178 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.829243 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea13000-0xfea13fff]
Mar 25 01:49:53.829316 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.830231 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea14000-0xfea14fff]
Mar 25 01:49:53.830311 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.830403 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea15000-0xfea15fff]
Mar 25 01:49:53.830477 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.830615 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea16000-0xfea16fff]
Mar 25 01:49:53.830702 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.830767 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea17000-0xfea17fff]
Mar 25 01:49:53.830836 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.830899 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea18000-0xfea18fff]
Mar 25 01:49:53.830970 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Mar 25 01:49:53.831034 kernel: pci 0000:00:03.0: reg 0x10: [mem 0xfea19000-0xfea19fff]
Mar 25 01:49:53.831102 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 25 01:49:53.831171 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 25 01:49:53.831239 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 25 01:49:53.831303 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc040-0xc05f]
Mar 25 01:49:53.831392 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea1a000-0xfea1afff]
Mar 25 01:49:53.831463 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 25 01:49:53.831526 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 25 01:49:53.831629 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Mar 25 01:49:53.831698 kernel: pci 0000:01:00.0: reg 0x14: [mem 0xfe880000-0xfe880fff]
Mar 25 01:49:53.831764 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 25 01:49:53.831829 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfe800000-0xfe87ffff pref]
Mar 25 01:49:53.831893 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 25 01:49:53.831957 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 25 01:49:53.832020 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Mar 25 01:49:53.832095 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 25 01:49:53.832161 kernel: pci 0000:02:00.0: reg 0x10: [mem 0xfe600000-0xfe603fff 64bit]
Mar 25 01:49:53.832225 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 25 01:49:53.832288 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 25 01:49:53.832434 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:49:53.832514 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Mar 25 01:49:53.832600 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfe400000-0xfe400fff]
Mar 25 01:49:53.832689 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xfcc00000-0xfcc03fff 64bit pref]
Mar 25 01:49:53.832771 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 25 01:49:53.832836 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 25 01:49:53.832899 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:49:53.832970 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Mar 25 01:49:53.833037 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 25 01:49:53.833105 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 25 01:49:53.833166 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 25 01:49:53.833229 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:49:53.833300 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 25 01:49:53.833386 kernel: pci 0000:05:00.0: reg 0x14: [mem 0xfe000000-0xfe000fff]
Mar 25 01:49:53.833453 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xfc800000-0xfc803fff 64bit pref]
Mar 25 01:49:53.833515 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 25 01:49:53.833591 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 25 01:49:53.833661 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:49:53.833754 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Mar 25 01:49:53.833859 kernel: pci 0000:06:00.0: reg 0x14: [mem 0xfde00000-0xfde00fff]
Mar 25 01:49:53.833928 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xfc600000-0xfc603fff 64bit pref]
Mar 25 01:49:53.833992 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 25 01:49:53.834056 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Mar 25 01:49:53.834117 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:49:53.834129 kernel: acpiphp: Slot [0] registered
Mar 25 01:49:53.834201 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Mar 25 01:49:53.834267 kernel: pci 0000:07:00.0: reg 0x14: [mem 0xfdc80000-0xfdc80fff]
Mar 25 01:49:53.834350 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xfc400000-0xfc403fff 64bit pref]
Mar 25 01:49:53.834421 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfdc00000-0xfdc7ffff pref]
Mar 25 01:49:53.837425 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 25 01:49:53.837516 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 25 01:49:53.837596 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:49:53.837609 kernel: acpiphp: Slot [0-2] registered
Mar 25 01:49:53.837723 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 25 01:49:53.837825 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Mar 25 01:49:53.837893 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:49:53.837902 kernel: acpiphp: Slot [0-3] registered
Mar 25 01:49:53.837964 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 25 01:49:53.838026 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 25 01:49:53.838087 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:49:53.838099 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 25 01:49:53.838105 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 25 01:49:53.838111 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 25 01:49:53.838118 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 25 01:49:53.838124 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 25 01:49:53.838130 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 25 01:49:53.838136 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 25 01:49:53.838142 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 25 01:49:53.838148 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 25 01:49:53.838155 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 25 01:49:53.838161 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 25 01:49:53.838167 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 25 01:49:53.838173 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 25 01:49:53.838179 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 25 01:49:53.838185 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 25 01:49:53.838191 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 25 01:49:53.838197 kernel: iommu: Default domain type: Translated
Mar 25 01:49:53.838204 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 25 01:49:53.838211 kernel: PCI: Using ACPI for IRQ routing
Mar 25 01:49:53.838217 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 25 01:49:53.838223 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 25 01:49:53.838230 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Mar 25 01:49:53.838294 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 25 01:49:53.838449 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 25 01:49:53.838514 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 25 01:49:53.838523 kernel: vgaarb: loaded
Mar 25 01:49:53.838530 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 25 01:49:53.838539 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 25 01:49:53.838545 kernel: clocksource: Switched to clocksource kvm-clock
Mar 25 01:49:53.838552 kernel: VFS: Disk quotas dquot_6.6.0
Mar 25 01:49:53.838558 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 25 01:49:53.838577 kernel: pnp: PnP ACPI init
Mar 25 01:49:53.838648 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 25 01:49:53.838658 kernel: pnp: PnP ACPI: found 5 devices
Mar 25 01:49:53.838665 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 25 01:49:53.838673 kernel: NET: Registered PF_INET protocol family
Mar 25 01:49:53.838680 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 25 01:49:53.838686 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 25 01:49:53.838692 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 25 01:49:53.838699 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 25 01:49:53.838705 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 25 01:49:53.838711 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 25 01:49:53.838718 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 25 01:49:53.838724 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 25 01:49:53.838731 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 25 01:49:53.838737 kernel: NET: Registered PF_XDP protocol family
Mar 25 01:49:53.838802 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 25 01:49:53.838867 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 25 01:49:53.838929 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 25 01:49:53.838993 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Mar 25 01:49:53.839062 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Mar 25 01:49:53.839195 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Mar 25 01:49:53.839268 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 25 01:49:53.840401 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 25 01:49:53.840490 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Mar 25 01:49:53.840557 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 25 01:49:53.840638 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 25 01:49:53.840744 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:49:53.840821 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 25 01:49:53.840891 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 25 01:49:53.841010 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:49:53.841084 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 25 01:49:53.841148 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 25 01:49:53.841210 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:49:53.841274 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 25 01:49:53.841361 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 25 01:49:53.841438 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:49:53.841500 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 25 01:49:53.841605 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Mar 25 01:49:53.841718 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:49:53.841786 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 25 01:49:53.841848 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Mar 25 01:49:53.841911 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 25 01:49:53.841974 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:49:53.842036 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 25 01:49:53.842097 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Mar 25 01:49:53.842166 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Mar 25 01:49:53.842229 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:49:53.842291 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 25 01:49:53.842902 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Mar 25 01:49:53.843039 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 25 01:49:53.843148 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:49:53.843256 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 25 01:49:53.843321 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 25 01:49:53.843460 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 25 01:49:53.843525 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Mar 25 01:49:53.843620 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 25 01:49:53.843680 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Mar 25 01:49:53.843747 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Mar 25 01:49:53.843807 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Mar 25 01:49:53.843875 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Mar 25 01:49:53.843934 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:49:53.843998 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Mar 25 01:49:53.844113 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:49:53.844192 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Mar 25 01:49:53.844253 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:49:53.844316 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Mar 25 01:49:53.844513 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:49:53.844810 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Mar 25 01:49:53.844945 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:49:53.845030 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Mar 25 01:49:53.845124 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Mar 25 01:49:53.845188 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:49:53.845252 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Mar 25 01:49:53.845310 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Mar 25 01:49:53.845491 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:49:53.845583 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Mar 25 01:49:53.845647 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 25 01:49:53.845705 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:49:53.845715 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 25 01:49:53.845722 kernel: PCI: CLS 0 bytes, default 64
Mar 25 01:49:53.845734 kernel: Initialise system trusted keyrings
Mar 25 01:49:53.845746 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Mar 25 01:49:53.845762 kernel: Key type asymmetric registered
Mar 25 01:49:53.845774 kernel: Asymmetric key parser 'x509' registered
Mar 25 01:49:53.845784 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 25 01:49:53.845791 kernel: io scheduler mq-deadline registered
Mar 25 01:49:53.845797 kernel: io scheduler kyber registered
Mar 25 01:49:53.845804 kernel: io scheduler bfq registered
Mar 25 01:49:53.845879 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Mar 25 01:49:53.845945 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Mar 25 01:49:53.846008 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Mar 25 01:49:53.846075 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Mar 25 01:49:53.846137 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Mar 25 01:49:53.846200 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Mar 25 01:49:53.846262 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Mar 25 01:49:53.846337 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Mar 25 01:49:53.846409 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Mar 25 01:49:53.846474 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Mar 25 01:49:53.846590 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Mar 25 01:49:53.846675 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Mar 25 01:49:53.846747 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Mar 25 01:49:53.846810 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Mar 25 01:49:53.846917 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Mar 25 01:49:53.846987 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Mar 25 01:49:53.846997 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Mar 25 01:49:53.847059 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Mar 25 01:49:53.847122 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Mar 25 01:49:53.847131 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 25 01:49:53.847141 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Mar 25 01:49:53.847148 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 25 01:49:53.847154 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 25 01:49:53.847161 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 25 01:49:53.847168 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 25 01:49:53.847174 kernel: serio: i8042 AUX
port at 0x60,0x64 irq 12 Mar 25 01:49:53.847244 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 25 01:49:53.847255 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 25 01:49:53.847314 kernel: rtc_cmos 00:03: registered as rtc0 Mar 25 01:49:53.847433 kernel: rtc_cmos 00:03: setting system clock to 2025-03-25T01:49:53 UTC (1742867393) Mar 25 01:49:53.847493 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 25 01:49:53.847502 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 25 01:49:53.847509 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:49:53.847515 kernel: Segment Routing with IPv6 Mar 25 01:49:53.847522 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:49:53.847529 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:49:53.847538 kernel: Key type dns_resolver registered Mar 25 01:49:53.847545 kernel: IPI shorthand broadcast: enabled Mar 25 01:49:53.847551 kernel: sched_clock: Marking stable (1030006876, 142547556)->(1180881803, -8327371) Mar 25 01:49:53.847558 kernel: registered taskstats version 1 Mar 25 01:49:53.847578 kernel: Loading compiled-in X.509 certificates Mar 25 01:49:53.847585 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: eff01054e94a599f8e404b9a9482f4e2220f5386' Mar 25 01:49:53.847591 kernel: Key type .fscrypt registered Mar 25 01:49:53.847598 kernel: Key type fscrypt-provisioning registered Mar 25 01:49:53.847604 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 25 01:49:53.847612 kernel: ima: Allocated hash algorithm: sha1
Mar 25 01:49:53.847619 kernel: ima: No architecture policies found
Mar 25 01:49:53.847625 kernel: clk: Disabling unused clocks
Mar 25 01:49:53.847631 kernel: Freeing unused kernel image (initmem) memory: 43592K
Mar 25 01:49:53.847638 kernel: Write protecting the kernel read-only data: 40960k
Mar 25 01:49:53.847645 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 25 01:49:53.847651 kernel: Run /init as init process
Mar 25 01:49:53.847657 kernel: with arguments:
Mar 25 01:49:53.847664 kernel: /init
Mar 25 01:49:53.847671 kernel: with environment:
Mar 25 01:49:53.847678 kernel: HOME=/
Mar 25 01:49:53.847684 kernel: TERM=linux
Mar 25 01:49:53.847690 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 25 01:49:53.847698 systemd[1]: Successfully made /usr/ read-only.
Mar 25 01:49:53.847707 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 25 01:49:53.847715 systemd[1]: Detected virtualization kvm.
Mar 25 01:49:53.847722 systemd[1]: Detected architecture x86-64.
Mar 25 01:49:53.847730 systemd[1]: Running in initrd.
Mar 25 01:49:53.847737 systemd[1]: No hostname configured, using default hostname.
Mar 25 01:49:53.847744 systemd[1]: Hostname set to .
Mar 25 01:49:53.847751 systemd[1]: Initializing machine ID from VM UUID.
Mar 25 01:49:53.847757 systemd[1]: Queued start job for default target initrd.target.
Mar 25 01:49:53.847764 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:49:53.847771 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:49:53.847779 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 25 01:49:53.847788 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 25 01:49:53.847795 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 25 01:49:53.847802 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 25 01:49:53.847810 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 25 01:49:53.847817 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 25 01:49:53.847824 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:49:53.847832 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:49:53.847840 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:49:53.847847 systemd[1]: Reached target slices.target - Slice Units.
Mar 25 01:49:53.847854 systemd[1]: Reached target swap.target - Swaps.
Mar 25 01:49:53.847861 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:49:53.847868 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 25 01:49:53.847875 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 25 01:49:53.847882 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 25 01:49:53.847889 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 25 01:49:53.847898 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:49:53.847905 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:49:53.847913 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:49:53.847920 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:49:53.847926 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 25 01:49:53.847933 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 25 01:49:53.847940 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 25 01:49:53.847948 systemd[1]: Starting systemd-fsck-usr.service...
Mar 25 01:49:53.847955 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 25 01:49:53.847963 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 25 01:49:53.847970 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:49:53.847977 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 25 01:49:53.848002 systemd-journald[188]: Collecting audit messages is disabled.
Mar 25 01:49:53.848021 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:49:53.848029 systemd[1]: Finished systemd-fsck-usr.service.
Mar 25 01:49:53.848036 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 25 01:49:53.848044 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 25 01:49:53.848054 kernel: Bridge firewalling registered
Mar 25 01:49:53.848066 systemd-journald[188]: Journal started
Mar 25 01:49:53.848097 systemd-journald[188]: Runtime Journal (/run/log/journal/6db4939356e646b18162c91546f4cf7b) is 4.7M, max 38.3M, 33.5M free.
Mar 25 01:49:53.818556 systemd-modules-load[189]: Inserted module 'overlay'
Mar 25 01:49:53.846662 systemd-modules-load[189]: Inserted module 'br_netfilter'
Mar 25 01:49:53.889176 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 25 01:49:53.888366 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:49:53.888904 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:49:53.889483 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 25 01:49:53.892158 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:49:53.894692 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 25 01:49:53.901047 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 25 01:49:53.903408 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 25 01:49:53.905460 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:49:53.905984 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:49:53.911601 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:49:53.914408 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 25 01:49:53.916387 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:49:53.922420 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:49:53.928588 dracut-cmdline[221]: dracut-dracut-053
Mar 25 01:49:53.930733 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:49:53.945924 systemd-resolved[223]: Positive Trust Anchors:
Mar 25 01:49:53.945934 systemd-resolved[223]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:49:53.945958 systemd-resolved[223]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:49:53.954178 systemd-resolved[223]: Defaulting to hostname 'linux'.
Mar 25 01:49:53.954975 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:49:53.955635 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:49:53.977360 kernel: SCSI subsystem initialized
Mar 25 01:49:53.984343 kernel: Loading iSCSI transport class v2.0-870.
Mar 25 01:49:53.993358 kernel: iscsi: registered transport (tcp)
Mar 25 01:49:54.009352 kernel: iscsi: registered transport (qla4xxx)
Mar 25 01:49:54.009377 kernel: QLogic iSCSI HBA Driver
Mar 25 01:49:54.031718 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 25 01:49:54.032894 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 25 01:49:54.058354 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 25 01:49:54.058398 kernel: device-mapper: uevent: version 1.0.3
Mar 25 01:49:54.058408 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 25 01:49:54.094353 kernel: raid6: avx2x4 gen() 30188 MB/s
Mar 25 01:49:54.111346 kernel: raid6: avx2x2 gen() 30262 MB/s
Mar 25 01:49:54.128438 kernel: raid6: avx2x1 gen() 22174 MB/s
Mar 25 01:49:54.128477 kernel: raid6: using algorithm avx2x2 gen() 30262 MB/s
Mar 25 01:49:54.146601 kernel: raid6: .... xor() 31954 MB/s, rmw enabled
Mar 25 01:49:54.146636 kernel: raid6: using avx2x2 recovery algorithm
Mar 25 01:49:54.164360 kernel: xor: automatically using best checksumming function avx
Mar 25 01:49:54.275353 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 25 01:49:54.281804 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 25 01:49:54.283360 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:49:54.308374 systemd-udevd[407]: Using default interface naming scheme 'v255'.
Mar 25 01:49:54.312057 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:49:54.316412 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 25 01:49:54.330880 dracut-pre-trigger[412]: rd.md=0: removing MD RAID activation
Mar 25 01:49:54.347665 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 25 01:49:54.349130 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 25 01:49:54.389540 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:49:54.392641 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 25 01:49:54.411230 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 25 01:49:54.412542 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 25 01:49:54.413865 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:49:54.414927 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 25 01:49:54.417413 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 25 01:49:54.431897 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 25 01:49:54.449359 kernel: libata version 3.00 loaded.
Mar 25 01:49:54.453452 kernel: ahci 0000:00:1f.2: version 3.0
Mar 25 01:49:54.473272 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 25 01:49:54.473311 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Mar 25 01:49:54.473981 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 25 01:49:54.474739 kernel: scsi host1: ahci
Mar 25 01:49:54.475239 kernel: scsi host2: ahci
Mar 25 01:49:54.475725 kernel: scsi host3: ahci
Mar 25 01:49:54.476111 kernel: scsi host0: Virtio SCSI HBA
Mar 25 01:49:54.476432 kernel: scsi host4: ahci
Mar 25 01:49:54.476517 kernel: scsi host5: ahci
Mar 25 01:49:54.476609 kernel: scsi host6: ahci
Mar 25 01:49:54.476683 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 35
Mar 25 01:49:54.476692 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 35
Mar 25 01:49:54.476699 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 35
Mar 25 01:49:54.476707 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 35
Mar 25 01:49:54.476714 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 35
Mar 25 01:49:54.476721 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 35
Mar 25 01:49:54.476732 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Mar 25 01:49:54.481356 kernel: cryptd: max_cpu_qlen set to 1000
Mar 25 01:49:54.487184 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 25 01:49:54.487284 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:49:54.488877 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:49:54.489959 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:49:54.490619 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:49:54.493434 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:49:54.495475 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:49:54.504230 kernel: ACPI: bus type USB registered
Mar 25 01:49:54.504260 kernel: usbcore: registered new interface driver usbfs
Mar 25 01:49:54.504270 kernel: usbcore: registered new interface driver hub
Mar 25 01:49:54.504278 kernel: usbcore: registered new device driver usb
Mar 25 01:49:54.568598 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:49:54.570463 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:49:54.588189 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:49:54.786058 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 25 01:49:54.786145 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 25 01:49:54.786158 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 25 01:49:54.790437 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 25 01:49:54.790476 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 25 01:49:54.790487 kernel: ata1.00: applying bridge limits
Mar 25 01:49:54.791342 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 25 01:49:54.793735 kernel: ata1.00: configured for UDMA/100
Mar 25 01:49:54.794340 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 25 01:49:54.799403 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 25 01:49:54.850174 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 25 01:49:54.850221 kernel: AES CTR mode by8 optimization enabled
Mar 25 01:49:54.855304 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 25 01:49:54.876608 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Mar 25 01:49:54.877564 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 25 01:49:54.877688 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 25 01:49:54.877776 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Mar 25 01:49:54.877857 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Mar 25 01:49:54.877941 kernel: hub 1-0:1.0: USB hub found
Mar 25 01:49:54.878036 kernel: hub 1-0:1.0: 4 ports detected
Mar 25 01:49:54.878115 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 25 01:49:54.878241 kernel: hub 2-0:1.0: USB hub found
Mar 25 01:49:54.878344 kernel: hub 2-0:1.0: 4 ports detected
Mar 25 01:49:54.880930 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 25 01:49:54.897979 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 25 01:49:54.897993 kernel: sd 0:0:0:0: Power-on or device reset occurred
Mar 25 01:49:54.910422 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Mar 25 01:49:54.910527 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 25 01:49:54.910628 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Mar 25 01:49:54.910709 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Mar 25 01:49:54.910788 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Mar 25 01:49:54.910875 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 25 01:49:54.910886 kernel: GPT:17805311 != 80003071
Mar 25 01:49:54.910893 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 25 01:49:54.910901 kernel: GPT:17805311 != 80003071
Mar 25 01:49:54.910908 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 25 01:49:54.910915 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:49:54.910923 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 25 01:49:54.949349 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (460)
Mar 25 01:49:54.953367 kernel: BTRFS: device fsid 6d9424cd-1432-492b-b006-b311869817e2 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (451)
Mar 25 01:49:54.964293 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Mar 25 01:49:54.973305 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Mar 25 01:49:54.984069 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 25 01:49:54.990473 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Mar 25 01:49:54.990994 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Mar 25 01:49:54.993314 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 25 01:49:55.009226 disk-uuid[576]: Primary Header is updated.
Mar 25 01:49:55.009226 disk-uuid[576]: Secondary Entries is updated.
Mar 25 01:49:55.009226 disk-uuid[576]: Secondary Header is updated.
Mar 25 01:49:55.014376 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:49:55.118363 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 25 01:49:55.255366 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 25 01:49:55.265137 kernel: usbcore: registered new interface driver usbhid
Mar 25 01:49:55.265189 kernel: usbhid: USB HID core driver
Mar 25 01:49:55.275413 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3
Mar 25 01:49:55.275466 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 25 01:49:56.022380 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:49:56.023161 disk-uuid[577]: The operation has completed successfully.
Mar 25 01:49:56.076013 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 25 01:49:56.076087 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 25 01:49:56.105835 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 25 01:49:56.120939 sh[594]: Success
Mar 25 01:49:56.132372 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
Mar 25 01:49:56.179362 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 25 01:49:56.183416 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 25 01:49:56.188901 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 25 01:49:56.201613 kernel: BTRFS info (device dm-0): first mount of filesystem 6d9424cd-1432-492b-b006-b311869817e2
Mar 25 01:49:56.201651 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:49:56.203475 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 25 01:49:56.205389 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 25 01:49:56.206559 kernel: BTRFS info (device dm-0): using free space tree
Mar 25 01:49:56.214348 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Mar 25 01:49:56.216868 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 25 01:49:56.217854 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 25 01:49:56.219085 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 25 01:49:56.221430 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 25 01:49:56.243134 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:49:56.243170 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:49:56.243180 kernel: BTRFS info (device sda6): using free space tree
Mar 25 01:49:56.248551 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 25 01:49:56.248582 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 01:49:56.254345 kernel: BTRFS info (device sda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:49:56.255568 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 25 01:49:56.257689 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 25 01:49:56.288576 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 25 01:49:56.293536 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:49:56.329894 ignition[716]: Ignition 2.20.0
Mar 25 01:49:56.329907 ignition[716]: Stage: fetch-offline
Mar 25 01:49:56.329933 ignition[716]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:49:56.330940 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 25 01:49:56.329940 ignition[716]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:49:56.330002 ignition[716]: parsed url from cmdline: ""
Mar 25 01:49:56.330005 ignition[716]: no config URL provided
Mar 25 01:49:56.330008 ignition[716]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:49:56.330014 ignition[716]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:49:56.335184 systemd-networkd[772]: lo: Link UP
Mar 25 01:49:56.330018 ignition[716]: failed to fetch config: resource requires networking
Mar 25 01:49:56.335187 systemd-networkd[772]: lo: Gained carrier
Mar 25 01:49:56.330250 ignition[716]: Ignition finished successfully
Mar 25 01:49:56.336769 systemd-networkd[772]: Enumeration completed
Mar 25 01:49:56.337055 systemd-networkd[772]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:49:56.337058 systemd-networkd[772]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:49:56.337542 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:49:56.337706 systemd-networkd[772]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:49:56.337708 systemd-networkd[772]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:49:56.338139 systemd-networkd[772]: eth0: Link UP
Mar 25 01:49:56.338142 systemd-networkd[772]: eth0: Gained carrier
Mar 25 01:49:56.338148 systemd-networkd[772]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:49:56.338915 systemd[1]: Reached target network.target - Network.
Mar 25 01:49:56.339549 systemd-networkd[772]: eth1: Link UP
Mar 25 01:49:56.339552 systemd-networkd[772]: eth1: Gained carrier
Mar 25 01:49:56.339557 systemd-networkd[772]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:49:56.340615 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 25 01:49:56.354188 ignition[781]: Ignition 2.20.0
Mar 25 01:49:56.354197 ignition[781]: Stage: fetch
Mar 25 01:49:56.354309 ignition[781]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:49:56.354316 ignition[781]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:49:56.354400 ignition[781]: parsed url from cmdline: ""
Mar 25 01:49:56.354403 ignition[781]: no config URL provided
Mar 25 01:49:56.354406 ignition[781]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:49:56.354412 ignition[781]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:49:56.354429 ignition[781]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 25 01:49:56.354535 ignition[781]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 25 01:49:56.374364 systemd-networkd[772]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 25 01:49:56.408364 systemd-networkd[772]: eth0: DHCPv4 address 95.217.13.107/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 25 01:49:56.555631 ignition[781]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 25 01:49:56.561650 ignition[781]: GET result: OK
Mar 25 01:49:56.561714 ignition[781]: parsing config with SHA512: fb40ddd410fb533289d74972639e479034895ff9811dfad34d614618034db984bb1c9e07f39a76705e4ded5f243faab627251630dc6de830ffdc7bc5426ebf7f
Mar 25 01:49:56.568820 unknown[781]: fetched base config from "system"
Mar 25 01:49:56.568836 unknown[781]: fetched base config from "system"
Mar 25 01:49:56.569166 ignition[781]: fetch: fetch complete
Mar 25 01:49:56.568846 unknown[781]: fetched user config from "hetzner"
Mar 25 01:49:56.569172 ignition[781]: fetch: fetch passed
Mar 25 01:49:56.569210 ignition[781]: Ignition finished successfully
Mar 25 01:49:56.571768 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 25 01:49:56.573717 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 25 01:49:56.593473 ignition[789]: Ignition 2.20.0
Mar 25 01:49:56.593490 ignition[789]: Stage: kargs
Mar 25 01:49:56.593718 ignition[789]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:49:56.593730 ignition[789]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:49:56.594824 ignition[789]: kargs: kargs passed
Mar 25 01:49:56.595653 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 25 01:49:56.594861 ignition[789]: Ignition finished successfully
Mar 25 01:49:56.598774 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 25 01:49:56.612778 ignition[795]: Ignition 2.20.0
Mar 25 01:49:56.612798 ignition[795]: Stage: disks
Mar 25 01:49:56.614398 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 25 01:49:56.612967 ignition[795]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:49:56.618747 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 25 01:49:56.612978 ignition[795]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:49:56.619967 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Mar 25 01:49:56.613721 ignition[795]: disks: disks passed
Mar 25 01:49:56.621090 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 01:49:56.613755 ignition[795]: Ignition finished successfully
Mar 25 01:49:56.622476 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:49:56.623842 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:49:56.627457 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Mar 25 01:49:56.644949 systemd-fsck[804]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Mar 25 01:49:56.646822 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Mar 25 01:49:56.648773 systemd[1]: Mounting sysroot.mount - /sysroot...
Mar 25 01:49:56.721352 kernel: EXT4-fs (sda9): mounted filesystem 4e6dca82-2e50-453c-be25-61f944b72008 r/w with ordered data mode. Quota mode: none.
Mar 25 01:49:56.721800 systemd[1]: Mounted sysroot.mount - /sysroot.
Mar 25 01:49:56.722562 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Mar 25 01:49:56.724357 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:49:56.726391 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Mar 25 01:49:56.728435 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Mar 25 01:49:56.730549 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Mar 25 01:49:56.730574 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 25 01:49:56.737229 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Mar 25 01:49:56.740026 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Mar 25 01:49:56.748345 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (812)
Mar 25 01:49:56.751466 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:49:56.751495 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:49:56.753911 kernel: BTRFS info (device sda6): using free space tree
Mar 25 01:49:56.761685 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 25 01:49:56.761714 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 01:49:56.763615 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:49:56.788017 coreos-metadata[814]: Mar 25 01:49:56.787 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Mar 25 01:49:56.789343 coreos-metadata[814]: Mar 25 01:49:56.789 INFO Fetch successful
Mar 25 01:49:56.790403 coreos-metadata[814]: Mar 25 01:49:56.790 INFO wrote hostname ci-4284-0-0-a-abb47662e0 to /sysroot/etc/hostname
Mar 25 01:49:56.791924 initrd-setup-root[839]: cut: /sysroot/etc/passwd: No such file or directory
Mar 25 01:49:56.792318 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 25 01:49:56.795277 initrd-setup-root[847]: cut: /sysroot/etc/group: No such file or directory
Mar 25 01:49:56.798208 initrd-setup-root[854]: cut: /sysroot/etc/shadow: No such file or directory
Mar 25 01:49:56.800623 initrd-setup-root[861]: cut: /sysroot/etc/gshadow: No such file or directory
Mar 25 01:49:56.857015 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Mar 25 01:49:56.858432 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Mar 25 01:49:56.861180 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Mar 25 01:49:56.870341 kernel: BTRFS info (device sda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:49:56.881248 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Mar 25 01:49:56.888067 ignition[930]: INFO : Ignition 2.20.0
Mar 25 01:49:56.888067 ignition[930]: INFO : Stage: mount
Mar 25 01:49:56.888067 ignition[930]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:49:56.888067 ignition[930]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:49:56.890582 ignition[930]: INFO : mount: mount passed
Mar 25 01:49:56.890582 ignition[930]: INFO : Ignition finished successfully
Mar 25 01:49:56.889103 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Mar 25 01:49:56.891975 systemd[1]: Starting ignition-files.service - Ignition (files)...
Mar 25 01:49:57.200906 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Mar 25 01:49:57.203434 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Mar 25 01:49:57.231379 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (941)
Mar 25 01:49:57.237089 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:49:57.237141 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:49:57.241741 kernel: BTRFS info (device sda6): using free space tree
Mar 25 01:49:57.248702 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 25 01:49:57.248748 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 01:49:57.254479 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Mar 25 01:49:57.288588 ignition[958]: INFO : Ignition 2.20.0
Mar 25 01:49:57.288588 ignition[958]: INFO : Stage: files
Mar 25 01:49:57.290984 ignition[958]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:49:57.290984 ignition[958]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:49:57.293872 ignition[958]: DEBUG : files: compiled without relabeling support, skipping
Mar 25 01:49:57.293872 ignition[958]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 25 01:49:57.293872 ignition[958]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 25 01:49:57.299052 ignition[958]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 25 01:49:57.300680 ignition[958]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 25 01:49:57.300680 ignition[958]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 25 01:49:57.300076 unknown[958]: wrote ssh authorized keys file for user: core
Mar 25 01:49:57.305436 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 25 01:49:57.305436 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 25 01:49:57.516897 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 25 01:49:58.111538 systemd-networkd[772]: eth0: Gained IPv6LL
Mar 25 01:49:58.240185 systemd-networkd[772]: eth1: Gained IPv6LL
Mar 25 01:49:58.665888 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 25 01:49:58.667468 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Mar 25 01:49:59.323519 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 25 01:49:59.440140 ignition[958]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Mar 25 01:49:59.440140 ignition[958]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:49:59.443433 ignition[958]: INFO : files: files passed
Mar 25 01:49:59.443433 ignition[958]: INFO : Ignition finished successfully
Mar 25 01:49:59.443627 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 25 01:49:59.448468 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 25 01:49:59.450229 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 25 01:49:59.463602 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 25 01:49:59.463687 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 25 01:49:59.468450 initrd-setup-root-after-ignition[987]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:49:59.468450 initrd-setup-root-after-ignition[987]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:49:59.469962 initrd-setup-root-after-ignition[991]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:49:59.469828 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 25 01:49:59.470584 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 25 01:49:59.472446 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 25 01:49:59.500577 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 25 01:49:59.500677 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 25 01:49:59.501441 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 25 01:49:59.502201 systemd[1]: Reached target initrd.target - Initrd Default Target.
Mar 25 01:49:59.503264 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Mar 25 01:49:59.504966 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Mar 25 01:49:59.519014 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 25 01:49:59.520388 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Mar 25 01:49:59.532062 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:49:59.533210 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:49:59.534364 systemd[1]: Stopped target timers.target - Timer Units.
Mar 25 01:49:59.534900 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Mar 25 01:49:59.535003 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Mar 25 01:49:59.536033 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Mar 25 01:49:59.536641 systemd[1]: Stopped target basic.target - Basic System.
Mar 25 01:49:59.537630 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Mar 25 01:49:59.538499 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Mar 25 01:49:59.539402 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Mar 25 01:49:59.540407 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Mar 25 01:49:59.541443 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 25 01:49:59.542476 systemd[1]: Stopped target sysinit.target - System Initialization.
Mar 25 01:49:59.543661 systemd[1]: Stopped target local-fs.target - Local File Systems.
Mar 25 01:49:59.544720 systemd[1]: Stopped target swap.target - Swaps.
Mar 25 01:49:59.545682 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Mar 25 01:49:59.545805 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Mar 25 01:49:59.546929 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:49:59.547756 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:49:59.548889 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Mar 25 01:49:59.549307 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:49:59.550239 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Mar 25 01:49:59.550376 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Mar 25 01:49:59.552572 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Mar 25 01:49:59.552731 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 25 01:49:59.553897 systemd[1]: ignition-files.service: Deactivated successfully.
Mar 25 01:49:59.554049 systemd[1]: Stopped ignition-files.service - Ignition (files).
Mar 25 01:49:59.555109 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Mar 25 01:49:59.555226 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Mar 25 01:49:59.558515 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Mar 25 01:49:59.561189 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Mar 25 01:49:59.562643 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Mar 25 01:49:59.562750 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:49:59.564594 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Mar 25 01:49:59.564710 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 25 01:49:59.570652 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Mar 25 01:49:59.570729 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Mar 25 01:49:59.579078 ignition[1011]: INFO : Ignition 2.20.0
Mar 25 01:49:59.579078 ignition[1011]: INFO : Stage: umount
Mar 25 01:49:59.579078 ignition[1011]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:49:59.579078 ignition[1011]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:49:59.583128 ignition[1011]: INFO : umount: umount passed
Mar 25 01:49:59.583128 ignition[1011]: INFO : Ignition finished successfully
Mar 25 01:49:59.580321 systemd[1]: ignition-mount.service: Deactivated successfully.
Mar 25 01:49:59.580426 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Mar 25 01:49:59.582288 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Mar 25 01:49:59.582680 systemd[1]: ignition-disks.service: Deactivated successfully.
Mar 25 01:49:59.582714 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Mar 25 01:49:59.587753 systemd[1]: ignition-kargs.service: Deactivated successfully.
Mar 25 01:49:59.587789 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Mar 25 01:49:59.588609 systemd[1]: ignition-fetch.service: Deactivated successfully.
Mar 25 01:49:59.588654 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Mar 25 01:49:59.589504 systemd[1]: Stopped target network.target - Network.
Mar 25 01:49:59.590280 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Mar 25 01:49:59.590316 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 25 01:49:59.591175 systemd[1]: Stopped target paths.target - Path Units.
Mar 25 01:49:59.592055 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Mar 25 01:49:59.593390 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:49:59.594334 systemd[1]: Stopped target slices.target - Slice Units.
Mar 25 01:49:59.595130 systemd[1]: Stopped target sockets.target - Socket Units.
Mar 25 01:49:59.596108 systemd[1]: iscsid.socket: Deactivated successfully.
Mar 25 01:49:59.596133 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Mar 25 01:49:59.597078 systemd[1]: iscsiuio.socket: Deactivated successfully.
Mar 25 01:49:59.597102 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 25 01:49:59.597894 systemd[1]: ignition-setup.service: Deactivated successfully.
Mar 25 01:49:59.597926 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Mar 25 01:49:59.598852 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Mar 25 01:49:59.598882 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Mar 25 01:49:59.599989 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Mar 25 01:49:59.600824 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Mar 25 01:49:59.602748 systemd[1]: sysroot-boot.service: Deactivated successfully.
Mar 25 01:49:59.602812 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Mar 25 01:49:59.603664 systemd[1]: systemd-resolved.service: Deactivated successfully.
Mar 25 01:49:59.603725 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Mar 25 01:49:59.606206 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Mar 25 01:49:59.606778 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Mar 25 01:49:59.606831 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Mar 25 01:49:59.607966 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Mar 25 01:49:59.608001 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:49:59.610124 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Mar 25 01:49:59.610305 systemd[1]: systemd-networkd.service: Deactivated successfully.
Mar 25 01:49:59.610417 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Mar 25 01:49:59.611932 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Mar 25 01:49:59.612250 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Mar 25 01:49:59.612288 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:49:59.614437 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Mar 25 01:49:59.615184 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Mar 25 01:49:59.615222 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 25 01:49:59.616584 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Mar 25 01:49:59.616627 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:49:59.621275 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Mar 25 01:49:59.621309 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:49:59.622131 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:49:59.624364 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Mar 25 01:49:59.634752 systemd[1]: systemd-udevd.service: Deactivated successfully.
Mar 25 01:49:59.635158 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:49:59.636051 systemd[1]: network-cleanup.service: Deactivated successfully.
Mar 25 01:49:59.636111 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Mar 25 01:49:59.637213 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Mar 25 01:49:59.637256 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:49:59.638166 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Mar 25 01:49:59.638188 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:49:59.639109 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Mar 25 01:49:59.639141 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Mar 25 01:49:59.640523 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Mar 25 01:49:59.640555 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Mar 25 01:49:59.641516 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 25 01:49:59.641549 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:49:59.643416 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Mar 25 01:49:59.644034 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Mar 25 01:49:59.644072 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:49:59.646586 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Mar 25 01:49:59.646632 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 25 01:49:59.647594 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Mar 25 01:49:59.647642 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:49:59.648663 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:49:59.648695 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:49:59.654319 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Mar 25 01:49:59.654452 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Mar 25 01:49:59.655520 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Mar 25 01:49:59.657466 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Mar 25 01:49:59.673434 systemd[1]: Switching root.
Mar 25 01:49:59.717440 systemd-journald[188]: Journal stopped
Mar 25 01:50:00.505438 systemd-journald[188]: Received SIGTERM from PID 1 (systemd).
Mar 25 01:50:00.505485 kernel: SELinux: policy capability network_peer_controls=1
Mar 25 01:50:00.505500 kernel: SELinux: policy capability open_perms=1
Mar 25 01:50:00.505508 kernel: SELinux: policy capability extended_socket_class=1
Mar 25 01:50:00.505516 kernel: SELinux: policy capability always_check_network=0
Mar 25 01:50:00.505524 kernel: SELinux: policy capability cgroup_seclabel=1
Mar 25 01:50:00.505537 kernel: SELinux: policy capability nnp_nosuid_transition=1
Mar 25 01:50:00.505545 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Mar 25 01:50:00.505553 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Mar 25 01:50:00.505561 kernel: audit: type=1403 audit(1742867399.828:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Mar 25 01:50:00.505570 systemd[1]: Successfully loaded SELinux policy in 39.512ms.
Mar 25 01:50:00.505586 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.870ms.
Mar 25 01:50:00.505595 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 25 01:50:00.505605 systemd[1]: Detected virtualization kvm.
Mar 25 01:50:00.505615 systemd[1]: Detected architecture x86-64.
Mar 25 01:50:00.505636 systemd[1]: Detected first boot.
Mar 25 01:50:00.505646 systemd[1]: Hostname set to .
Mar 25 01:50:00.505659 systemd[1]: Initializing machine ID from VM UUID.
Mar 25 01:50:00.505669 zram_generator::config[1055]: No configuration found.
Mar 25 01:50:00.505678 kernel: Guest personality initialized and is inactive
Mar 25 01:50:00.505688 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Mar 25 01:50:00.505696 kernel: Initialized host personality
Mar 25 01:50:00.505705 kernel: NET: Registered PF_VSOCK protocol family
Mar 25 01:50:00.505714 systemd[1]: Populated /etc with preset unit settings.
Mar 25 01:50:00.505723 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Mar 25 01:50:00.505732 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Mar 25 01:50:00.505746 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Mar 25 01:50:00.505762 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Mar 25 01:50:00.505780 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Mar 25 01:50:00.505800 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Mar 25 01:50:00.505815 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Mar 25 01:50:00.505827 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Mar 25 01:50:00.505837 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Mar 25 01:50:00.505845 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Mar 25 01:50:00.505854 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Mar 25 01:50:00.505863 systemd[1]: Created slice user.slice - User and Session Slice.
Mar 25 01:50:00.505872 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:50:00.505881 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:50:00.505890 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Mar 25 01:50:00.505899 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Mar 25 01:50:00.505909 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Mar 25 01:50:00.505920 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 25 01:50:00.505929 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Mar 25 01:50:00.505938 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:50:00.505946 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Mar 25 01:50:00.505955 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Mar 25 01:50:00.505965 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Mar 25 01:50:00.505973 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Mar 25 01:50:00.505982 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:50:00.505990 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 25 01:50:00.505999 systemd[1]: Reached target slices.target - Slice Units.
Mar 25 01:50:00.506008 systemd[1]: Reached target swap.target - Swaps.
Mar 25 01:50:00.506017 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Mar 25 01:50:00.506025 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Mar 25 01:50:00.506034 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Mar 25 01:50:00.506044 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:50:00.506056 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:50:00.506065 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:50:00.506074 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Mar 25 01:50:00.506082 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Mar 25 01:50:00.506091 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Mar 25 01:50:00.506101 systemd[1]: Mounting media.mount - External Media Directory...
Mar 25 01:50:00.506112 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:50:00.506121 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Mar 25 01:50:00.506129 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Mar 25 01:50:00.506138 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Mar 25 01:50:00.506148 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Mar 25 01:50:00.506157 systemd[1]: Reached target machines.target - Containers.
Mar 25 01:50:00.506166 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Mar 25 01:50:00.506176 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:50:00.506185 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 25 01:50:00.506194 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Mar 25 01:50:00.506202 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:50:00.506211 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 01:50:00.506220 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:50:00.506229 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 25 01:50:00.506237 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:50:00.506246 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 25 01:50:00.506263 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 25 01:50:00.506280 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 25 01:50:00.506294 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 25 01:50:00.506308 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 25 01:50:00.508833 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:50:00.508853 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 25 01:50:00.508864 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 25 01:50:00.508873 kernel: loop: module loaded
Mar 25 01:50:00.508886 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 25 01:50:00.508895 kernel: fuse: init (API version 7.39)
Mar 25 01:50:00.508903 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 25 01:50:00.508912 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 25 01:50:00.508921 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 25 01:50:00.508930 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 25 01:50:00.508939 systemd[1]: Stopped verity-setup.service.
Mar 25 01:50:00.508950 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:50:00.508959 kernel: ACPI: bus type drm_connector registered
Mar 25 01:50:00.508968 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 25 01:50:00.508977 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 25 01:50:00.508986 systemd[1]: Mounted media.mount - External Media Directory.
Mar 25 01:50:00.509010 systemd-journald[1139]: Collecting audit messages is disabled.
Mar 25 01:50:00.509030 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 25 01:50:00.509040 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 25 01:50:00.509049 systemd-journald[1139]: Journal started
Mar 25 01:50:00.509080 systemd-journald[1139]: Runtime Journal (/run/log/journal/6db4939356e646b18162c91546f4cf7b) is 4.7M, max 38.3M, 33.5M free.
Mar 25 01:50:00.268705 systemd[1]: Queued start job for default target multi-user.target.
Mar 25 01:50:00.511398 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 25 01:50:00.511420 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 25 01:50:00.276879 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 25 01:50:00.277196 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 25 01:50:00.512796 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 25 01:50:00.513448 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:50:00.514115 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 25 01:50:00.514222 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 25 01:50:00.514964 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:50:00.515137 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:50:00.515870 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 01:50:00.516035 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 01:50:00.516717 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:50:00.516918 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:50:00.517758 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 25 01:50:00.517917 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 25 01:50:00.518614 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:50:00.518793 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:50:00.519570 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:50:00.520312 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 25 01:50:00.521115 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 25 01:50:00.521972 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 25 01:50:00.529853 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 25 01:50:00.531422 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 25 01:50:00.535389 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 25 01:50:00.536215 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 25 01:50:00.536292 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 01:50:00.537523 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 25 01:50:00.545449 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 25 01:50:00.547482 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 25 01:50:00.549409 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:50:00.550948 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 25 01:50:00.552800 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 25 01:50:00.554918 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:50:00.556006 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 25 01:50:00.556766 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:50:00.558112 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 25 01:50:00.561425 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 25 01:50:00.565106 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 25 01:50:00.566854 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 25 01:50:00.571481 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 25 01:50:00.572314 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 25 01:50:00.577121 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 25 01:50:00.581949 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 25 01:50:00.586121 systemd-journald[1139]: Time spent on flushing to /var/log/journal/6db4939356e646b18162c91546f4cf7b is 37.866ms for 1146 entries.
Mar 25 01:50:00.586121 systemd-journald[1139]: System Journal (/var/log/journal/6db4939356e646b18162c91546f4cf7b) is 8M, max 584.8M, 576.8M free.
Mar 25 01:50:00.640102 systemd-journald[1139]: Received client request to flush runtime journal.
Mar 25 01:50:00.640138 kernel: loop0: detected capacity change from 0 to 109808
Mar 25 01:50:00.586445 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 25 01:50:00.601289 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:50:00.611037 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:50:00.619561 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 25 01:50:00.634600 systemd-tmpfiles[1182]: ACLs are not supported, ignoring.
Mar 25 01:50:00.634610 systemd-tmpfiles[1182]: ACLs are not supported, ignoring.
Mar 25 01:50:00.640381 udevadm[1193]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Mar 25 01:50:00.643060 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 25 01:50:00.645747 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 25 01:50:00.649309 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 25 01:50:00.662763 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 25 01:50:00.665348 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 25 01:50:00.687606 kernel: loop1: detected capacity change from 0 to 151640
Mar 25 01:50:00.698145 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 25 01:50:00.703363 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 25 01:50:00.720221 systemd-tmpfiles[1203]: ACLs are not supported, ignoring.
Mar 25 01:50:00.720678 systemd-tmpfiles[1203]: ACLs are not supported, ignoring.
Mar 25 01:50:00.725226 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:50:00.738396 kernel: loop2: detected capacity change from 0 to 8
Mar 25 01:50:00.758435 kernel: loop3: detected capacity change from 0 to 210664
Mar 25 01:50:00.805966 kernel: loop4: detected capacity change from 0 to 109808
Mar 25 01:50:00.826362 kernel: loop5: detected capacity change from 0 to 151640
Mar 25 01:50:00.849355 kernel: loop6: detected capacity change from 0 to 8
Mar 25 01:50:00.851359 kernel: loop7: detected capacity change from 0 to 210664
Mar 25 01:50:00.875787 (sd-merge)[1209]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 25 01:50:00.876433 (sd-merge)[1209]: Merged extensions into '/usr'.
Mar 25 01:50:00.881484 systemd[1]: Reload requested from client PID 1181 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 25 01:50:00.881789 systemd[1]: Reloading...
Mar 25 01:50:00.950456 zram_generator::config[1240]: No configuration found.
Mar 25 01:50:01.046061 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:50:01.072355 ldconfig[1176]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 25 01:50:01.105032 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 25 01:50:01.105448 systemd[1]: Reloading finished in 223 ms.
Mar 25 01:50:01.117954 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 25 01:50:01.118871 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 25 01:50:01.129423 systemd[1]: Starting ensure-sysext.service...
Mar 25 01:50:01.131477 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
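[Editor's note] The (sd-merge) entries above show systemd-sysext overlaying the 'containerd-flatcar', 'docker-flatcar', 'kubernetes', and 'oem-hetzner' extension images onto /usr, followed by a manager reload. A minimal sketch of inspecting and re-applying that merge on a running host, using only the stock systemd-sysext CLI (the extension names come from this log; the commands assume a systemd-sysext-capable system):

```shell
# List the extension images systemd-sysext has discovered
# (searched under /var/lib/extensions, /usr/lib/extensions, etc.).
systemd-sysext list

# Show which hierarchies (/usr, /opt) currently have extensions merged,
# mirroring the "Merged extensions into '/usr'" message in the log.
systemd-sysext status

# Re-merge after adding or removing an extension image.
systemd-sysext refresh
```

Each loopN "detected capacity change" kernel line above corresponds to one extension image being attached as a loop device before the merge.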
Mar 25 01:50:01.144045 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 25 01:50:01.148423 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:50:01.151795 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 25 01:50:01.152182 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 25 01:50:01.152406 systemd[1]: Reload requested from client PID 1280 ('systemctl') (unit ensure-sysext.service)...
Mar 25 01:50:01.152423 systemd[1]: Reloading...
Mar 25 01:50:01.152815 systemd-tmpfiles[1281]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 25 01:50:01.152992 systemd-tmpfiles[1281]: ACLs are not supported, ignoring.
Mar 25 01:50:01.153032 systemd-tmpfiles[1281]: ACLs are not supported, ignoring.
Mar 25 01:50:01.156206 systemd-tmpfiles[1281]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 01:50:01.156300 systemd-tmpfiles[1281]: Skipping /boot
Mar 25 01:50:01.169416 systemd-tmpfiles[1281]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 01:50:01.169424 systemd-tmpfiles[1281]: Skipping /boot
Mar 25 01:50:01.190307 systemd-udevd[1284]: Using default interface naming scheme 'v255'.
Mar 25 01:50:01.210354 zram_generator::config[1310]: No configuration found.
Mar 25 01:50:01.296351 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1324)
Mar 25 01:50:01.340931 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:50:01.380341 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Mar 25 01:50:01.396335 kernel: mousedev: PS/2 mouse device common for all mice
Mar 25 01:50:01.410337 kernel: ACPI: button: Power Button [PWRF]
Mar 25 01:50:01.418475 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 25 01:50:01.419214 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 25 01:50:01.419552 systemd[1]: Reloading finished in 266 ms.
Mar 25 01:50:01.428246 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:50:01.430352 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:50:01.447541 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input5
Mar 25 01:50:01.454779 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 25 01:50:01.458476 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 25 01:50:01.458597 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 25 01:50:01.472357 kernel: EDAC MC: Ver: 3.0.0
Mar 25 01:50:01.482343 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Mar 25 01:50:01.485364 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Mar 25 01:50:01.488443 kernel: Console: switching to colour dummy device 80x25
Mar 25 01:50:01.489687 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 25 01:50:01.489715 kernel: [drm] features: -context_init
Mar 25 01:50:01.490762 kernel: [drm] number of scanouts: 1
Mar 25 01:50:01.491343 kernel: [drm] number of cap sets: 0
Mar 25 01:50:01.491706 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Mar 25 01:50:01.493426 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Mar 25 01:50:01.500031 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:50:01.502491 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 25 01:50:01.502523 kernel: Console: switching to colour frame buffer device 160x50
Mar 25 01:50:01.503580 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 01:50:01.512737 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 25 01:50:01.523085 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 25 01:50:01.526167 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:50:01.528434 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:50:01.534127 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 01:50:01.535163 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:50:01.541191 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:50:01.541405 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:50:01.543094 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 25 01:50:01.543423 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:50:01.545485 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 25 01:50:01.547348 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:50:01.555278 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:50:01.558739 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 25 01:50:01.561386 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:50:01.561452 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:50:01.564728 systemd[1]: Finished ensure-sysext.service.
Mar 25 01:50:01.570207 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:50:01.570373 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:50:01.572454 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 01:50:01.572573 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 01:50:01.572943 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:50:01.573052 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:50:01.574405 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:50:01.579081 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:50:01.584777 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 25 01:50:01.600103 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 25 01:50:01.603173 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:50:01.603223 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:50:01.605440 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 25 01:50:01.607438 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 25 01:50:01.610038 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:50:01.610182 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:50:01.612717 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Mar 25 01:50:01.616199 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 25 01:50:01.619041 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 25 01:50:01.624181 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:50:01.626600 augenrules[1437]: No rules
Mar 25 01:50:01.627188 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 01:50:01.627400 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 01:50:01.643817 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 25 01:50:01.648659 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 25 01:50:01.652410 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 25 01:50:01.669405 lvm[1446]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:50:01.673648 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 25 01:50:01.690783 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 25 01:50:01.693054 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 25 01:50:01.693862 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:50:01.701215 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 25 01:50:01.703263 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 25 01:50:01.706215 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:50:01.718473 lvm[1460]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:50:01.739621 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 25 01:50:01.770892 systemd-resolved[1405]: Positive Trust Anchors:
Mar 25 01:50:01.771139 systemd-resolved[1405]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:50:01.771210 systemd-resolved[1405]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:50:01.772798 systemd-networkd[1400]: lo: Link UP
Mar 25 01:50:01.773823 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 25 01:50:01.774847 systemd-resolved[1405]: Using system hostname 'ci-4284-0-0-a-abb47662e0'.
Mar 25 01:50:01.775350 systemd-networkd[1400]: lo: Gained carrier
Mar 25 01:50:01.776190 systemd[1]: Reached target time-set.target - System Time Set.
Mar 25 01:50:01.777066 systemd-networkd[1400]: Enumeration completed
Mar 25 01:50:01.777375 systemd-networkd[1400]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:50:01.777379 systemd-networkd[1400]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:50:01.777472 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:50:01.777819 systemd-networkd[1400]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:50:01.777822 systemd-networkd[1400]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:50:01.778147 systemd-networkd[1400]: eth0: Link UP
Mar 25 01:50:01.778150 systemd-networkd[1400]: eth0: Gained carrier
Mar 25 01:50:01.778160 systemd-networkd[1400]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:50:01.779742 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 25 01:50:01.781535 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 25 01:50:01.783124 systemd-networkd[1400]: eth1: Link UP
Mar 25 01:50:01.783128 systemd-networkd[1400]: eth1: Gained carrier
Mar 25 01:50:01.783144 systemd-networkd[1400]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:50:01.784971 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:50:01.785415 systemd[1]: Reached target network.target - Network.
Mar 25 01:50:01.785776 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:50:01.786110 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:50:01.787410 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 25 01:50:01.789034 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
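[Editor's note] Both NICs above are matched by the stock catch-all unit /usr/lib/systemd/network/zz-default.network, which enables DHCP on any otherwise unconfigured interface; the "potentially unpredictable interface name" warning is systemd-networkd noting that kernel eth* names are not stable. A minimal sketch of what such a catch-all unit looks like (illustrative field values, not read from this host):

```ini
# zz-default.network (sketch): lowest-priority catch-all,
# applied only to interfaces no earlier .network file claimed.
[Match]
Name=*

[Network]
DHCP=yes
```

A more specific file sorting earlier (e.g. 00-eth0.network with a static address) would take precedence for the interface it matches.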
Mar 25 01:50:01.792576 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 25 01:50:01.793149 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 25 01:50:01.793610 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 25 01:50:01.794092 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 25 01:50:01.794190 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:50:01.794589 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:50:01.799505 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 25 01:50:01.801528 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 25 01:50:01.805242 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 25 01:50:01.805898 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 25 01:50:01.807033 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 25 01:50:01.809067 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 25 01:50:01.811957 systemd-networkd[1400]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 25 01:50:01.812970 systemd-timesyncd[1429]: Network configuration changed, trying to establish connection.
Mar 25 01:50:01.813655 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 25 01:50:01.816218 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 25 01:50:01.817904 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:50:01.818469 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:50:01.818924 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:50:01.818958 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:50:01.821305 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 25 01:50:01.824436 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 25 01:50:01.828029 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 25 01:50:01.835182 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 25 01:50:01.837444 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 25 01:50:01.839827 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 25 01:50:01.842460 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 25 01:50:01.844602 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 25 01:50:01.848404 systemd-networkd[1400]: eth0: DHCPv4 address 95.217.13.107/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 25 01:50:01.849371 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Mar 25 01:50:01.849571 systemd-timesyncd[1429]: Network configuration changed, trying to establish connection.
Mar 25 01:50:01.853525 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 25 01:50:01.854435 jq[1475]: false
Mar 25 01:50:01.860493 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 25 01:50:01.870419 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 25 01:50:01.871458 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 25 01:50:01.871794 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 25 01:50:01.873819 systemd[1]: Starting update-engine.service - Update Engine...
Mar 25 01:50:01.878505 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 25 01:50:01.883043 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 25 01:50:01.888361 coreos-metadata[1471]: Mar 25 01:50:01.887 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Mar 25 01:50:01.889501 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 25 01:50:01.896245 jq[1485]: true
Mar 25 01:50:01.900576 coreos-metadata[1471]: Mar 25 01:50:01.891 INFO Fetch successful
Mar 25 01:50:01.900576 coreos-metadata[1471]: Mar 25 01:50:01.891 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Mar 25 01:50:01.900576 coreos-metadata[1471]: Mar 25 01:50:01.892 INFO Fetch successful
Mar 25 01:50:01.890517 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 25 01:50:01.892716 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 25 01:50:01.892849 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 25 01:50:01.908025 dbus-daemon[1472]: [system] SELinux support is enabled
Mar 25 01:50:01.911300 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 25 01:50:01.915847 update_engine[1484]: I20250325 01:50:01.915552 1484 main.cc:92] Flatcar Update Engine starting
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found loop4
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found loop5
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found loop6
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found loop7
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda1
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda2
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda3
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found usr
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda4
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda6
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda7
Mar 25 01:50:01.923393 extend-filesystems[1476]: Found sda9
Mar 25 01:50:01.923393 extend-filesystems[1476]: Checking size of /dev/sda9
Mar 25 01:50:01.924883 systemd[1]: motdgen.service: Deactivated successfully.
Mar 25 01:50:01.957041 tar[1490]: linux-amd64/helm
Mar 25 01:50:01.957236 update_engine[1484]: I20250325 01:50:01.932529 1484 update_check_scheduler.cc:74] Next update check in 6m4s
Mar 25 01:50:01.925038 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 25 01:50:01.959375 jq[1503]: true
Mar 25 01:50:01.939388 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 25 01:50:01.939419 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 25 01:50:01.942353 (ntainerd)[1507]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Mar 25 01:50:01.947426 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 25 01:50:01.947446 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Mar 25 01:50:01.961028 systemd[1]: Started update-engine.service - Update Engine. Mar 25 01:50:01.969928 extend-filesystems[1476]: Resized partition /dev/sda9 Mar 25 01:50:01.979402 extend-filesystems[1521]: resize2fs 1.47.2 (1-Jan-2025) Mar 25 01:50:01.982631 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Mar 25 01:50:01.984077 systemd[1]: Started locksmithd.service - Cluster reboot manager. Mar 25 01:50:02.035307 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1330) Mar 25 01:50:02.057180 systemd-logind[1483]: New seat seat0. Mar 25 01:50:02.059719 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Mar 25 01:50:02.061869 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Mar 25 01:50:02.069186 systemd-logind[1483]: Watching system buttons on /dev/input/event2 (Power Button) Mar 25 01:50:02.069201 systemd-logind[1483]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 25 01:50:02.069816 systemd[1]: Started systemd-logind.service - User Login Management. Mar 25 01:50:02.128092 bash[1541]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:50:02.128274 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Mar 25 01:50:02.135393 systemd[1]: Starting sshkeys.service... 
Mar 25 01:50:02.165076 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Mar 25 01:50:02.170104 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Mar 25 01:50:02.174045 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Mar 25 01:50:02.178999 locksmithd[1520]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 25 01:50:02.188622 extend-filesystems[1521]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Mar 25 01:50:02.188622 extend-filesystems[1521]: old_desc_blocks = 1, new_desc_blocks = 5 Mar 25 01:50:02.188622 extend-filesystems[1521]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Mar 25 01:50:02.195348 extend-filesystems[1476]: Resized filesystem in /dev/sda9 Mar 25 01:50:02.195348 extend-filesystems[1476]: Found sr0 Mar 25 01:50:02.190089 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 25 01:50:02.190268 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Mar 25 01:50:02.234566 coreos-metadata[1556]: Mar 25 01:50:02.234 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Mar 25 01:50:02.235792 coreos-metadata[1556]: Mar 25 01:50:02.235 INFO Fetch successful Mar 25 01:50:02.236733 unknown[1556]: wrote ssh authorized keys file for user: core Mar 25 01:50:02.239884 containerd[1507]: time="2025-03-25T01:50:02Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Mar 25 01:50:02.239884 containerd[1507]: time="2025-03-25T01:50:02.239110322Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 Mar 25 01:50:02.254664 containerd[1507]: time="2025-03-25T01:50:02.254622489Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.601µs" Mar 25 01:50:02.254967 containerd[1507]: time="2025-03-25T01:50:02.254949392Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Mar 25 01:50:02.255104 containerd[1507]: time="2025-03-25T01:50:02.255090036Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Mar 25 01:50:02.256176 containerd[1507]: time="2025-03-25T01:50:02.256160022Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Mar 25 01:50:02.256259 containerd[1507]: time="2025-03-25T01:50:02.256246294Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Mar 25 01:50:02.256315 containerd[1507]: time="2025-03-25T01:50:02.256304052Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:50:02.256835 containerd[1507]: time="2025-03-25T01:50:02.256817956Z" level=info msg="skip loading plugin" 
error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Mar 25 01:50:02.256882 containerd[1507]: time="2025-03-25T01:50:02.256872538Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:50:02.257750 containerd[1507]: time="2025-03-25T01:50:02.257689209Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Mar 25 01:50:02.257750 containerd[1507]: time="2025-03-25T01:50:02.257709267Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:50:02.257750 containerd[1507]: time="2025-03-25T01:50:02.257719105Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Mar 25 01:50:02.257750 containerd[1507]: time="2025-03-25T01:50:02.257725938Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Mar 25 01:50:02.258798 containerd[1507]: time="2025-03-25T01:50:02.258782359Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Mar 25 01:50:02.261862 update-ssh-keys[1562]: Updated "/home/core/.ssh/authorized_keys" Mar 25 01:50:02.259984 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
Mar 25 01:50:02.262531 containerd[1507]: time="2025-03-25T01:50:02.262452381Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:50:02.262531 containerd[1507]: time="2025-03-25T01:50:02.262498377Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Mar 25 01:50:02.262531 containerd[1507]: time="2025-03-25T01:50:02.262508646Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Mar 25 01:50:02.262875 containerd[1507]: time="2025-03-25T01:50:02.262858702Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Mar 25 01:50:02.263364 systemd[1]: Finished sshkeys.service. Mar 25 01:50:02.268346 containerd[1507]: time="2025-03-25T01:50:02.267899525Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Mar 25 01:50:02.268346 containerd[1507]: time="2025-03-25T01:50:02.267977170Z" level=info msg="metadata content store policy set" policy=shared Mar 25 01:50:02.277463 containerd[1507]: time="2025-03-25T01:50:02.277433923Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Mar 25 01:50:02.277586 containerd[1507]: time="2025-03-25T01:50:02.277573013Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279387255Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279413996Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279426248Z" 
level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279435646Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279456736Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279467496Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279476283Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279484859Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279492483Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279501469Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279615133Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279633137Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279657392Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Mar 25 01:50:02.281180 containerd[1507]: time="2025-03-25T01:50:02.279666469Z" 
level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279674955Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279684143Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279692708Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279701114Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279709560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279719208Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279727403Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279780112Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279790542Z" level=info msg="Start snapshots syncer" Mar 25 01:50:02.281429 containerd[1507]: time="2025-03-25T01:50:02.279811281Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Mar 25 01:50:02.281593 containerd[1507]: time="2025-03-25T01:50:02.279994584Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Mar 25 01:50:02.281593 containerd[1507]: time="2025-03-25T01:50:02.280034259Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280084894Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280153081Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280170885Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280179611Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280187195Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280197004Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280205219Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280213565Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280231018Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280240466Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280247909Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280270442Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280281602Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Mar 25 01:50:02.281702 containerd[1507]: time="2025-03-25T01:50:02.280288104Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280295569Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280301269Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280311899Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280319794Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280350432Z" level=info msg="runtime interface created" Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280355201Z" level=info msg="created NRI interface" Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280364558Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280375218Z" level=info msg="Connect containerd service" Mar 25 01:50:02.281895 containerd[1507]: time="2025-03-25T01:50:02.280395036Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Mar 25 01:50:02.281895 
containerd[1507]: time="2025-03-25T01:50:02.280904671Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 25 01:50:02.417987 containerd[1507]: time="2025-03-25T01:50:02.417907816Z" level=info msg="Start subscribing containerd event" Mar 25 01:50:02.423343 containerd[1507]: time="2025-03-25T01:50:02.421464865Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 25 01:50:02.423343 containerd[1507]: time="2025-03-25T01:50:02.421509178Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 25 01:50:02.425485 containerd[1507]: time="2025-03-25T01:50:02.425456350Z" level=info msg="Start recovering state" Mar 25 01:50:02.425628 containerd[1507]: time="2025-03-25T01:50:02.425615839Z" level=info msg="Start event monitor" Mar 25 01:50:02.425704 containerd[1507]: time="2025-03-25T01:50:02.425689958Z" level=info msg="Start cni network conf syncer for default" Mar 25 01:50:02.425743 containerd[1507]: time="2025-03-25T01:50:02.425734952Z" level=info msg="Start streaming server" Mar 25 01:50:02.425809 containerd[1507]: time="2025-03-25T01:50:02.425798872Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Mar 25 01:50:02.425870 containerd[1507]: time="2025-03-25T01:50:02.425845710Z" level=info msg="runtime interface starting up..." Mar 25 01:50:02.425919 containerd[1507]: time="2025-03-25T01:50:02.425909149Z" level=info msg="starting plugins..." Mar 25 01:50:02.425965 containerd[1507]: time="2025-03-25T01:50:02.425955846Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Mar 25 01:50:02.426225 systemd[1]: Started containerd.service - containerd container runtime. 
Mar 25 01:50:02.435781 containerd[1507]: time="2025-03-25T01:50:02.433147321Z" level=info msg="containerd successfully booted in 0.194967s" Mar 25 01:50:02.546875 tar[1490]: linux-amd64/LICENSE Mar 25 01:50:02.547066 tar[1490]: linux-amd64/README.md Mar 25 01:50:02.563472 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Mar 25 01:50:02.625708 sshd_keygen[1505]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 25 01:50:02.644196 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Mar 25 01:50:02.646593 systemd[1]: Starting issuegen.service - Generate /run/issue... Mar 25 01:50:02.654917 systemd[1]: issuegen.service: Deactivated successfully. Mar 25 01:50:02.655055 systemd[1]: Finished issuegen.service - Generate /run/issue. Mar 25 01:50:02.657667 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Mar 25 01:50:02.668308 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Mar 25 01:50:02.670518 systemd[1]: Started getty@tty1.service - Getty on tty1. Mar 25 01:50:02.673884 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Mar 25 01:50:02.674581 systemd[1]: Reached target getty.target - Login Prompts. Mar 25 01:50:03.487499 systemd-networkd[1400]: eth0: Gained IPv6LL Mar 25 01:50:03.488259 systemd-timesyncd[1429]: Network configuration changed, trying to establish connection. Mar 25 01:50:03.490314 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Mar 25 01:50:03.491847 systemd[1]: Reached target network-online.target - Network is Online. Mar 25 01:50:03.500947 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:50:03.505524 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Mar 25 01:50:03.529638 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Mar 25 01:50:03.615490 systemd-networkd[1400]: eth1: Gained IPv6LL Mar 25 01:50:03.615983 systemd-timesyncd[1429]: Network configuration changed, trying to establish connection. Mar 25 01:50:04.242119 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:50:04.246771 systemd[1]: Reached target multi-user.target - Multi-User System. Mar 25 01:50:04.247093 (kubelet)[1616]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:50:04.250705 systemd[1]: Startup finished in 1.140s (kernel) + 6.161s (initrd) + 4.460s (userspace) = 11.762s. Mar 25 01:50:04.787406 kubelet[1616]: E0325 01:50:04.787300 1616 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:50:04.789452 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:50:04.789571 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:50:04.789864 systemd[1]: kubelet.service: Consumed 770ms CPU time, 244.8M memory peak. Mar 25 01:50:14.837578 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 25 01:50:14.840147 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:50:14.960516 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:50:14.970605 (kubelet)[1635]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:50:15.013130 kubelet[1635]: E0325 01:50:15.013071 1635 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:50:15.016794 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:50:15.016967 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:50:15.017217 systemd[1]: kubelet.service: Consumed 142ms CPU time, 97.8M memory peak. Mar 25 01:50:25.087178 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 25 01:50:25.088949 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:50:25.195375 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:50:25.200565 (kubelet)[1651]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:50:25.238817 kubelet[1651]: E0325 01:50:25.238779 1651 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:50:25.240347 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:50:25.240481 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:50:25.240801 systemd[1]: kubelet.service: Consumed 111ms CPU time, 98.7M memory peak. 
Mar 25 01:50:34.066868 systemd-timesyncd[1429]: Contacted time server 167.235.69.67:123 (2.flatcar.pool.ntp.org). Mar 25 01:50:34.066983 systemd-timesyncd[1429]: Initial clock synchronization to Tue 2025-03-25 01:50:33.868794 UTC. Mar 25 01:50:35.337088 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 25 01:50:35.338681 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:50:35.425997 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:50:35.428466 (kubelet)[1668]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:50:35.456899 kubelet[1668]: E0325 01:50:35.456848 1668 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:50:35.459062 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:50:35.459180 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:50:35.459429 systemd[1]: kubelet.service: Consumed 97ms CPU time, 95.7M memory peak. Mar 25 01:50:42.122727 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Mar 25 01:50:42.124643 systemd[1]: Started sshd@0-95.217.13.107:22-154.117.199.5:19802.service - OpenSSH per-connection server daemon (154.117.199.5:19802). Mar 25 01:50:42.720169 sshd[1677]: Connection closed by 154.117.199.5 port 19802 [preauth] Mar 25 01:50:42.721535 systemd[1]: sshd@0-95.217.13.107:22-154.117.199.5:19802.service: Deactivated successfully. Mar 25 01:50:45.587284 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Mar 25 01:50:45.588966 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:50:45.684913 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:50:45.689895 (kubelet)[1689]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:50:45.748015 kubelet[1689]: E0325 01:50:45.747967 1689 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:50:45.749758 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:50:45.749897 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:50:45.750343 systemd[1]: kubelet.service: Consumed 140ms CPU time, 98.4M memory peak. Mar 25 01:50:47.533573 update_engine[1484]: I20250325 01:50:47.533401 1484 update_attempter.cc:509] Updating boot flags... Mar 25 01:50:47.586390 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1706) Mar 25 01:50:47.636061 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1709) Mar 25 01:50:55.837122 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Mar 25 01:50:55.839251 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:50:55.927934 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:50:55.933611 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:50:55.967995 kubelet[1723]: E0325 01:50:55.967930 1723 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:50:55.970105 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:50:55.970220 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:50:55.970571 systemd[1]: kubelet.service: Consumed 108ms CPU time, 99.8M memory peak. Mar 25 01:51:06.087485 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Mar 25 01:51:06.089760 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:51:06.209222 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:51:06.219531 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:51:06.248744 kubelet[1739]: E0325 01:51:06.248692 1739 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:51:06.250501 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:51:06.250639 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:51:06.251266 systemd[1]: kubelet.service: Consumed 116ms CPU time, 95.9M memory peak. 
Mar 25 01:51:16.337598 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. Mar 25 01:51:16.339781 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:51:16.446267 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:51:16.449239 (kubelet)[1755]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:51:16.481050 kubelet[1755]: E0325 01:51:16.480996 1755 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:51:16.483177 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:51:16.483294 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:51:16.483532 systemd[1]: kubelet.service: Consumed 113ms CPU time, 97.4M memory peak. Mar 25 01:51:26.587111 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. Mar 25 01:51:26.588571 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:51:26.682135 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Mar 25 01:51:26.687660 (kubelet)[1771]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:51:26.719453 kubelet[1771]: E0325 01:51:26.719399 1771 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:51:26.721604 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:51:26.721725 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:51:26.721973 systemd[1]: kubelet.service: Consumed 107ms CPU time, 97.7M memory peak.
Mar 25 01:51:36.837037 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
Mar 25 01:51:36.838520 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:51:36.943122 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:51:36.946300 (kubelet)[1787]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:51:36.980256 kubelet[1787]: E0325 01:51:36.980197 1787 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:51:36.982552 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:51:36.982676 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:51:36.982924 systemd[1]: kubelet.service: Consumed 115ms CPU time, 97.8M memory peak.
Mar 25 01:51:47.087189 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10.
Mar 25 01:51:47.088741 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:51:47.206354 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:51:47.225686 (kubelet)[1803]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:51:47.260748 kubelet[1803]: E0325 01:51:47.260696 1803 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:51:47.262559 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:51:47.262684 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:51:47.262943 systemd[1]: kubelet.service: Consumed 115ms CPU time, 95.7M memory peak.
Mar 25 01:51:48.442947 systemd[1]: Started sshd@1-95.217.13.107:22-139.178.68.195:40694.service - OpenSSH per-connection server daemon (139.178.68.195:40694).
Mar 25 01:51:49.453516 sshd[1812]: Accepted publickey for core from 139.178.68.195 port 40694 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:51:49.456959 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:51:49.476075 systemd-logind[1483]: New session 1 of user core.
Mar 25 01:51:49.477008 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 25 01:51:49.480685 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 25 01:51:49.509051 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 25 01:51:49.512844 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 25 01:51:49.525747 (systemd)[1816]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 25 01:51:49.529552 systemd-logind[1483]: New session c1 of user core.
Mar 25 01:51:49.675705 systemd[1816]: Queued start job for default target default.target.
Mar 25 01:51:49.685091 systemd[1816]: Created slice app.slice - User Application Slice.
Mar 25 01:51:49.685230 systemd[1816]: Reached target paths.target - Paths.
Mar 25 01:51:49.685273 systemd[1816]: Reached target timers.target - Timers.
Mar 25 01:51:49.686352 systemd[1816]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 25 01:51:49.706665 systemd[1816]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 25 01:51:49.706864 systemd[1816]: Reached target sockets.target - Sockets.
Mar 25 01:51:49.706900 systemd[1816]: Reached target basic.target - Basic System.
Mar 25 01:51:49.706928 systemd[1816]: Reached target default.target - Main User Target.
Mar 25 01:51:49.706948 systemd[1816]: Startup finished in 169ms.
Mar 25 01:51:49.707258 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 25 01:51:49.720664 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 25 01:51:50.420379 systemd[1]: Started sshd@2-95.217.13.107:22-139.178.68.195:40708.service - OpenSSH per-connection server daemon (139.178.68.195:40708).
Mar 25 01:51:51.397895 sshd[1827]: Accepted publickey for core from 139.178.68.195 port 40708 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:51:51.399502 sshd-session[1827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:51:51.404346 systemd-logind[1483]: New session 2 of user core.
Mar 25 01:51:51.413489 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 25 01:51:52.078491 sshd[1829]: Connection closed by 139.178.68.195 port 40708
Mar 25 01:51:52.079229 sshd-session[1827]: pam_unix(sshd:session): session closed for user core
Mar 25 01:51:52.082490 systemd[1]: sshd@2-95.217.13.107:22-139.178.68.195:40708.service: Deactivated successfully.
Mar 25 01:51:52.084114 systemd[1]: session-2.scope: Deactivated successfully.
Mar 25 01:51:52.085765 systemd-logind[1483]: Session 2 logged out. Waiting for processes to exit.
Mar 25 01:51:52.086954 systemd-logind[1483]: Removed session 2.
Mar 25 01:51:52.246221 systemd[1]: Started sshd@3-95.217.13.107:22-139.178.68.195:40724.service - OpenSSH per-connection server daemon (139.178.68.195:40724).
Mar 25 01:51:53.227585 sshd[1835]: Accepted publickey for core from 139.178.68.195 port 40724 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:51:53.228928 sshd-session[1835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:51:53.232910 systemd-logind[1483]: New session 3 of user core.
Mar 25 01:51:53.239485 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 25 01:51:53.897060 sshd[1837]: Connection closed by 139.178.68.195 port 40724
Mar 25 01:51:53.898004 sshd-session[1835]: pam_unix(sshd:session): session closed for user core
Mar 25 01:51:53.902995 systemd[1]: sshd@3-95.217.13.107:22-139.178.68.195:40724.service: Deactivated successfully.
Mar 25 01:51:53.906314 systemd[1]: session-3.scope: Deactivated successfully.
Mar 25 01:51:53.908905 systemd-logind[1483]: Session 3 logged out. Waiting for processes to exit.
Mar 25 01:51:53.910691 systemd-logind[1483]: Removed session 3.
Mar 25 01:51:54.064758 systemd[1]: Started sshd@4-95.217.13.107:22-139.178.68.195:40738.service - OpenSSH per-connection server daemon (139.178.68.195:40738).
Mar 25 01:51:55.041742 sshd[1843]: Accepted publickey for core from 139.178.68.195 port 40738 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:51:55.042900 sshd-session[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:51:55.046748 systemd-logind[1483]: New session 4 of user core.
Mar 25 01:51:55.056463 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 25 01:51:55.715661 sshd[1845]: Connection closed by 139.178.68.195 port 40738
Mar 25 01:51:55.716317 sshd-session[1843]: pam_unix(sshd:session): session closed for user core
Mar 25 01:51:55.718731 systemd[1]: sshd@4-95.217.13.107:22-139.178.68.195:40738.service: Deactivated successfully.
Mar 25 01:51:55.720592 systemd[1]: session-4.scope: Deactivated successfully.
Mar 25 01:51:55.721442 systemd-logind[1483]: Session 4 logged out. Waiting for processes to exit.
Mar 25 01:51:55.722312 systemd-logind[1483]: Removed session 4.
Mar 25 01:51:55.883564 systemd[1]: Started sshd@5-95.217.13.107:22-139.178.68.195:42176.service - OpenSSH per-connection server daemon (139.178.68.195:42176).
Mar 25 01:51:56.865639 sshd[1851]: Accepted publickey for core from 139.178.68.195 port 42176 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:51:56.866811 sshd-session[1851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:51:56.871897 systemd-logind[1483]: New session 5 of user core.
Mar 25 01:51:56.877475 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 25 01:51:57.337589 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11.
Mar 25 01:51:57.339724 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:51:57.394169 sudo[1857]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 25 01:51:57.394442 sudo[1857]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:51:57.409603 sudo[1857]: pam_unix(sudo:session): session closed for user root
Mar 25 01:51:57.483388 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:51:57.491585 (kubelet)[1864]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:51:57.523846 kubelet[1864]: E0325 01:51:57.523800 1864 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:51:57.525876 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:51:57.525995 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:51:57.526376 systemd[1]: kubelet.service: Consumed 139ms CPU time, 97.8M memory peak.
Mar 25 01:51:57.566769 sshd[1853]: Connection closed by 139.178.68.195 port 42176
Mar 25 01:51:57.567350 sshd-session[1851]: pam_unix(sshd:session): session closed for user core
Mar 25 01:51:57.569667 systemd[1]: sshd@5-95.217.13.107:22-139.178.68.195:42176.service: Deactivated successfully.
Mar 25 01:51:57.571040 systemd[1]: session-5.scope: Deactivated successfully.
Mar 25 01:51:57.571976 systemd-logind[1483]: Session 5 logged out. Waiting for processes to exit.
Mar 25 01:51:57.572855 systemd-logind[1483]: Removed session 5.
Mar 25 01:51:57.745014 systemd[1]: Started sshd@6-95.217.13.107:22-139.178.68.195:42192.service - OpenSSH per-connection server daemon (139.178.68.195:42192).
Mar 25 01:51:58.760890 sshd[1877]: Accepted publickey for core from 139.178.68.195 port 42192 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:51:58.763272 sshd-session[1877]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:51:58.770153 systemd-logind[1483]: New session 6 of user core.
Mar 25 01:51:58.779568 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 25 01:51:59.289646 sudo[1881]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 25 01:51:59.289911 sudo[1881]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:51:59.293234 sudo[1881]: pam_unix(sudo:session): session closed for user root
Mar 25 01:51:59.298112 sudo[1880]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 25 01:51:59.298482 sudo[1880]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:51:59.308183 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 01:51:59.353830 augenrules[1903]: No rules
Mar 25 01:51:59.354804 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 01:51:59.355081 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 01:51:59.356749 sudo[1880]: pam_unix(sudo:session): session closed for user root
Mar 25 01:51:59.518197 sshd[1879]: Connection closed by 139.178.68.195 port 42192
Mar 25 01:51:59.518723 sshd-session[1877]: pam_unix(sshd:session): session closed for user core
Mar 25 01:51:59.521276 systemd[1]: sshd@6-95.217.13.107:22-139.178.68.195:42192.service: Deactivated successfully.
Mar 25 01:51:59.522751 systemd[1]: session-6.scope: Deactivated successfully.
Mar 25 01:51:59.523946 systemd-logind[1483]: Session 6 logged out. Waiting for processes to exit.
Mar 25 01:51:59.524866 systemd-logind[1483]: Removed session 6.
Mar 25 01:51:59.692697 systemd[1]: Started sshd@7-95.217.13.107:22-139.178.68.195:42208.service - OpenSSH per-connection server daemon (139.178.68.195:42208).
Mar 25 01:52:00.716621 sshd[1912]: Accepted publickey for core from 139.178.68.195 port 42208 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:52:00.718905 sshd-session[1912]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:52:00.727134 systemd-logind[1483]: New session 7 of user core.
Mar 25 01:52:00.734802 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 25 01:52:01.250439 sudo[1915]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 25 01:52:01.250898 sudo[1915]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:52:01.562224 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 25 01:52:01.581620 (dockerd)[1933]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 25 01:52:01.791624 dockerd[1933]: time="2025-03-25T01:52:01.790833598Z" level=info msg="Starting up"
Mar 25 01:52:01.793741 dockerd[1933]: time="2025-03-25T01:52:01.793719127Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 25 01:52:01.842557 dockerd[1933]: time="2025-03-25T01:52:01.842520811Z" level=info msg="Loading containers: start."
Mar 25 01:52:01.971365 kernel: Initializing XFRM netlink socket
Mar 25 01:52:02.040951 systemd-networkd[1400]: docker0: Link UP
Mar 25 01:52:02.080267 dockerd[1933]: time="2025-03-25T01:52:02.080216932Z" level=info msg="Loading containers: done."
Mar 25 01:52:02.093498 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2048011775-merged.mount: Deactivated successfully.
Mar 25 01:52:02.094862 dockerd[1933]: time="2025-03-25T01:52:02.094823249Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 25 01:52:02.094946 dockerd[1933]: time="2025-03-25T01:52:02.094924157Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 25 01:52:02.095031 dockerd[1933]: time="2025-03-25T01:52:02.095010197Z" level=info msg="Daemon has completed initialization"
Mar 25 01:52:02.117667 dockerd[1933]: time="2025-03-25T01:52:02.117340506Z" level=info msg="API listen on /run/docker.sock"
Mar 25 01:52:02.117752 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 25 01:52:03.195277 containerd[1507]: time="2025-03-25T01:52:03.195238394Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\""
Mar 25 01:52:03.765839 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4060782816.mount: Deactivated successfully.
Mar 25 01:52:05.370145 containerd[1507]: time="2025-03-25T01:52:05.370096306Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:05.371014 containerd[1507]: time="2025-03-25T01:52:05.370867443Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.11: active requests=0, bytes read=32674667"
Mar 25 01:52:05.371814 containerd[1507]: time="2025-03-25T01:52:05.371740501Z" level=info msg="ImageCreate event name:\"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:05.373818 containerd[1507]: time="2025-03-25T01:52:05.373786007Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:05.374470 containerd[1507]: time="2025-03-25T01:52:05.374451619Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.11\" with image id \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0\", size \"32671373\" in 2.179179101s"
Mar 25 01:52:05.374658 containerd[1507]: time="2025-03-25T01:52:05.374530205Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\""
Mar 25 01:52:05.388488 containerd[1507]: time="2025-03-25T01:52:05.388452701Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\""
Mar 25 01:52:07.141086 containerd[1507]: time="2025-03-25T01:52:07.141017032Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:07.142033 containerd[1507]: time="2025-03-25T01:52:07.142009923Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.11: active requests=0, bytes read=29619794"
Mar 25 01:52:07.143017 containerd[1507]: time="2025-03-25T01:52:07.142983789Z" level=info msg="ImageCreate event name:\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:07.144937 containerd[1507]: time="2025-03-25T01:52:07.144903379Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:07.145692 containerd[1507]: time="2025-03-25T01:52:07.145579943Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.11\" with image id \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f\", size \"31107380\" in 1.757097475s"
Mar 25 01:52:07.145692 containerd[1507]: time="2025-03-25T01:52:07.145613334Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\""
Mar 25 01:52:07.158462 containerd[1507]: time="2025-03-25T01:52:07.158435772Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\""
Mar 25 01:52:07.587593 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12.
Mar 25 01:52:07.590485 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:52:07.714786 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:52:07.724800 (kubelet)[2220]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:52:07.774703 kubelet[2220]: E0325 01:52:07.774450 2220 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:52:07.776000 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:52:07.776182 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:52:07.776602 systemd[1]: kubelet.service: Consumed 153ms CPU time, 97.6M memory peak.
Mar 25 01:52:08.162813 containerd[1507]: time="2025-03-25T01:52:08.162744376Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:08.163728 containerd[1507]: time="2025-03-25T01:52:08.163674792Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.11: active requests=0, bytes read=17903331"
Mar 25 01:52:08.164541 containerd[1507]: time="2025-03-25T01:52:08.164506022Z" level=info msg="ImageCreate event name:\"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:08.166776 containerd[1507]: time="2025-03-25T01:52:08.166530448Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:08.167227 containerd[1507]: time="2025-03-25T01:52:08.167189357Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.11\" with image id \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5\", size \"19390935\" in 1.008725754s"
Mar 25 01:52:08.167227 containerd[1507]: time="2025-03-25T01:52:08.167222239Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\""
Mar 25 01:52:08.180501 containerd[1507]: time="2025-03-25T01:52:08.180463247Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\""
Mar 25 01:52:09.105024 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1056453006.mount: Deactivated successfully.
Mar 25 01:52:09.378158 containerd[1507]: time="2025-03-25T01:52:09.378022403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:09.379191 containerd[1507]: time="2025-03-25T01:52:09.379142181Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.11: active requests=0, bytes read=29185400"
Mar 25 01:52:09.380447 containerd[1507]: time="2025-03-25T01:52:09.380406340Z" level=info msg="ImageCreate event name:\"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:09.382199 containerd[1507]: time="2025-03-25T01:52:09.382160001Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:09.382921 containerd[1507]: time="2025-03-25T01:52:09.382812539Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.11\" with image id \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\", repo tag \"registry.k8s.io/kube-proxy:v1.30.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55\", size \"29184391\" in 1.202199562s"
Mar 25 01:52:09.382921 containerd[1507]: time="2025-03-25T01:52:09.382852863Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\""
Mar 25 01:52:09.399532 containerd[1507]: time="2025-03-25T01:52:09.399496561Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 25 01:52:09.907945 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2314518123.mount: Deactivated successfully.
Mar 25 01:52:10.681622 containerd[1507]: time="2025-03-25T01:52:10.681572168Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:10.682572 containerd[1507]: time="2025-03-25T01:52:10.682522844Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185843"
Mar 25 01:52:10.683302 containerd[1507]: time="2025-03-25T01:52:10.683259020Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:10.685343 containerd[1507]: time="2025-03-25T01:52:10.685285798Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:10.686100 containerd[1507]: time="2025-03-25T01:52:10.685990185Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.286454792s"
Mar 25 01:52:10.686100 containerd[1507]: time="2025-03-25T01:52:10.686016614Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 25 01:52:10.699424 containerd[1507]: time="2025-03-25T01:52:10.699392343Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Mar 25 01:52:11.158138 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1580486796.mount: Deactivated successfully.
Mar 25 01:52:11.165560 containerd[1507]: time="2025-03-25T01:52:11.165501257Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:11.166211 containerd[1507]: time="2025-03-25T01:52:11.166164054Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322312"
Mar 25 01:52:11.166997 containerd[1507]: time="2025-03-25T01:52:11.166941459Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:11.168800 containerd[1507]: time="2025-03-25T01:52:11.168749069Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:11.169523 containerd[1507]: time="2025-03-25T01:52:11.169393943Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 469.972967ms"
Mar 25 01:52:11.169523 containerd[1507]: time="2025-03-25T01:52:11.169429361Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Mar 25 01:52:11.194593 containerd[1507]: time="2025-03-25T01:52:11.194511299Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Mar 25 01:52:11.806667 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3939494190.mount: Deactivated successfully.
Mar 25 01:52:14.162922 containerd[1507]: time="2025-03-25T01:52:14.162857328Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:14.163896 containerd[1507]: time="2025-03-25T01:52:14.163737665Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238653"
Mar 25 01:52:14.164658 containerd[1507]: time="2025-03-25T01:52:14.164605481Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:14.167574 containerd[1507]: time="2025-03-25T01:52:14.166761737Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:14.167574 containerd[1507]: time="2025-03-25T01:52:14.167457355Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 2.972896524s"
Mar 25 01:52:14.167574 containerd[1507]: time="2025-03-25T01:52:14.167482714Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\""
Mar 25 01:52:16.536437 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:52:16.536557 systemd[1]: kubelet.service: Consumed 153ms CPU time, 97.6M memory peak.
Mar 25 01:52:16.538160 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:52:16.553877 systemd[1]: Reload requested from client PID 2447 ('systemctl') (unit session-7.scope)...
Mar 25 01:52:16.553998 systemd[1]: Reloading...
Mar 25 01:52:16.639340 zram_generator::config[2492]: No configuration found.
Mar 25 01:52:16.719975 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:52:16.801364 systemd[1]: Reloading finished in 246 ms.
Mar 25 01:52:16.854051 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:52:16.855909 systemd[1]: kubelet.service: Deactivated successfully.
Mar 25 01:52:16.856114 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:52:16.856154 systemd[1]: kubelet.service: Consumed 64ms CPU time, 83.7M memory peak.
Mar 25 01:52:16.857503 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:52:16.946768 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:52:16.954828 (kubelet)[2548]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 01:52:16.993689 kubelet[2548]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 01:52:16.993689 kubelet[2548]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 25 01:52:16.993689 kubelet[2548]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 01:52:16.995812 kubelet[2548]: I0325 01:52:16.995760 2548 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 25 01:52:17.399943 kubelet[2548]: I0325 01:52:17.399884 2548 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Mar 25 01:52:17.399943 kubelet[2548]: I0325 01:52:17.399911 2548 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 25 01:52:17.400186 kubelet[2548]: I0325 01:52:17.400091 2548 server.go:927] "Client rotation is on, will bootstrap in background"
Mar 25 01:52:17.420147 kubelet[2548]: I0325 01:52:17.420109 2548 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 01:52:17.425552 kubelet[2548]: E0325 01:52:17.425407 2548 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://95.217.13.107:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.439413 kubelet[2548]: I0325 01:52:17.439372 2548 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 25 01:52:17.441428 kubelet[2548]: I0325 01:52:17.441376 2548 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 25 01:52:17.442623 kubelet[2548]: I0325 01:52:17.441424 2548 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-a-abb47662e0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Mar 25 01:52:17.443011 kubelet[2548]: I0325 01:52:17.442968 2548 topology_manager.go:138] "Creating topology manager with none policy"
Mar 25 01:52:17.443011 kubelet[2548]: I0325 01:52:17.442988 2548 container_manager_linux.go:301] "Creating device plugin manager"
Mar 25 01:52:17.443122 kubelet[2548]: I0325 01:52:17.443079 2548 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 01:52:17.443815 kubelet[2548]: I0325 01:52:17.443797 2548 kubelet.go:400] "Attempting to sync node with API server"
Mar 25 01:52:17.443815 kubelet[2548]: I0325 01:52:17.443815 2548 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 25 01:52:17.445037 kubelet[2548]: I0325 01:52:17.443835 2548 kubelet.go:312] "Adding apiserver pod source"
Mar 25 01:52:17.445037 kubelet[2548]: I0325 01:52:17.443848 2548 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 25 01:52:17.445852 kubelet[2548]: W0325 01:52:17.445822 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://95.217.13.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.446136 kubelet[2548]: E0325 01:52:17.445949 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://95.217.13.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.446136 kubelet[2548]: W0325 01:52:17.446032 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://95.217.13.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-a-abb47662e0&limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.446136 kubelet[2548]: E0325 01:52:17.446059 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://95.217.13.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-a-abb47662e0&limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.446600 kubelet[2548]: I0325 01:52:17.446560 2548 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 25 01:52:17.448025 kubelet[2548]: I0325 01:52:17.448006 2548 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 25 01:52:17.448772 kubelet[2548]: W0325 01:52:17.448103 2548 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 25 01:52:17.448772 kubelet[2548]: I0325 01:52:17.448670 2548 server.go:1264] "Started kubelet"
Mar 25 01:52:17.456510 kubelet[2548]: I0325 01:52:17.456496 2548 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 25 01:52:17.462865 kubelet[2548]: I0325 01:52:17.462745 2548 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 25 01:52:17.464341 kubelet[2548]: E0325 01:52:17.462846 2548 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://95.217.13.107:6443/api/v1/namespaces/default/events\": dial tcp 95.217.13.107:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-a-abb47662e0.182fe8ca8c2fd75d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-a-abb47662e0,UID:ci-4284-0-0-a-abb47662e0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-a-abb47662e0,},FirstTimestamp:2025-03-25 01:52:17.448654685 +0000 UTC m=+0.490662341,LastTimestamp:2025-03-25 01:52:17.448654685 +0000 UTC m=+0.490662341,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-a-abb47662e0,}"
Mar 25 01:52:17.464341 kubelet[2548]: I0325 01:52:17.463699 2548 server.go:455] "Adding debug handlers to kubelet server"
Mar 25 01:52:17.464341 kubelet[2548]: I0325 01:52:17.463963 2548 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 25 01:52:17.464341 kubelet[2548]: I0325 01:52:17.464269 2548 volume_manager.go:291] "Starting Kubelet Volume Manager"
Mar 25 01:52:17.466231 kubelet[2548]: I0325 01:52:17.466199 2548 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 25 01:52:17.468078 kubelet[2548]: I0325 01:52:17.468067 2548 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 25 01:52:17.468179 kubelet[2548]: I0325 01:52:17.468168 2548 reconciler.go:26] "Reconciler: start to sync state"
Mar 25 01:52:17.469132 kubelet[2548]: I0325 01:52:17.469095 2548 factory.go:221] Registration of the systemd container factory successfully
Mar 25 01:52:17.469226 kubelet[2548]: I0325 01:52:17.469198 2548 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 25 01:52:17.469727 kubelet[2548]: E0325 01:52:17.469687 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.13.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-a-abb47662e0?timeout=10s\": dial tcp 95.217.13.107:6443: connect: connection refused" interval="200ms"
Mar 25 01:52:17.472758 kubelet[2548]: W0325 01:52:17.472628 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://95.217.13.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.472807 kubelet[2548]: E0325 01:52:17.472759 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://95.217.13.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.473531 kubelet[2548]: I0325 01:52:17.473444 2548 factory.go:221] Registration of the containerd container factory successfully
Mar 25 01:52:17.477464 kubelet[2548]: I0325 01:52:17.477375 2548 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 25 01:52:17.479051 kubelet[2548]: I0325 01:52:17.478671 2548 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 25 01:52:17.479051 kubelet[2548]: I0325 01:52:17.478690 2548 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 25 01:52:17.479051 kubelet[2548]: I0325 01:52:17.478702 2548 kubelet.go:2337] "Starting kubelet main sync loop"
Mar 25 01:52:17.479051 kubelet[2548]: E0325 01:52:17.478729 2548 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 25 01:52:17.481791 kubelet[2548]: E0325 01:52:17.481777 2548 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 25 01:52:17.486112 kubelet[2548]: W0325 01:52:17.486086 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://95.217.13.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.486224 kubelet[2548]: E0325 01:52:17.486214 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://95.217.13.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:17.500217 kubelet[2548]: I0325 01:52:17.500203 2548 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 25 01:52:17.500282 kubelet[2548]: I0325 01:52:17.500272 2548 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 25 01:52:17.500376 kubelet[2548]: I0325 01:52:17.500367 2548 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 01:52:17.502452 kubelet[2548]: I0325 01:52:17.502441 2548 policy_none.go:49] "None policy: Start"
Mar 25 01:52:17.503248 kubelet[2548]: I0325 01:52:17.503221 2548 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 25 01:52:17.503294 kubelet[2548]: I0325 01:52:17.503256 2548 state_mem.go:35] "Initializing new in-memory state store"
Mar 25 01:52:17.514896 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 25 01:52:17.524535 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 25 01:52:17.528801 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 25 01:52:17.535053 kubelet[2548]: I0325 01:52:17.535026 2548 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 25 01:52:17.535498 kubelet[2548]: I0325 01:52:17.535278 2548 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 25 01:52:17.535498 kubelet[2548]: I0325 01:52:17.535410 2548 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 25 01:52:17.538965 kubelet[2548]: E0325 01:52:17.538934 2548 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:17.566893 kubelet[2548]: I0325 01:52:17.566856 2548 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.567299 kubelet[2548]: E0325 01:52:17.567267 2548 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://95.217.13.107:6443/api/v1/nodes\": dial tcp 95.217.13.107:6443: connect: connection refused" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.578926 kubelet[2548]: I0325 01:52:17.578863 2548 topology_manager.go:215] "Topology Admit Handler" podUID="7416dec70bd659a5a030a2a5de8e0c81" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.580579 kubelet[2548]: I0325 01:52:17.580547 2548 topology_manager.go:215] "Topology Admit Handler" podUID="e4f5561ccc95be3eac5c1a720ac3976c" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.582592 kubelet[2548]: I0325 01:52:17.582538 2548 topology_manager.go:215] "Topology Admit Handler" podUID="734bad9a0942d269f439490900de113f" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.592611 systemd[1]: Created slice kubepods-burstable-pod7416dec70bd659a5a030a2a5de8e0c81.slice - libcontainer container kubepods-burstable-pod7416dec70bd659a5a030a2a5de8e0c81.slice.
Mar 25 01:52:17.606750 systemd[1]: Created slice kubepods-burstable-pod734bad9a0942d269f439490900de113f.slice - libcontainer container kubepods-burstable-pod734bad9a0942d269f439490900de113f.slice.
Mar 25 01:52:17.611409 systemd[1]: Created slice kubepods-burstable-pode4f5561ccc95be3eac5c1a720ac3976c.slice - libcontainer container kubepods-burstable-pode4f5561ccc95be3eac5c1a720ac3976c.slice.
Mar 25 01:52:17.670900 kubelet[2548]: E0325 01:52:17.670759 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.13.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-a-abb47662e0?timeout=10s\": dial tcp 95.217.13.107:6443: connect: connection refused" interval="400ms"
Mar 25 01:52:17.769858 kubelet[2548]: I0325 01:52:17.769767 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7416dec70bd659a5a030a2a5de8e0c81-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" (UID: \"7416dec70bd659a5a030a2a5de8e0c81\") " pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770263 kubelet[2548]: I0325 01:52:17.769892 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7416dec70bd659a5a030a2a5de8e0c81-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" (UID: \"7416dec70bd659a5a030a2a5de8e0c81\") " pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770263 kubelet[2548]: I0325 01:52:17.770100 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/734bad9a0942d269f439490900de113f-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-a-abb47662e0\" (UID: \"734bad9a0942d269f439490900de113f\") " pod="kube-system/kube-scheduler-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770263 kubelet[2548]: I0325 01:52:17.770191 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770263 kubelet[2548]: I0325 01:52:17.770224 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770263 kubelet[2548]: I0325 01:52:17.770253 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7416dec70bd659a5a030a2a5de8e0c81-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" (UID: \"7416dec70bd659a5a030a2a5de8e0c81\") " pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770465 kubelet[2548]: I0325 01:52:17.770284 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770465 kubelet[2548]: I0325 01:52:17.770380 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770465 kubelet[2548]: I0325 01:52:17.770411 2548 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.770868 kubelet[2548]: I0325 01:52:17.770827 2548 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.771204 kubelet[2548]: E0325 01:52:17.771180 2548 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://95.217.13.107:6443/api/v1/nodes\": dial tcp 95.217.13.107:6443: connect: connection refused" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:17.908185 containerd[1507]: time="2025-03-25T01:52:17.908116596Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-a-abb47662e0,Uid:7416dec70bd659a5a030a2a5de8e0c81,Namespace:kube-system,Attempt:0,}"
Mar 25 01:52:17.910891 containerd[1507]: time="2025-03-25T01:52:17.910562646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-a-abb47662e0,Uid:734bad9a0942d269f439490900de113f,Namespace:kube-system,Attempt:0,}"
Mar 25 01:52:17.913978 containerd[1507]: time="2025-03-25T01:52:17.913842424Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-a-abb47662e0,Uid:e4f5561ccc95be3eac5c1a720ac3976c,Namespace:kube-system,Attempt:0,}"
Mar 25 01:52:18.071637 kubelet[2548]: E0325 01:52:18.071468 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.13.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-a-abb47662e0?timeout=10s\": dial tcp 95.217.13.107:6443: connect: connection refused" interval="800ms"
Mar 25 01:52:18.173908 kubelet[2548]: I0325 01:52:18.173838 2548 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:18.174198 kubelet[2548]: E0325 01:52:18.174166 2548 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://95.217.13.107:6443/api/v1/nodes\": dial tcp 95.217.13.107:6443: connect: connection refused" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:18.262076 kubelet[2548]: W0325 01:52:18.261965 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://95.217.13.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.262076 kubelet[2548]: E0325 01:52:18.262075 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://95.217.13.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.344955 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3807595310.mount: Deactivated successfully.
Mar 25 01:52:18.356534 containerd[1507]: time="2025-03-25T01:52:18.356447522Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 01:52:18.359267 containerd[1507]: time="2025-03-25T01:52:18.359160576Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160"
Mar 25 01:52:18.365499 containerd[1507]: time="2025-03-25T01:52:18.365434317Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 01:52:18.367077 containerd[1507]: time="2025-03-25T01:52:18.367042008Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 01:52:18.368321 containerd[1507]: time="2025-03-25T01:52:18.368252430Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 01:52:18.369395 containerd[1507]: time="2025-03-25T01:52:18.369272500Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 25 01:52:18.370836 containerd[1507]: time="2025-03-25T01:52:18.370756067Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Mar 25 01:52:18.372485 containerd[1507]: time="2025-03-25T01:52:18.372288726Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Mar 25 01:52:18.376277 containerd[1507]: time="2025-03-25T01:52:18.375386398Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 457.110162ms"
Mar 25 01:52:18.376832 containerd[1507]: time="2025-03-25T01:52:18.376776508Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 464.813262ms"
Mar 25 01:52:18.379383 containerd[1507]: time="2025-03-25T01:52:18.379257333Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 462.287269ms"
Mar 25 01:52:18.467950 kubelet[2548]: W0325 01:52:18.467861 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://95.217.13.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-a-abb47662e0&limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.467950 kubelet[2548]: E0325 01:52:18.467918 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://95.217.13.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-a-abb47662e0&limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.474023 containerd[1507]: time="2025-03-25T01:52:18.473875257Z" level=info msg="connecting to shim b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c" address="unix:///run/containerd/s/b84647a3b3ee4c7d3fac4c94c653f136bcd596a7a8cf7579644f69574baaf7c1" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:52:18.477378 containerd[1507]: time="2025-03-25T01:52:18.477351585Z" level=info msg="connecting to shim 27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035" address="unix:///run/containerd/s/21e8c51615753f15e3a931ee78f875ef1fd245dcc48407508db7302d1f1b1031" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:52:18.478105 containerd[1507]: time="2025-03-25T01:52:18.478081606Z" level=info msg="connecting to shim 6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982" address="unix:///run/containerd/s/c96e7abfe5d88154cb0de11c1014c27792a0a47d985d0be97adff7be8584abb3" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:52:18.531470 systemd[1]: Started cri-containerd-b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c.scope - libcontainer container b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c.
Mar 25 01:52:18.535282 systemd[1]: Started cri-containerd-27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035.scope - libcontainer container 27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035.
Mar 25 01:52:18.537456 systemd[1]: Started cri-containerd-6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982.scope - libcontainer container 6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982.
Mar 25 01:52:18.595927 containerd[1507]: time="2025-03-25T01:52:18.595817833Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-a-abb47662e0,Uid:734bad9a0942d269f439490900de113f,Namespace:kube-system,Attempt:0,} returns sandbox id \"6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982\""
Mar 25 01:52:18.597827 containerd[1507]: time="2025-03-25T01:52:18.597666902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-a-abb47662e0,Uid:e4f5561ccc95be3eac5c1a720ac3976c,Namespace:kube-system,Attempt:0,} returns sandbox id \"b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c\""
Mar 25 01:52:18.602458 containerd[1507]: time="2025-03-25T01:52:18.601788260Z" level=info msg="CreateContainer within sandbox \"6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 25 01:52:18.602934 containerd[1507]: time="2025-03-25T01:52:18.602594246Z" level=info msg="CreateContainer within sandbox \"b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 25 01:52:18.607745 containerd[1507]: time="2025-03-25T01:52:18.607717260Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-a-abb47662e0,Uid:7416dec70bd659a5a030a2a5de8e0c81,Namespace:kube-system,Attempt:0,} returns sandbox id \"27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035\""
Mar 25 01:52:18.610010 containerd[1507]: time="2025-03-25T01:52:18.609726160Z" level=info msg="CreateContainer within sandbox \"27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 25 01:52:18.617443 containerd[1507]: time="2025-03-25T01:52:18.617418235Z" level=info msg="Container fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:52:18.619056 containerd[1507]: time="2025-03-25T01:52:18.618623055Z" level=info msg="Container 3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:52:18.620912 containerd[1507]: time="2025-03-25T01:52:18.620880196Z" level=info msg="Container 4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:52:18.627510 containerd[1507]: time="2025-03-25T01:52:18.627482429Z" level=info msg="CreateContainer within sandbox \"b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441\""
Mar 25 01:52:18.628149 containerd[1507]: time="2025-03-25T01:52:18.628124394Z" level=info msg="StartContainer for \"fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441\""
Mar 25 01:52:18.629664 containerd[1507]: time="2025-03-25T01:52:18.629636455Z" level=info msg="connecting to shim fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441" address="unix:///run/containerd/s/b84647a3b3ee4c7d3fac4c94c653f136bcd596a7a8cf7579644f69574baaf7c1" protocol=ttrpc version=3
Mar 25 01:52:18.633354 containerd[1507]: time="2025-03-25T01:52:18.633221668Z" level=info msg="CreateContainer within sandbox \"6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5\""
Mar 25 01:52:18.633827 containerd[1507]: time="2025-03-25T01:52:18.633809731Z" level=info msg="CreateContainer within sandbox \"27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24\""
Mar 25 01:52:18.634134 containerd[1507]: time="2025-03-25T01:52:18.634084892Z" level=info msg="StartContainer for \"3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5\""
Mar 25 01:52:18.634921 containerd[1507]: time="2025-03-25T01:52:18.634859397Z" level=info msg="connecting to shim 3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5" address="unix:///run/containerd/s/c96e7abfe5d88154cb0de11c1014c27792a0a47d985d0be97adff7be8584abb3" protocol=ttrpc version=3
Mar 25 01:52:18.635080 containerd[1507]: time="2025-03-25T01:52:18.635065917Z" level=info msg="StartContainer for \"4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24\""
Mar 25 01:52:18.636075 containerd[1507]: time="2025-03-25T01:52:18.635748849Z" level=info msg="connecting to shim 4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24" address="unix:///run/containerd/s/21e8c51615753f15e3a931ee78f875ef1fd245dcc48407508db7302d1f1b1031" protocol=ttrpc version=3
Mar 25 01:52:18.650589 systemd[1]: Started cri-containerd-fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441.scope - libcontainer container fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441.
Mar 25 01:52:18.654840 systemd[1]: Started cri-containerd-3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5.scope - libcontainer container 3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5.
Mar 25 01:52:18.656266 systemd[1]: Started cri-containerd-4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24.scope - libcontainer container 4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24.
Mar 25 01:52:18.705057 containerd[1507]: time="2025-03-25T01:52:18.705024962Z" level=info msg="StartContainer for \"fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441\" returns successfully"
Mar 25 01:52:18.738183 containerd[1507]: time="2025-03-25T01:52:18.738075710Z" level=info msg="StartContainer for \"3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5\" returns successfully"
Mar 25 01:52:18.741639 containerd[1507]: time="2025-03-25T01:52:18.741446869Z" level=info msg="StartContainer for \"4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24\" returns successfully"
Mar 25 01:52:18.793894 kubelet[2548]: W0325 01:52:18.793786 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://95.217.13.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.793894 kubelet[2548]: E0325 01:52:18.793853 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://95.217.13.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.858968 kubelet[2548]: W0325 01:52:18.858687 2548 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://95.217.13.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.858968 kubelet[2548]: E0325 01:52:18.858760 2548 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://95.217.13.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 95.217.13.107:6443: connect: connection refused
Mar 25 01:52:18.872388 kubelet[2548]: E0325 01:52:18.871987 2548 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://95.217.13.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-a-abb47662e0?timeout=10s\": dial tcp 95.217.13.107:6443: connect: connection refused" interval="1.6s"
Mar 25 01:52:18.977211 kubelet[2548]: I0325 01:52:18.976926 2548 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:18.977211 kubelet[2548]: E0325 01:52:18.977143 2548 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://95.217.13.107:6443/api/v1/nodes\": dial tcp 95.217.13.107:6443: connect: connection refused" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:20.478575 kubelet[2548]: E0325 01:52:20.478505 2548 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-a-abb47662e0\" not found" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:20.572570 kubelet[2548]: E0325 01:52:20.572501 2548 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4284-0-0-a-abb47662e0" not found
Mar 25 01:52:20.580111 kubelet[2548]: I0325 01:52:20.580066 2548 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:20.597015 kubelet[2548]: I0325 01:52:20.596909 2548 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:20.607573 kubelet[2548]: E0325 01:52:20.607492 2548 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:20.708177 kubelet[2548]: E0325 01:52:20.708127 2548 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:20.809199 kubelet[2548]: E0325 01:52:20.809083 2548 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:20.909714 kubelet[2548]: E0325 01:52:20.909663 2548 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:21.010654 kubelet[2548]: E0325 01:52:21.010613 2548 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:21.111534 kubelet[2548]: E0325 01:52:21.111462 2548 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:21.212467 kubelet[2548]: E0325 01:52:21.212298 2548 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4284-0-0-a-abb47662e0\" not found"
Mar 25 01:52:21.450366 kubelet[2548]: I0325 01:52:21.449160 2548 apiserver.go:52] "Watching apiserver"
Mar 25 01:52:21.468651 kubelet[2548]: I0325 01:52:21.468593 2548 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Mar 25 01:52:22.385562 systemd[1]: Reload requested from client PID 2818 ('systemctl') (unit session-7.scope)...
Mar 25 01:52:22.385581 systemd[1]: Reloading...
Mar 25 01:52:22.486383 zram_generator::config[2875]: No configuration found.
Mar 25 01:52:22.559904 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:52:22.654391 systemd[1]: Reloading finished in 268 ms.
Mar 25 01:52:22.671684 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:52:22.692280 systemd[1]: kubelet.service: Deactivated successfully.
Mar 25 01:52:22.692534 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:52:22.692605 systemd[1]: kubelet.service: Consumed 833ms CPU time, 110.2M memory peak.
Mar 25 01:52:22.694300 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:52:22.801836 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:52:22.811636 (kubelet)[2914]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 01:52:22.856213 kubelet[2914]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 01:52:22.856213 kubelet[2914]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 25 01:52:22.856213 kubelet[2914]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 01:52:22.856213 kubelet[2914]: I0325 01:52:22.855235 2914 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 25 01:52:22.859169 kubelet[2914]: I0325 01:52:22.859127 2914 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Mar 25 01:52:22.859169 kubelet[2914]: I0325 01:52:22.859152 2914 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 25 01:52:22.859397 kubelet[2914]: I0325 01:52:22.859374 2914 server.go:927] "Client rotation is on, will bootstrap in background"
Mar 25 01:52:22.860929 kubelet[2914]: I0325 01:52:22.860903 2914 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 25 01:52:22.862600 kubelet[2914]: I0325 01:52:22.862486 2914 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 01:52:22.868945 kubelet[2914]: I0325 01:52:22.868914 2914 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 25 01:52:22.869104 kubelet[2914]: I0325 01:52:22.869075 2914 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 25 01:52:22.869246 kubelet[2914]: I0325 01:52:22.869099 2914 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-a-abb47662e0","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Mar 25 01:52:22.869344 kubelet[2914]: I0325 01:52:22.869248 2914 topology_manager.go:138] "Creating topology manager with none policy"
Mar 25 01:52:22.869344 kubelet[2914]: I0325 01:52:22.869257 2914 container_manager_linux.go:301] "Creating device plugin manager"
Mar 25 01:52:22.870004 kubelet[2914]: I0325 01:52:22.869986 2914 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 01:52:22.870100 kubelet[2914]: I0325 01:52:22.870079 2914 kubelet.go:400] "Attempting to sync node with API server"
Mar 25 01:52:22.870100 kubelet[2914]: I0325 01:52:22.870094 2914 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 25 01:52:22.870169 kubelet[2914]: I0325 01:52:22.870110 2914 kubelet.go:312] "Adding apiserver pod source"
Mar 25 01:52:22.870169 kubelet[2914]: I0325 01:52:22.870123 2914 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 25 01:52:22.873515 kubelet[2914]: I0325 01:52:22.873404 2914 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 25 01:52:22.873566 kubelet[2914]: I0325 01:52:22.873542 2914 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 25 01:52:22.875637 kubelet[2914]: I0325 01:52:22.875616 2914 server.go:1264] "Started kubelet"
Mar 25 01:52:22.877722 kubelet[2914]: I0325 01:52:22.877622 2914 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 25 01:52:22.887833 kubelet[2914]: I0325 01:52:22.887808 2914 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 25 01:52:22.889138 kubelet[2914]: I0325 01:52:22.889109 2914 server.go:455] "Adding debug handlers to kubelet server"
Mar 25 01:52:22.891070 kubelet[2914]: I0325 01:52:22.889809 2914 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 25 01:52:22.891070 kubelet[2914]: I0325 01:52:22.889968 2914 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 25 01:52:22.891481 kubelet[2914]: I0325 01:52:22.891453 2914 volume_manager.go:291] "Starting Kubelet Volume Manager"
Mar 25 01:52:22.891564 kubelet[2914]: I0325 01:52:22.891547 2914 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Mar 25 01:52:22.891662 kubelet[2914]: I0325 01:52:22.891646 2914 reconciler.go:26] "Reconciler: start to sync state"
Mar 25 01:52:22.893389 kubelet[2914]: E0325 01:52:22.893375 2914 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 25 01:52:22.894640 kubelet[2914]: I0325 01:52:22.894628 2914 factory.go:221] Registration of the systemd container factory successfully
Mar 25 01:52:22.894810 kubelet[2914]: I0325 01:52:22.894796 2914 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 25 01:52:22.896797 kubelet[2914]: I0325 01:52:22.896764 2914 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 25 01:52:22.897193 kubelet[2914]: I0325 01:52:22.897182 2914 factory.go:221] Registration of the containerd container factory successfully
Mar 25 01:52:22.897673 kubelet[2914]: I0325 01:52:22.897631 2914 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 25 01:52:22.897673 kubelet[2914]: I0325 01:52:22.897658 2914 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 25 01:52:22.897673 kubelet[2914]: I0325 01:52:22.897672 2914 kubelet.go:2337] "Starting kubelet main sync loop"
Mar 25 01:52:22.897771 kubelet[2914]: E0325 01:52:22.897704 2914 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 25 01:52:22.933484 kubelet[2914]: I0325 01:52:22.933364 2914 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 25 01:52:22.933484 kubelet[2914]: I0325 01:52:22.933380 2914 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 25 01:52:22.933484 kubelet[2914]: I0325 01:52:22.933394 2914 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 01:52:22.934471 kubelet[2914]: I0325 01:52:22.934421 2914 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 25 01:52:22.934471 kubelet[2914]: I0325 01:52:22.934450 2914 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 25 01:52:22.934471 kubelet[2914]: I0325 01:52:22.934467 2914 policy_none.go:49] "None policy: Start"
Mar 25 01:52:22.935181 kubelet[2914]: I0325 01:52:22.935123 2914 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 25 01:52:22.935181 kubelet[2914]: I0325 01:52:22.935138 2914 state_mem.go:35] "Initializing new in-memory state store"
Mar 25 01:52:22.935321 kubelet[2914]: I0325 01:52:22.935285 2914 state_mem.go:75] "Updated machine memory state"
Mar 25 01:52:22.942429 kubelet[2914]: I0325 01:52:22.942404 2914 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 25 01:52:22.942586 kubelet[2914]: I0325 01:52:22.942547 2914 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 25 01:52:22.942646 kubelet[2914]: I0325 01:52:22.942627 2914 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 25 01:52:22.995013 kubelet[2914]: I0325 01:52:22.994979 2914 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:22.998868 kubelet[2914]: I0325 01:52:22.998205 2914 topology_manager.go:215] "Topology Admit Handler" podUID="7416dec70bd659a5a030a2a5de8e0c81" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:22.998868 kubelet[2914]: I0325 01:52:22.998272 2914 topology_manager.go:215] "Topology Admit Handler" podUID="e4f5561ccc95be3eac5c1a720ac3976c" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:22.998868 kubelet[2914]: I0325 01:52:22.998319 2914 topology_manager.go:215] "Topology Admit Handler" podUID="734bad9a0942d269f439490900de113f" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.007217 kubelet[2914]: E0325 01:52:23.007192 2914 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.010879 kubelet[2914]: I0325 01:52:23.010846 2914 kubelet_node_status.go:112] "Node was previously registered" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.011022 kubelet[2914]: I0325 01:52:23.010901 2914 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192739 kubelet[2914]: I0325 01:52:23.192483 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7416dec70bd659a5a030a2a5de8e0c81-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" (UID: \"7416dec70bd659a5a030a2a5de8e0c81\") " pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192739 kubelet[2914]: I0325 01:52:23.192541 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192739 kubelet[2914]: I0325 01:52:23.192563 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192739 kubelet[2914]: I0325 01:52:23.192584 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/734bad9a0942d269f439490900de113f-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-a-abb47662e0\" (UID: \"734bad9a0942d269f439490900de113f\") " pod="kube-system/kube-scheduler-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192739 kubelet[2914]: I0325 01:52:23.192599 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7416dec70bd659a5a030a2a5de8e0c81-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" (UID: \"7416dec70bd659a5a030a2a5de8e0c81\") " pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192991 kubelet[2914]: I0325 01:52:23.192615 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7416dec70bd659a5a030a2a5de8e0c81-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" (UID: \"7416dec70bd659a5a030a2a5de8e0c81\") " pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192991 kubelet[2914]: I0325 01:52:23.192629 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192991 kubelet[2914]: I0325 01:52:23.192671 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.192991 kubelet[2914]: I0325 01:52:23.192687 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e4f5561ccc95be3eac5c1a720ac3976c-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-a-abb47662e0\" (UID: \"e4f5561ccc95be3eac5c1a720ac3976c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.873320 kubelet[2914]: I0325 01:52:23.873266 2914 apiserver.go:52] "Watching apiserver"
Mar 25 01:52:23.891869 kubelet[2914]: I0325 01:52:23.891834 2914 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Mar 25 01:52:23.935144 kubelet[2914]: E0325 01:52:23.935015 2914 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-a-abb47662e0\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0"
Mar 25 01:52:23.946572 kubelet[2914]: I0325 01:52:23.946319 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-a-abb47662e0" podStartSLOduration=1.946304095 podStartE2EDuration="1.946304095s" podCreationTimestamp="2025-03-25 01:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:52:23.944312191 +0000 UTC m=+1.127534032" watchObservedRunningTime="2025-03-25 01:52:23.946304095 +0000 UTC m=+1.129525945"
Mar 25 01:52:23.960238 kubelet[2914]: I0325 01:52:23.960163 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-a-abb47662e0" podStartSLOduration=2.9601461799999997 podStartE2EDuration="2.96014618s" podCreationTimestamp="2025-03-25 01:52:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:52:23.952639351 +0000 UTC m=+1.135861201" watchObservedRunningTime="2025-03-25 01:52:23.96014618 +0000 UTC m=+1.143368021"
Mar 25 01:52:23.968030 kubelet[2914]: I0325 01:52:23.967913 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-a-abb47662e0" podStartSLOduration=0.967898325 podStartE2EDuration="967.898325ms" podCreationTimestamp="2025-03-25 01:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:52:23.960419998 +0000 UTC m=+1.143641838" watchObservedRunningTime="2025-03-25 01:52:23.967898325 +0000 UTC m=+1.151120164"
Mar 25 01:52:27.922460 sudo[1915]: pam_unix(sudo:session): session closed for user root
Mar 25 01:52:28.084527 sshd[1914]: Connection closed by 139.178.68.195 port 42208
Mar 25 01:52:28.086394 sshd-session[1912]: pam_unix(sshd:session): session closed for user core
Mar 25 01:52:28.092182 systemd[1]: sshd@7-95.217.13.107:22-139.178.68.195:42208.service: Deactivated successfully.
Mar 25 01:52:28.094988 systemd[1]: session-7.scope: Deactivated successfully.
Mar 25 01:52:28.095253 systemd[1]: session-7.scope: Consumed 3.678s CPU time, 183.5M memory peak.
Mar 25 01:52:28.097621 systemd-logind[1483]: Session 7 logged out. Waiting for processes to exit.
Mar 25 01:52:28.099241 systemd-logind[1483]: Removed session 7.
Mar 25 01:52:37.805420 kubelet[2914]: I0325 01:52:37.805380 2914 topology_manager.go:215] "Topology Admit Handler" podUID="1d775481-b25e-4028-9722-a05f571be4ec" podNamespace="kube-system" podName="kube-proxy-mh8xn"
Mar 25 01:52:37.813972 systemd[1]: Created slice kubepods-besteffort-pod1d775481_b25e_4028_9722_a05f571be4ec.slice - libcontainer container kubepods-besteffort-pod1d775481_b25e_4028_9722_a05f571be4ec.slice.
Mar 25 01:52:37.857350 kubelet[2914]: I0325 01:52:37.856962 2914 topology_manager.go:215] "Topology Admit Handler" podUID="ea3b72b2-f324-4485-bdaf-f2b0ef8e196b" podNamespace="tigera-operator" podName="tigera-operator-6479d6dc54-9jtdf"
Mar 25 01:52:37.863808 systemd[1]: Created slice kubepods-besteffort-podea3b72b2_f324_4485_bdaf_f2b0ef8e196b.slice - libcontainer container kubepods-besteffort-podea3b72b2_f324_4485_bdaf_f2b0ef8e196b.slice.
Mar 25 01:52:37.871651 kubelet[2914]: I0325 01:52:37.871590 2914 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Mar 25 01:52:37.872339 containerd[1507]: time="2025-03-25T01:52:37.872114986Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Mar 25 01:52:37.872581 kubelet[2914]: I0325 01:52:37.872299 2914 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Mar 25 01:52:37.887310 kubelet[2914]: I0325 01:52:37.887233 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjldb\" (UniqueName: \"kubernetes.io/projected/ea3b72b2-f324-4485-bdaf-f2b0ef8e196b-kube-api-access-pjldb\") pod \"tigera-operator-6479d6dc54-9jtdf\" (UID: \"ea3b72b2-f324-4485-bdaf-f2b0ef8e196b\") " pod="tigera-operator/tigera-operator-6479d6dc54-9jtdf"
Mar 25 01:52:37.887310 kubelet[2914]: I0325 01:52:37.887262 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv8zp\" (UniqueName: \"kubernetes.io/projected/1d775481-b25e-4028-9722-a05f571be4ec-kube-api-access-rv8zp\") pod \"kube-proxy-mh8xn\" (UID: \"1d775481-b25e-4028-9722-a05f571be4ec\") " pod="kube-system/kube-proxy-mh8xn"
Mar 25 01:52:37.887310 kubelet[2914]: I0325 01:52:37.887308 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ea3b72b2-f324-4485-bdaf-f2b0ef8e196b-var-lib-calico\") pod \"tigera-operator-6479d6dc54-9jtdf\" (UID: \"ea3b72b2-f324-4485-bdaf-f2b0ef8e196b\") " pod="tigera-operator/tigera-operator-6479d6dc54-9jtdf"
Mar 25 01:52:37.887460 kubelet[2914]: I0325 01:52:37.887358 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1d775481-b25e-4028-9722-a05f571be4ec-kube-proxy\") pod \"kube-proxy-mh8xn\" (UID: \"1d775481-b25e-4028-9722-a05f571be4ec\") " pod="kube-system/kube-proxy-mh8xn"
Mar 25 01:52:37.887460 kubelet[2914]: I0325 01:52:37.887374 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1d775481-b25e-4028-9722-a05f571be4ec-xtables-lock\") pod \"kube-proxy-mh8xn\" (UID: \"1d775481-b25e-4028-9722-a05f571be4ec\") " pod="kube-system/kube-proxy-mh8xn"
Mar 25 01:52:37.887460 kubelet[2914]: I0325 01:52:37.887386 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d775481-b25e-4028-9722-a05f571be4ec-lib-modules\") pod \"kube-proxy-mh8xn\" (UID: \"1d775481-b25e-4028-9722-a05f571be4ec\") " pod="kube-system/kube-proxy-mh8xn"
Mar 25 01:52:38.122545 containerd[1507]: time="2025-03-25T01:52:38.122480913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mh8xn,Uid:1d775481-b25e-4028-9722-a05f571be4ec,Namespace:kube-system,Attempt:0,}"
Mar 25 01:52:38.153965 containerd[1507]: time="2025-03-25T01:52:38.153560608Z" level=info msg="connecting to shim b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0" address="unix:///run/containerd/s/beb940f5f37ef6bef990ae867834f8eca97bac948f3cc176ab5f26f6a598e3ed" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:52:38.174014 containerd[1507]: time="2025-03-25T01:52:38.173943735Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-9jtdf,Uid:ea3b72b2-f324-4485-bdaf-f2b0ef8e196b,Namespace:tigera-operator,Attempt:0,}"
Mar 25 01:52:38.198960 systemd[1]: Started cri-containerd-b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0.scope - libcontainer container b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0.
Mar 25 01:52:38.205353 containerd[1507]: time="2025-03-25T01:52:38.205302195Z" level=info msg="connecting to shim ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3" address="unix:///run/containerd/s/c37122c0610307e9fc954239dda132903f94f2ac475ca6be7486ac3c8709b73d" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:52:38.222925 containerd[1507]: time="2025-03-25T01:52:38.222898478Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mh8xn,Uid:1d775481-b25e-4028-9722-a05f571be4ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0\""
Mar 25 01:52:38.226339 containerd[1507]: time="2025-03-25T01:52:38.226207804Z" level=info msg="CreateContainer within sandbox \"b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Mar 25 01:52:38.230458 systemd[1]: Started cri-containerd-ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3.scope - libcontainer container ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3.
Mar 25 01:52:38.239366 containerd[1507]: time="2025-03-25T01:52:38.239282799Z" level=info msg="Container c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:52:38.247588 containerd[1507]: time="2025-03-25T01:52:38.247448726Z" level=info msg="CreateContainer within sandbox \"b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717\""
Mar 25 01:52:38.248534 containerd[1507]: time="2025-03-25T01:52:38.248500185Z" level=info msg="StartContainer for \"c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717\""
Mar 25 01:52:38.251279 containerd[1507]: time="2025-03-25T01:52:38.251183244Z" level=info msg="connecting to shim c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717" address="unix:///run/containerd/s/beb940f5f37ef6bef990ae867834f8eca97bac948f3cc176ab5f26f6a598e3ed" protocol=ttrpc version=3
Mar 25 01:52:38.268436 systemd[1]: Started cri-containerd-c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717.scope - libcontainer container c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717.
Mar 25 01:52:38.280096 containerd[1507]: time="2025-03-25T01:52:38.280057719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6479d6dc54-9jtdf,Uid:ea3b72b2-f324-4485-bdaf-f2b0ef8e196b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3\""
Mar 25 01:52:38.281675 containerd[1507]: time="2025-03-25T01:52:38.281632934Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\""
Mar 25 01:52:38.310473 containerd[1507]: time="2025-03-25T01:52:38.310440494Z" level=info msg="StartContainer for \"c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717\" returns successfully"
Mar 25 01:52:38.968284 kubelet[2914]: I0325 01:52:38.967911 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mh8xn" podStartSLOduration=1.96789578 podStartE2EDuration="1.96789578s" podCreationTimestamp="2025-03-25 01:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:52:38.967442888 +0000 UTC m=+16.150664728" watchObservedRunningTime="2025-03-25 01:52:38.96789578 +0000 UTC m=+16.151117620"
Mar 25 01:52:41.481877 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3248837180.mount: Deactivated successfully.
Mar 25 01:52:41.767224 containerd[1507]: time="2025-03-25T01:52:41.767124333Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:41.768022 containerd[1507]: time="2025-03-25T01:52:41.767941140Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008" Mar 25 01:52:41.768889 containerd[1507]: time="2025-03-25T01:52:41.768846983Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:41.770598 containerd[1507]: time="2025-03-25T01:52:41.770564302Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:41.771167 containerd[1507]: time="2025-03-25T01:52:41.770906186Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 3.489250609s" Mar 25 01:52:41.771167 containerd[1507]: time="2025-03-25T01:52:41.770929288Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\"" Mar 25 01:52:41.777270 containerd[1507]: time="2025-03-25T01:52:41.777245148Z" level=info msg="CreateContainer within sandbox \"ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 25 01:52:41.785305 containerd[1507]: time="2025-03-25T01:52:41.784819723Z" level=info msg="Container 
e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:52:41.788175 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount818637258.mount: Deactivated successfully. Mar 25 01:52:41.800913 containerd[1507]: time="2025-03-25T01:52:41.800884105Z" level=info msg="CreateContainer within sandbox \"ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed\"" Mar 25 01:52:41.801340 containerd[1507]: time="2025-03-25T01:52:41.801280993Z" level=info msg="StartContainer for \"e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed\"" Mar 25 01:52:41.802037 containerd[1507]: time="2025-03-25T01:52:41.801991008Z" level=info msg="connecting to shim e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed" address="unix:///run/containerd/s/c37122c0610307e9fc954239dda132903f94f2ac475ca6be7486ac3c8709b73d" protocol=ttrpc version=3 Mar 25 01:52:41.820213 systemd[1]: Started cri-containerd-e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed.scope - libcontainer container e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed. 
Mar 25 01:52:41.857394 containerd[1507]: time="2025-03-25T01:52:41.857351025Z" level=info msg="StartContainer for \"e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed\" returns successfully" Mar 25 01:52:42.012776 kubelet[2914]: I0325 01:52:42.012578 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6479d6dc54-9jtdf" podStartSLOduration=1.519592879 podStartE2EDuration="5.012314015s" podCreationTimestamp="2025-03-25 01:52:37 +0000 UTC" firstStartedPulling="2025-03-25 01:52:38.281215027 +0000 UTC m=+15.464436867" lastFinishedPulling="2025-03-25 01:52:41.773936164 +0000 UTC m=+18.957158003" observedRunningTime="2025-03-25 01:52:42.012143634 +0000 UTC m=+19.195365484" watchObservedRunningTime="2025-03-25 01:52:42.012314015 +0000 UTC m=+19.195535855" Mar 25 01:52:44.773356 kubelet[2914]: I0325 01:52:44.772572 2914 topology_manager.go:215] "Topology Admit Handler" podUID="f32b19ff-f5ea-4578-b3a8-f141b0932b5b" podNamespace="calico-system" podName="calico-typha-75484d86c8-4r2lp" Mar 25 01:52:44.780016 systemd[1]: Created slice kubepods-besteffort-podf32b19ff_f5ea_4578_b3a8_f141b0932b5b.slice - libcontainer container kubepods-besteffort-podf32b19ff_f5ea_4578_b3a8_f141b0932b5b.slice. Mar 25 01:52:44.846787 kubelet[2914]: I0325 01:52:44.846735 2914 topology_manager.go:215] "Topology Admit Handler" podUID="36897f3a-ec3a-49e4-9101-b172b0544813" podNamespace="calico-system" podName="calico-node-fbkbd" Mar 25 01:52:44.856195 systemd[1]: Created slice kubepods-besteffort-pod36897f3a_ec3a_49e4_9101_b172b0544813.slice - libcontainer container kubepods-besteffort-pod36897f3a_ec3a_49e4_9101_b172b0544813.slice. 
Mar 25 01:52:44.876167 kubelet[2914]: I0325 01:52:44.876122 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f32b19ff-f5ea-4578-b3a8-f141b0932b5b-typha-certs\") pod \"calico-typha-75484d86c8-4r2lp\" (UID: \"f32b19ff-f5ea-4578-b3a8-f141b0932b5b\") " pod="calico-system/calico-typha-75484d86c8-4r2lp" Mar 25 01:52:44.876167 kubelet[2914]: I0325 01:52:44.876163 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m8pd\" (UniqueName: \"kubernetes.io/projected/f32b19ff-f5ea-4578-b3a8-f141b0932b5b-kube-api-access-4m8pd\") pod \"calico-typha-75484d86c8-4r2lp\" (UID: \"f32b19ff-f5ea-4578-b3a8-f141b0932b5b\") " pod="calico-system/calico-typha-75484d86c8-4r2lp" Mar 25 01:52:44.876338 kubelet[2914]: I0325 01:52:44.876180 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f32b19ff-f5ea-4578-b3a8-f141b0932b5b-tigera-ca-bundle\") pod \"calico-typha-75484d86c8-4r2lp\" (UID: \"f32b19ff-f5ea-4578-b3a8-f141b0932b5b\") " pod="calico-system/calico-typha-75484d86c8-4r2lp" Mar 25 01:52:44.969118 kubelet[2914]: I0325 01:52:44.969064 2914 topology_manager.go:215] "Topology Admit Handler" podUID="a86ff892-344f-4f10-be9e-1c061509165d" podNamespace="calico-system" podName="csi-node-driver-zsbnv" Mar 25 01:52:44.969428 kubelet[2914]: E0325 01:52:44.969393 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d" Mar 25 01:52:44.976726 kubelet[2914]: I0325 01:52:44.976631 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-cni-log-dir\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976726 kubelet[2914]: I0325 01:52:44.976674 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-policysync\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976726 kubelet[2914]: I0325 01:52:44.976697 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flrp4\" (UniqueName: \"kubernetes.io/projected/36897f3a-ec3a-49e4-9101-b172b0544813-kube-api-access-flrp4\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976726 kubelet[2914]: I0325 01:52:44.976720 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-lib-modules\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976907 kubelet[2914]: I0325 01:52:44.976740 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36897f3a-ec3a-49e4-9101-b172b0544813-tigera-ca-bundle\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976907 kubelet[2914]: I0325 01:52:44.976758 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-var-lib-calico\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976907 kubelet[2914]: I0325 01:52:44.976776 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-cni-bin-dir\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976907 kubelet[2914]: I0325 01:52:44.976818 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/36897f3a-ec3a-49e4-9101-b172b0544813-node-certs\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.976907 kubelet[2914]: I0325 01:52:44.976855 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-xtables-lock\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.977010 kubelet[2914]: I0325 01:52:44.976875 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-cni-net-dir\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.977010 kubelet[2914]: I0325 01:52:44.976921 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-var-run-calico\") pod 
\"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:44.977010 kubelet[2914]: I0325 01:52:44.976943 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/36897f3a-ec3a-49e4-9101-b172b0544813-flexvol-driver-host\") pod \"calico-node-fbkbd\" (UID: \"36897f3a-ec3a-49e4-9101-b172b0544813\") " pod="calico-system/calico-node-fbkbd" Mar 25 01:52:45.078251 kubelet[2914]: I0325 01:52:45.078137 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpdzz\" (UniqueName: \"kubernetes.io/projected/a86ff892-344f-4f10-be9e-1c061509165d-kube-api-access-kpdzz\") pod \"csi-node-driver-zsbnv\" (UID: \"a86ff892-344f-4f10-be9e-1c061509165d\") " pod="calico-system/csi-node-driver-zsbnv" Mar 25 01:52:45.078251 kubelet[2914]: I0325 01:52:45.078181 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a86ff892-344f-4f10-be9e-1c061509165d-socket-dir\") pod \"csi-node-driver-zsbnv\" (UID: \"a86ff892-344f-4f10-be9e-1c061509165d\") " pod="calico-system/csi-node-driver-zsbnv" Mar 25 01:52:45.078251 kubelet[2914]: I0325 01:52:45.078207 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a86ff892-344f-4f10-be9e-1c061509165d-registration-dir\") pod \"csi-node-driver-zsbnv\" (UID: \"a86ff892-344f-4f10-be9e-1c061509165d\") " pod="calico-system/csi-node-driver-zsbnv" Mar 25 01:52:45.078251 kubelet[2914]: I0325 01:52:45.078231 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/a86ff892-344f-4f10-be9e-1c061509165d-varrun\") pod \"csi-node-driver-zsbnv\" (UID: 
\"a86ff892-344f-4f10-be9e-1c061509165d\") " pod="calico-system/csi-node-driver-zsbnv" Mar 25 01:52:45.078251 kubelet[2914]: I0325 01:52:45.078245 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a86ff892-344f-4f10-be9e-1c061509165d-kubelet-dir\") pod \"csi-node-driver-zsbnv\" (UID: \"a86ff892-344f-4f10-be9e-1c061509165d\") " pod="calico-system/csi-node-driver-zsbnv" Mar 25 01:52:45.084597 kubelet[2914]: E0325 01:52:45.084515 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.084597 kubelet[2914]: W0325 01:52:45.084530 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.084597 kubelet[2914]: E0325 01:52:45.084577 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.087343 containerd[1507]: time="2025-03-25T01:52:45.085058405Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-75484d86c8-4r2lp,Uid:f32b19ff-f5ea-4578-b3a8-f141b0932b5b,Namespace:calico-system,Attempt:0,}" Mar 25 01:52:45.095475 kubelet[2914]: E0325 01:52:45.094994 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.095475 kubelet[2914]: W0325 01:52:45.095010 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.095475 kubelet[2914]: E0325 01:52:45.095441 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.116143 containerd[1507]: time="2025-03-25T01:52:45.116082504Z" level=info msg="connecting to shim d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6" address="unix:///run/containerd/s/5576dc8e6875492f974c8861c14fb3ff932312252b3ca1e380efbc759b652066" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:52:45.137473 systemd[1]: Started cri-containerd-d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6.scope - libcontainer container d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6. 
Mar 25 01:52:45.159992 containerd[1507]: time="2025-03-25T01:52:45.159471872Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fbkbd,Uid:36897f3a-ec3a-49e4-9101-b172b0544813,Namespace:calico-system,Attempt:0,}" Mar 25 01:52:45.179582 kubelet[2914]: E0325 01:52:45.179246 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.179582 kubelet[2914]: W0325 01:52:45.179284 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.179582 kubelet[2914]: E0325 01:52:45.179304 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.180140 kubelet[2914]: E0325 01:52:45.179961 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.180140 kubelet[2914]: W0325 01:52:45.179972 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.180216 kubelet[2914]: E0325 01:52:45.180147 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.180216 kubelet[2914]: W0325 01:52:45.180157 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.180216 kubelet[2914]: E0325 01:52:45.180169 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, 
skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.181346 kubelet[2914]: E0325 01:52:45.180349 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.181346 kubelet[2914]: E0325 01:52:45.180510 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.181346 kubelet[2914]: W0325 01:52:45.180518 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.181346 kubelet[2914]: E0325 01:52:45.180530 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.181346 kubelet[2914]: E0325 01:52:45.180783 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.181346 kubelet[2914]: W0325 01:52:45.180791 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.181346 kubelet[2914]: E0325 01:52:45.180799 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.182892 kubelet[2914]: E0325 01:52:45.182873 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.182892 kubelet[2914]: W0325 01:52:45.182888 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.183364 kubelet[2914]: E0325 01:52:45.182973 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.183364 kubelet[2914]: E0325 01:52:45.183067 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.183364 kubelet[2914]: W0325 01:52:45.183075 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.183364 kubelet[2914]: E0325 01:52:45.183284 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.183769 kubelet[2914]: E0325 01:52:45.183428 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.183769 kubelet[2914]: W0325 01:52:45.183435 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.183769 kubelet[2914]: E0325 01:52:45.183456 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.183769 kubelet[2914]: E0325 01:52:45.183615 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.183769 kubelet[2914]: W0325 01:52:45.183623 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.183769 kubelet[2914]: E0325 01:52:45.183633 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.183999 kubelet[2914]: E0325 01:52:45.183789 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.183999 kubelet[2914]: W0325 01:52:45.183795 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.183999 kubelet[2914]: E0325 01:52:45.183811 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.183999 kubelet[2914]: E0325 01:52:45.183980 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.183999 kubelet[2914]: W0325 01:52:45.183986 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.184147 kubelet[2914]: E0325 01:52:45.184003 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.184147 kubelet[2914]: E0325 01:52:45.184130 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.184147 kubelet[2914]: W0325 01:52:45.184137 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.184231 kubelet[2914]: E0325 01:52:45.184155 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.186583 kubelet[2914]: E0325 01:52:45.184307 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186583 kubelet[2914]: W0325 01:52:45.184316 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186583 kubelet[2914]: E0325 01:52:45.184420 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.186583 kubelet[2914]: E0325 01:52:45.184520 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186583 kubelet[2914]: W0325 01:52:45.184526 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186583 kubelet[2914]: E0325 01:52:45.184535 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.186583 kubelet[2914]: E0325 01:52:45.184669 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186583 kubelet[2914]: W0325 01:52:45.184675 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186583 kubelet[2914]: E0325 01:52:45.184723 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.186583 kubelet[2914]: E0325 01:52:45.184790 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186783 kubelet[2914]: W0325 01:52:45.184795 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186783 kubelet[2914]: E0325 01:52:45.184876 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.186783 kubelet[2914]: E0325 01:52:45.184950 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186783 kubelet[2914]: W0325 01:52:45.184956 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186783 kubelet[2914]: E0325 01:52:45.184964 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.186783 kubelet[2914]: E0325 01:52:45.185094 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186783 kubelet[2914]: W0325 01:52:45.185101 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186783 kubelet[2914]: E0325 01:52:45.185110 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.186783 kubelet[2914]: E0325 01:52:45.185239 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186783 kubelet[2914]: W0325 01:52:45.185246 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186991 kubelet[2914]: E0325 01:52:45.185263 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:45.186991 kubelet[2914]: E0325 01:52:45.185539 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186991 kubelet[2914]: W0325 01:52:45.185546 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186991 kubelet[2914]: E0325 01:52:45.185555 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:45.186991 kubelet[2914]: E0325 01:52:45.185714 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:45.186991 kubelet[2914]: W0325 01:52:45.185721 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:45.186991 kubelet[2914]: E0325 01:52:45.185737 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 25 01:52:45.186991 kubelet[2914]: E0325 01:52:45.186036 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:45.186991 kubelet[2914]: W0325 01:52:45.186043 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:45.186991 kubelet[2914]: E0325 01:52:45.186052 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:45.187149 kubelet[2914]: E0325 01:52:45.186180 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:45.187149 kubelet[2914]: W0325 01:52:45.186187 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:45.187149 kubelet[2914]: E0325 01:52:45.186204 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:45.187149 kubelet[2914]: E0325 01:52:45.186384 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:45.187149 kubelet[2914]: W0325 01:52:45.186391 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:45.187149 kubelet[2914]: E0325 01:52:45.186400 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:45.187149 kubelet[2914]: E0325 01:52:45.186526 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:45.187149 kubelet[2914]: W0325 01:52:45.186533 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:45.187149 kubelet[2914]: E0325 01:52:45.186539 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:45.188382 containerd[1507]: time="2025-03-25T01:52:45.183482469Z" level=info msg="connecting to shim 54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd" address="unix:///run/containerd/s/c9d5e41c3cfb912a4eb86939bfeac9b1fc6988aff31dca8d423b177c26782245" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:52:45.200066 kubelet[2914]: E0325 01:52:45.200000 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:45.200066 kubelet[2914]: W0325 01:52:45.200015 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:45.200066 kubelet[2914]: E0325 01:52:45.200030 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:45.208063 containerd[1507]: time="2025-03-25T01:52:45.207858664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-75484d86c8-4r2lp,Uid:f32b19ff-f5ea-4578-b3a8-f141b0932b5b,Namespace:calico-system,Attempt:0,} returns sandbox id \"d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6\""
Mar 25 01:52:45.210666 containerd[1507]: time="2025-03-25T01:52:45.210647455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\""
Mar 25 01:52:45.223455 systemd[1]: Started cri-containerd-54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd.scope - libcontainer container 54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd.
Mar 25 01:52:45.253117 containerd[1507]: time="2025-03-25T01:52:45.252581108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-fbkbd,Uid:36897f3a-ec3a-49e4-9101-b172b0544813,Namespace:calico-system,Attempt:0,} returns sandbox id \"54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd\""
Mar 25 01:52:46.899535 kubelet[2914]: E0325 01:52:46.898565 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d"
Mar 25 01:52:48.131094 containerd[1507]: time="2025-03-25T01:52:48.131045033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:48.132340 containerd[1507]: time="2025-03-25T01:52:48.132269253Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075"
Mar 25 01:52:48.133460 containerd[1507]: time="2025-03-25T01:52:48.133415005Z" level=info msg="ImageCreate event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:48.135136 containerd[1507]: time="2025-03-25T01:52:48.135118134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:52:48.135774 containerd[1507]: time="2025-03-25T01:52:48.135497357Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 2.924709577s"
Mar 25 01:52:48.135774 containerd[1507]: time="2025-03-25T01:52:48.135524958Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\""
Mar 25 01:52:48.137052 containerd[1507]: time="2025-03-25T01:52:48.137024787Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\""
Mar 25 01:52:48.154653 containerd[1507]: time="2025-03-25T01:52:48.154394407Z" level=info msg="CreateContainer within sandbox \"d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 25 01:52:48.172496 containerd[1507]: time="2025-03-25T01:52:48.172476927Z" level=info msg="Container 5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:52:48.186557 containerd[1507]: time="2025-03-25T01:52:48.186515641Z" level=info msg="CreateContainer within sandbox \"d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7\""
Mar 25 01:52:48.188396 containerd[1507]: time="2025-03-25T01:52:48.188052748Z" level=info msg="StartContainer for \"5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7\""
Mar 25 01:52:48.189829 containerd[1507]: time="2025-03-25T01:52:48.189804308Z" level=info msg="connecting to shim 5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7" address="unix:///run/containerd/s/5576dc8e6875492f974c8861c14fb3ff932312252b3ca1e380efbc759b652066" protocol=ttrpc version=3
Mar 25 01:52:48.212452 systemd[1]: Started cri-containerd-5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7.scope - libcontainer container 5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7.
Mar 25 01:52:48.262569 containerd[1507]: time="2025-03-25T01:52:48.262484567Z" level=info msg="StartContainer for \"5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7\" returns successfully"
Mar 25 01:52:48.899006 kubelet[2914]: E0325 01:52:48.898549 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d"
Mar 25 01:52:49.034984 kubelet[2914]: I0325 01:52:49.032219 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-75484d86c8-4r2lp" podStartSLOduration=2.105328814 podStartE2EDuration="5.032195806s" podCreationTimestamp="2025-03-25 01:52:44 +0000 UTC" firstStartedPulling="2025-03-25 01:52:45.209966596 +0000 UTC m=+22.393188435" lastFinishedPulling="2025-03-25 01:52:48.136833587 +0000 UTC m=+25.320055427" observedRunningTime="2025-03-25 01:52:49.030814291 +0000 UTC m=+26.214036161" watchObservedRunningTime="2025-03-25 01:52:49.032195806 +0000 UTC m=+26.215417686"
Mar 25 01:52:49.064541 kubelet[2914]: E0325 01:52:49.064490 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.064541 kubelet[2914]: W0325 01:52:49.064524 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.064541 kubelet[2914]: E0325 01:52:49.064547 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 25 01:52:49.065558 kubelet[2914]: E0325 01:52:49.064965 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.065558 kubelet[2914]: W0325 01:52:49.064993 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.065558 kubelet[2914]: E0325 01:52:49.065021 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.065558 kubelet[2914]: E0325 01:52:49.065384 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.065558 kubelet[2914]: W0325 01:52:49.065398 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.065558 kubelet[2914]: E0325 01:52:49.065422 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.065799 kubelet[2914]: E0325 01:52:49.065646 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.065799 kubelet[2914]: W0325 01:52:49.065656 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.065799 kubelet[2914]: E0325 01:52:49.065666 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.065972 kubelet[2914]: E0325 01:52:49.065821 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.065972 kubelet[2914]: W0325 01:52:49.065830 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.065972 kubelet[2914]: E0325 01:52:49.065838 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.066080 kubelet[2914]: E0325 01:52:49.066031 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.066080 kubelet[2914]: W0325 01:52:49.066041 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.066080 kubelet[2914]: E0325 01:52:49.066050 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.066193 kubelet[2914]: E0325 01:52:49.066181 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.066193 kubelet[2914]: W0325 01:52:49.066189 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.066314 kubelet[2914]: E0325 01:52:49.066197 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.066506 kubelet[2914]: E0325 01:52:49.066373 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.066506 kubelet[2914]: W0325 01:52:49.066381 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.066506 kubelet[2914]: E0325 01:52:49.066389 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.066678 kubelet[2914]: E0325 01:52:49.066535 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.066678 kubelet[2914]: W0325 01:52:49.066544 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.066678 kubelet[2914]: E0325 01:52:49.066552 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.066858 kubelet[2914]: E0325 01:52:49.066691 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.066858 kubelet[2914]: W0325 01:52:49.066699 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.066858 kubelet[2914]: E0325 01:52:49.066708 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.066993 kubelet[2914]: E0325 01:52:49.066930 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.066993 kubelet[2914]: W0325 01:52:49.066943 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.066993 kubelet[2914]: E0325 01:52:49.066956 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.067159 kubelet[2914]: E0325 01:52:49.067114 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.067159 kubelet[2914]: W0325 01:52:49.067135 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.067159 kubelet[2914]: E0325 01:52:49.067148 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.067519 kubelet[2914]: E0325 01:52:49.067300 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.067519 kubelet[2914]: W0325 01:52:49.067308 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.067519 kubelet[2914]: E0325 01:52:49.067316 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.067519 kubelet[2914]: E0325 01:52:49.067479 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.067519 kubelet[2914]: W0325 01:52:49.067487 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.067519 kubelet[2914]: E0325 01:52:49.067495 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.067833 kubelet[2914]: E0325 01:52:49.067652 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.067833 kubelet[2914]: W0325 01:52:49.067661 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.067833 kubelet[2914]: E0325 01:52:49.067668 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Mar 25 01:52:49.109288 kubelet[2914]: E0325 01:52:49.109239 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.109288 kubelet[2914]: W0325 01:52:49.109278 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.109485 kubelet[2914]: E0325 01:52:49.109307 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.109954 kubelet[2914]: E0325 01:52:49.109910 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.109954 kubelet[2914]: W0325 01:52:49.109937 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.110174 kubelet[2914]: E0325 01:52:49.109965 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.110415 kubelet[2914]: E0325 01:52:49.110385 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.110475 kubelet[2914]: W0325 01:52:49.110411 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.110504 kubelet[2914]: E0325 01:52:49.110478 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.110862 kubelet[2914]: E0325 01:52:49.110823 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.111069 kubelet[2914]: W0325 01:52:49.110907 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.111069 kubelet[2914]: E0325 01:52:49.110936 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.111452 kubelet[2914]: E0325 01:52:49.111412 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.111506 kubelet[2914]: W0325 01:52:49.111456 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.111789 kubelet[2914]: E0325 01:52:49.111752 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.111789 kubelet[2914]: E0325 01:52:49.111771 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.111789 kubelet[2914]: W0325 01:52:49.111787 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.112160 kubelet[2914]: E0325 01:52:49.111959 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.112292 kubelet[2914]: E0325 01:52:49.112214 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.112292 kubelet[2914]: W0325 01:52:49.112242 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.112828 kubelet[2914]: E0325 01:52:49.112547 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.112828 kubelet[2914]: E0325 01:52:49.112602 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.112828 kubelet[2914]: W0325 01:52:49.112617 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.112828 kubelet[2914]: E0325 01:52:49.112661 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.113381 kubelet[2914]: E0325 01:52:49.113002 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.113381 kubelet[2914]: W0325 01:52:49.113016 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.113381 kubelet[2914]: E0325 01:52:49.113057 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.113793 kubelet[2914]: E0325 01:52:49.113763 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.113793 kubelet[2914]: W0325 01:52:49.113788 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.113917 kubelet[2914]: E0325 01:52:49.113815 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.114136 kubelet[2914]: E0325 01:52:49.114108 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.114136 kubelet[2914]: W0325 01:52:49.114130 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.114371 kubelet[2914]: E0325 01:52:49.114217 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 25 01:52:49.114527 kubelet[2914]: E0325 01:52:49.114428 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:52:49.114527 kubelet[2914]: W0325 01:52:49.114449 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:52:49.114527 kubelet[2914]: E0325 01:52:49.114500 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 25 01:52:49.114723 kubelet[2914]: E0325 01:52:49.114696 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:49.114723 kubelet[2914]: W0325 01:52:49.114717 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:49.114788 kubelet[2914]: E0325 01:52:49.114741 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:49.115081 kubelet[2914]: E0325 01:52:49.115052 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:49.115081 kubelet[2914]: W0325 01:52:49.115076 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:49.115208 kubelet[2914]: E0325 01:52:49.115100 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:49.115689 kubelet[2914]: E0325 01:52:49.115566 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:49.115689 kubelet[2914]: W0325 01:52:49.115589 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:49.115689 kubelet[2914]: E0325 01:52:49.115612 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:49.116012 kubelet[2914]: E0325 01:52:49.115959 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:49.116012 kubelet[2914]: W0325 01:52:49.115989 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:49.116012 kubelet[2914]: E0325 01:52:49.116005 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:49.116621 kubelet[2914]: E0325 01:52:49.116277 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:49.116621 kubelet[2914]: W0325 01:52:49.116291 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:49.116621 kubelet[2914]: E0325 01:52:49.116304 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:49.116963 kubelet[2914]: E0325 01:52:49.116931 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:49.116963 kubelet[2914]: W0325 01:52:49.116961 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:49.117041 kubelet[2914]: E0325 01:52:49.116979 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:50.010932 kubelet[2914]: I0325 01:52:50.009936 2914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:52:50.075408 kubelet[2914]: E0325 01:52:50.075291 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:50.075408 kubelet[2914]: W0325 01:52:50.075309 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:50.075408 kubelet[2914]: E0325 01:52:50.075351 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:52:50.076002 kubelet[2914]: E0325 01:52:50.075732 2914 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:52:50.076002 kubelet[2914]: W0325 01:52:50.075744 2914 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:52:50.076002 kubelet[2914]: E0325 01:52:50.075756 2914 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:52:50.175980 containerd[1507]: time="2025-03-25T01:52:50.175939787Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:50.176828 containerd[1507]: time="2025-03-25T01:52:50.176784493Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 25 01:52:50.177732 containerd[1507]: time="2025-03-25T01:52:50.177693420Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:50.179021 containerd[1507]: time="2025-03-25T01:52:50.179004422Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:50.179756 containerd[1507]: time="2025-03-25T01:52:50.179493701Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size \"6857075\" in 2.042441523s" Mar 25 01:52:50.179756 containerd[1507]: time="2025-03-25T01:52:50.179518537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 25 01:52:50.181620 containerd[1507]: time="2025-03-25T01:52:50.181596902Z" level=info msg="CreateContainer within sandbox \"54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:52:50.190955 containerd[1507]: time="2025-03-25T01:52:50.190771632Z" level=info msg="Container f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:52:50.199167 containerd[1507]: time="2025-03-25T01:52:50.199136462Z" level=info msg="CreateContainer within sandbox \"54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4\"" Mar 25 01:52:50.199831 containerd[1507]: time="2025-03-25T01:52:50.199805599Z" level=info msg="StartContainer for \"f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4\"" Mar 25 01:52:50.200876 containerd[1507]: time="2025-03-25T01:52:50.200828169Z" level=info msg="connecting to shim f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4" address="unix:///run/containerd/s/c9d5e41c3cfb912a4eb86939bfeac9b1fc6988aff31dca8d423b177c26782245" protocol=ttrpc version=3 Mar 25 01:52:50.218446 systemd[1]: Started cri-containerd-f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4.scope - libcontainer container 
f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4. Mar 25 01:52:50.254876 containerd[1507]: time="2025-03-25T01:52:50.254830962Z" level=info msg="StartContainer for \"f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4\" returns successfully" Mar 25 01:52:50.267676 systemd[1]: cri-containerd-f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4.scope: Deactivated successfully. Mar 25 01:52:50.295407 containerd[1507]: time="2025-03-25T01:52:50.294053244Z" level=info msg="received exit event container_id:\"f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4\" id:\"f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4\" pid:3534 exited_at:{seconds:1742867570 nanos:269668878}" Mar 25 01:52:50.320796 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4-rootfs.mount: Deactivated successfully. Mar 25 01:52:50.334811 containerd[1507]: time="2025-03-25T01:52:50.325275562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4\" id:\"f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4\" pid:3534 exited_at:{seconds:1742867570 nanos:269668878}" Mar 25 01:52:50.898570 kubelet[2914]: E0325 01:52:50.898069 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d" Mar 25 01:52:51.019604 containerd[1507]: time="2025-03-25T01:52:51.019437124Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:52:51.699375 kubelet[2914]: I0325 01:52:51.698880 2914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:52:52.900025 kubelet[2914]: E0325 
01:52:52.899481 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d" Mar 25 01:52:54.899118 kubelet[2914]: E0325 01:52:54.898553 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d" Mar 25 01:52:56.899603 kubelet[2914]: E0325 01:52:56.899554 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d" Mar 25 01:52:57.411181 containerd[1507]: time="2025-03-25T01:52:57.411123213Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:57.412072 containerd[1507]: time="2025-03-25T01:52:57.412014045Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 25 01:52:57.412906 containerd[1507]: time="2025-03-25T01:52:57.412853039Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:57.414956 containerd[1507]: time="2025-03-25T01:52:57.414911031Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:52:57.415708 containerd[1507]: time="2025-03-25T01:52:57.415404427Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 6.395521286s" Mar 25 01:52:57.415708 containerd[1507]: time="2025-03-25T01:52:57.415439854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 25 01:52:57.417607 containerd[1507]: time="2025-03-25T01:52:57.417572716Z" level=info msg="CreateContainer within sandbox \"54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:52:57.428821 containerd[1507]: time="2025-03-25T01:52:57.426566813Z" level=info msg="Container 87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:52:57.447474 containerd[1507]: time="2025-03-25T01:52:57.447435262Z" level=info msg="CreateContainer within sandbox \"54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b\"" Mar 25 01:52:57.448518 containerd[1507]: time="2025-03-25T01:52:57.448482528Z" level=info msg="StartContainer for \"87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b\"" Mar 25 01:52:57.449917 containerd[1507]: time="2025-03-25T01:52:57.449882926Z" level=info msg="connecting to shim 87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b" 
address="unix:///run/containerd/s/c9d5e41c3cfb912a4eb86939bfeac9b1fc6988aff31dca8d423b177c26782245" protocol=ttrpc version=3 Mar 25 01:52:57.487477 systemd[1]: Started cri-containerd-87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b.scope - libcontainer container 87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b. Mar 25 01:52:57.523863 containerd[1507]: time="2025-03-25T01:52:57.523792877Z" level=info msg="StartContainer for \"87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b\" returns successfully" Mar 25 01:52:57.902675 systemd[1]: cri-containerd-87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b.scope: Deactivated successfully. Mar 25 01:52:57.902934 systemd[1]: cri-containerd-87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b.scope: Consumed 365ms CPU time, 151.9M memory peak, 3.1M read from disk, 154M written to disk. Mar 25 01:52:57.904789 containerd[1507]: time="2025-03-25T01:52:57.904734853Z" level=info msg="received exit event container_id:\"87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b\" id:\"87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b\" pid:3594 exited_at:{seconds:1742867577 nanos:904527934}" Mar 25 01:52:57.907283 containerd[1507]: time="2025-03-25T01:52:57.907115980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b\" id:\"87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b\" pid:3594 exited_at:{seconds:1742867577 nanos:904527934}" Mar 25 01:52:57.952553 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b-rootfs.mount: Deactivated successfully. 
Mar 25 01:52:57.980215 kubelet[2914]: I0325 01:52:57.980175 2914 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 25 01:52:58.007426 kubelet[2914]: I0325 01:52:58.005962 2914 topology_manager.go:215] "Topology Admit Handler" podUID="4197106e-bb3c-4d91-a3a1-29d9a63dea11" podNamespace="kube-system" podName="coredns-7db6d8ff4d-xjt4l" Mar 25 01:52:58.015496 kubelet[2914]: I0325 01:52:58.015470 2914 topology_manager.go:215] "Topology Admit Handler" podUID="5aafe162-9d82-4750-854d-710eb1f52529" podNamespace="kube-system" podName="coredns-7db6d8ff4d-qfxw4" Mar 25 01:52:58.015629 kubelet[2914]: I0325 01:52:58.015603 2914 topology_manager.go:215] "Topology Admit Handler" podUID="d5827ecd-bdbb-4e45-81f1-2bbc6444be50" podNamespace="calico-system" podName="calico-kube-controllers-54675b8544-42csf" Mar 25 01:52:58.017554 kubelet[2914]: I0325 01:52:58.016827 2914 topology_manager.go:215] "Topology Admit Handler" podUID="9b2cfa8a-0c0c-432c-a1ec-2a55980a855f" podNamespace="calico-apiserver" podName="calico-apiserver-7d6975569f-qrwcl" Mar 25 01:52:58.017967 kubelet[2914]: I0325 01:52:58.017865 2914 topology_manager.go:215] "Topology Admit Handler" podUID="b03cc102-7dcd-4592-988f-8209a4d3c8d8" podNamespace="calico-apiserver" podName="calico-apiserver-7d6975569f-xkvtt" Mar 25 01:52:58.022126 systemd[1]: Created slice kubepods-burstable-pod4197106e_bb3c_4d91_a3a1_29d9a63dea11.slice - libcontainer container kubepods-burstable-pod4197106e_bb3c_4d91_a3a1_29d9a63dea11.slice. Mar 25 01:52:58.028889 systemd[1]: Created slice kubepods-besteffort-podd5827ecd_bdbb_4e45_81f1_2bbc6444be50.slice - libcontainer container kubepods-besteffort-podd5827ecd_bdbb_4e45_81f1_2bbc6444be50.slice. Mar 25 01:52:58.035893 systemd[1]: Created slice kubepods-besteffort-pod9b2cfa8a_0c0c_432c_a1ec_2a55980a855f.slice - libcontainer container kubepods-besteffort-pod9b2cfa8a_0c0c_432c_a1ec_2a55980a855f.slice. 
Mar 25 01:52:58.044842 containerd[1507]: time="2025-03-25T01:52:58.043045953Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:52:58.046483 systemd[1]: Created slice kubepods-burstable-pod5aafe162_9d82_4750_854d_710eb1f52529.slice - libcontainer container kubepods-burstable-pod5aafe162_9d82_4750_854d_710eb1f52529.slice. Mar 25 01:52:58.053050 systemd[1]: Created slice kubepods-besteffort-podb03cc102_7dcd_4592_988f_8209a4d3c8d8.slice - libcontainer container kubepods-besteffort-podb03cc102_7dcd_4592_988f_8209a4d3c8d8.slice. Mar 25 01:52:58.076500 kubelet[2914]: I0325 01:52:58.076160 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b03cc102-7dcd-4592-988f-8209a4d3c8d8-calico-apiserver-certs\") pod \"calico-apiserver-7d6975569f-xkvtt\" (UID: \"b03cc102-7dcd-4592-988f-8209a4d3c8d8\") " pod="calico-apiserver/calico-apiserver-7d6975569f-xkvtt" Mar 25 01:52:58.076500 kubelet[2914]: I0325 01:52:58.076196 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zr64\" (UniqueName: \"kubernetes.io/projected/b03cc102-7dcd-4592-988f-8209a4d3c8d8-kube-api-access-2zr64\") pod \"calico-apiserver-7d6975569f-xkvtt\" (UID: \"b03cc102-7dcd-4592-988f-8209a4d3c8d8\") " pod="calico-apiserver/calico-apiserver-7d6975569f-xkvtt" Mar 25 01:52:58.076500 kubelet[2914]: I0325 01:52:58.076213 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4197106e-bb3c-4d91-a3a1-29d9a63dea11-config-volume\") pod \"coredns-7db6d8ff4d-xjt4l\" (UID: \"4197106e-bb3c-4d91-a3a1-29d9a63dea11\") " pod="kube-system/coredns-7db6d8ff4d-xjt4l" Mar 25 01:52:58.076500 kubelet[2914]: I0325 01:52:58.076233 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/5aafe162-9d82-4750-854d-710eb1f52529-config-volume\") pod \"coredns-7db6d8ff4d-qfxw4\" (UID: \"5aafe162-9d82-4750-854d-710eb1f52529\") " pod="kube-system/coredns-7db6d8ff4d-qfxw4" Mar 25 01:52:58.076500 kubelet[2914]: I0325 01:52:58.076244 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9b2cfa8a-0c0c-432c-a1ec-2a55980a855f-calico-apiserver-certs\") pod \"calico-apiserver-7d6975569f-qrwcl\" (UID: \"9b2cfa8a-0c0c-432c-a1ec-2a55980a855f\") " pod="calico-apiserver/calico-apiserver-7d6975569f-qrwcl" Mar 25 01:52:58.077617 kubelet[2914]: I0325 01:52:58.076258 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b82x\" (UniqueName: \"kubernetes.io/projected/4197106e-bb3c-4d91-a3a1-29d9a63dea11-kube-api-access-7b82x\") pod \"coredns-7db6d8ff4d-xjt4l\" (UID: \"4197106e-bb3c-4d91-a3a1-29d9a63dea11\") " pod="kube-system/coredns-7db6d8ff4d-xjt4l" Mar 25 01:52:58.077617 kubelet[2914]: I0325 01:52:58.076270 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d5827ecd-bdbb-4e45-81f1-2bbc6444be50-tigera-ca-bundle\") pod \"calico-kube-controllers-54675b8544-42csf\" (UID: \"d5827ecd-bdbb-4e45-81f1-2bbc6444be50\") " pod="calico-system/calico-kube-controllers-54675b8544-42csf" Mar 25 01:52:58.077617 kubelet[2914]: I0325 01:52:58.076283 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94b4q\" (UniqueName: \"kubernetes.io/projected/d5827ecd-bdbb-4e45-81f1-2bbc6444be50-kube-api-access-94b4q\") pod \"calico-kube-controllers-54675b8544-42csf\" (UID: \"d5827ecd-bdbb-4e45-81f1-2bbc6444be50\") " pod="calico-system/calico-kube-controllers-54675b8544-42csf" Mar 25 01:52:58.077617 
kubelet[2914]: I0325 01:52:58.076299 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4q4jv\" (UniqueName: \"kubernetes.io/projected/5aafe162-9d82-4750-854d-710eb1f52529-kube-api-access-4q4jv\") pod \"coredns-7db6d8ff4d-qfxw4\" (UID: \"5aafe162-9d82-4750-854d-710eb1f52529\") " pod="kube-system/coredns-7db6d8ff4d-qfxw4" Mar 25 01:52:58.077617 kubelet[2914]: I0325 01:52:58.076341 2914 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvllb\" (UniqueName: \"kubernetes.io/projected/9b2cfa8a-0c0c-432c-a1ec-2a55980a855f-kube-api-access-fvllb\") pod \"calico-apiserver-7d6975569f-qrwcl\" (UID: \"9b2cfa8a-0c0c-432c-a1ec-2a55980a855f\") " pod="calico-apiserver/calico-apiserver-7d6975569f-qrwcl" Mar 25 01:52:58.327583 containerd[1507]: time="2025-03-25T01:52:58.327448226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-xjt4l,Uid:4197106e-bb3c-4d91-a3a1-29d9a63dea11,Namespace:kube-system,Attempt:0,}" Mar 25 01:52:58.334716 containerd[1507]: time="2025-03-25T01:52:58.334535674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54675b8544-42csf,Uid:d5827ecd-bdbb-4e45-81f1-2bbc6444be50,Namespace:calico-system,Attempt:0,}" Mar 25 01:52:58.355747 containerd[1507]: time="2025-03-25T01:52:58.355203914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qfxw4,Uid:5aafe162-9d82-4750-854d-710eb1f52529,Namespace:kube-system,Attempt:0,}" Mar 25 01:52:58.367468 containerd[1507]: time="2025-03-25T01:52:58.366883068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d6975569f-qrwcl,Uid:9b2cfa8a-0c0c-432c-a1ec-2a55980a855f,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:52:58.368451 containerd[1507]: time="2025-03-25T01:52:58.368397548Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7d6975569f-xkvtt,Uid:b03cc102-7dcd-4592-988f-8209a4d3c8d8,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:52:58.579080 containerd[1507]: time="2025-03-25T01:52:58.578956158Z" level=error msg="Failed to destroy network for sandbox \"b2aa4bb9563aa1f7e2de7d62ddcf87bf80743afe1acfbec392613d358c027579\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.579912 containerd[1507]: time="2025-03-25T01:52:58.579531899Z" level=error msg="Failed to destroy network for sandbox \"df5b38d10c8d3d3721c68ae5db3889bb068741d854205c1d9631ec2e08e7ddbf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.582439 systemd[1]: run-netns-cni\x2da0f2126f\x2d75a0\x2ddee2\x2d4a80\x2d83ba47e0aafb.mount: Deactivated successfully. Mar 25 01:52:58.588891 systemd[1]: run-netns-cni\x2df743c7ad\x2de9cb\x2de38c\x2d796e\x2d9b9fb62417b4.mount: Deactivated successfully. 
Mar 25 01:52:58.596141 containerd[1507]: time="2025-03-25T01:52:58.584079651Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54675b8544-42csf,Uid:d5827ecd-bdbb-4e45-81f1-2bbc6444be50,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"df5b38d10c8d3d3721c68ae5db3889bb068741d854205c1d9631ec2e08e7ddbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.596282 containerd[1507]: time="2025-03-25T01:52:58.591245376Z" level=error msg="Failed to destroy network for sandbox \"49aa28761ceeb2310c61068aad20e5b6dda7c91bee21e5b5c129ff2c6dec3a98\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.596825 containerd[1507]: time="2025-03-25T01:52:58.591269412Z" level=error msg="Failed to destroy network for sandbox \"87a05b0ba9fb06fa9eef9f71d166562c7d2401142945ee864a2926451a8651f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.597431 containerd[1507]: time="2025-03-25T01:52:58.593161131Z" level=error msg="Failed to destroy network for sandbox \"6dbe14f514d4301e910544c883cf17919b9a0e720a7f13a24e4b8efdd0e9b1fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.597669 containerd[1507]: time="2025-03-25T01:52:58.593649908Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-qfxw4,Uid:5aafe162-9d82-4750-854d-710eb1f52529,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2aa4bb9563aa1f7e2de7d62ddcf87bf80743afe1acfbec392613d358c027579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.598800 systemd[1]: run-netns-cni\x2d88814676\x2d3f1f\x2d453a\x2d5fb4\x2d66d2c9b7f038.mount: Deactivated successfully. Mar 25 01:52:58.599365 systemd[1]: run-netns-cni\x2d1943148e\x2dbde7\x2d036e\x2dd58c\x2dc71b559a22d2.mount: Deactivated successfully. Mar 25 01:52:58.603109 systemd[1]: run-netns-cni\x2dee4feeaf\x2d8439\x2ded02\x2d336f\x2d8cfcade93d28.mount: Deactivated successfully. Mar 25 01:52:58.605441 kubelet[2914]: E0325 01:52:58.605015 2914 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2aa4bb9563aa1f7e2de7d62ddcf87bf80743afe1acfbec392613d358c027579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.605441 kubelet[2914]: E0325 01:52:58.605097 2914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b2aa4bb9563aa1f7e2de7d62ddcf87bf80743afe1acfbec392613d358c027579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-qfxw4" Mar 25 01:52:58.605441 kubelet[2914]: E0325 01:52:58.605117 2914 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"b2aa4bb9563aa1f7e2de7d62ddcf87bf80743afe1acfbec392613d358c027579\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-qfxw4" Mar 25 01:52:58.605441 kubelet[2914]: E0325 01:52:58.605108 2914 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df5b38d10c8d3d3721c68ae5db3889bb068741d854205c1d9631ec2e08e7ddbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.605908 containerd[1507]: time="2025-03-25T01:52:58.598888458Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d6975569f-xkvtt,Uid:b03cc102-7dcd-4592-988f-8209a4d3c8d8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"49aa28761ceeb2310c61068aad20e5b6dda7c91bee21e5b5c129ff2c6dec3a98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.605955 kubelet[2914]: E0325 01:52:58.605168 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-qfxw4_kube-system(5aafe162-9d82-4750-854d-710eb1f52529)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-qfxw4_kube-system(5aafe162-9d82-4750-854d-710eb1f52529)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b2aa4bb9563aa1f7e2de7d62ddcf87bf80743afe1acfbec392613d358c027579\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-qfxw4" podUID="5aafe162-9d82-4750-854d-710eb1f52529" Mar 25 01:52:58.605955 kubelet[2914]: E0325 01:52:58.605168 2914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df5b38d10c8d3d3721c68ae5db3889bb068741d854205c1d9631ec2e08e7ddbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54675b8544-42csf" Mar 25 01:52:58.605955 kubelet[2914]: E0325 01:52:58.605301 2914 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"df5b38d10c8d3d3721c68ae5db3889bb068741d854205c1d9631ec2e08e7ddbf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-54675b8544-42csf" Mar 25 01:52:58.606143 kubelet[2914]: E0325 01:52:58.606081 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-54675b8544-42csf_calico-system(d5827ecd-bdbb-4e45-81f1-2bbc6444be50)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-54675b8544-42csf_calico-system(d5827ecd-bdbb-4e45-81f1-2bbc6444be50)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"df5b38d10c8d3d3721c68ae5db3889bb068741d854205c1d9631ec2e08e7ddbf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-54675b8544-42csf" 
podUID="d5827ecd-bdbb-4e45-81f1-2bbc6444be50" Mar 25 01:52:58.606411 containerd[1507]: time="2025-03-25T01:52:58.606203663Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d6975569f-qrwcl,Uid:9b2cfa8a-0c0c-432c-a1ec-2a55980a855f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"87a05b0ba9fb06fa9eef9f71d166562c7d2401142945ee864a2926451a8651f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.606708 kubelet[2914]: E0325 01:52:58.606673 2914 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87a05b0ba9fb06fa9eef9f71d166562c7d2401142945ee864a2926451a8651f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.606746 kubelet[2914]: E0325 01:52:58.606707 2914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87a05b0ba9fb06fa9eef9f71d166562c7d2401142945ee864a2926451a8651f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d6975569f-qrwcl" Mar 25 01:52:58.606746 kubelet[2914]: E0325 01:52:58.606722 2914 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"87a05b0ba9fb06fa9eef9f71d166562c7d2401142945ee864a2926451a8651f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d6975569f-qrwcl" Mar 25 01:52:58.606910 kubelet[2914]: E0325 01:52:58.606752 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d6975569f-qrwcl_calico-apiserver(9b2cfa8a-0c0c-432c-a1ec-2a55980a855f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d6975569f-qrwcl_calico-apiserver(9b2cfa8a-0c0c-432c-a1ec-2a55980a855f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"87a05b0ba9fb06fa9eef9f71d166562c7d2401142945ee864a2926451a8651f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d6975569f-qrwcl" podUID="9b2cfa8a-0c0c-432c-a1ec-2a55980a855f" Mar 25 01:52:58.606910 kubelet[2914]: E0325 01:52:58.606687 2914 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49aa28761ceeb2310c61068aad20e5b6dda7c91bee21e5b5c129ff2c6dec3a98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.607029 kubelet[2914]: E0325 01:52:58.606837 2914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49aa28761ceeb2310c61068aad20e5b6dda7c91bee21e5b5c129ff2c6dec3a98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d6975569f-xkvtt" Mar 25 01:52:58.607029 kubelet[2914]: E0325 01:52:58.607021 2914 kuberuntime_manager.go:1166] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49aa28761ceeb2310c61068aad20e5b6dda7c91bee21e5b5c129ff2c6dec3a98\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d6975569f-xkvtt" Mar 25 01:52:58.607190 kubelet[2914]: E0325 01:52:58.607055 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d6975569f-xkvtt_calico-apiserver(b03cc102-7dcd-4592-988f-8209a4d3c8d8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d6975569f-xkvtt_calico-apiserver(b03cc102-7dcd-4592-988f-8209a4d3c8d8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"49aa28761ceeb2310c61068aad20e5b6dda7c91bee21e5b5c129ff2c6dec3a98\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d6975569f-xkvtt" podUID="b03cc102-7dcd-4592-988f-8209a4d3c8d8" Mar 25 01:52:58.607252 containerd[1507]: time="2025-03-25T01:52:58.607200232Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-xjt4l,Uid:4197106e-bb3c-4d91-a3a1-29d9a63dea11,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dbe14f514d4301e910544c883cf17919b9a0e720a7f13a24e4b8efdd0e9b1fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.607571 kubelet[2914]: E0325 01:52:58.607538 2914 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"6dbe14f514d4301e910544c883cf17919b9a0e720a7f13a24e4b8efdd0e9b1fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.607640 kubelet[2914]: E0325 01:52:58.607575 2914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dbe14f514d4301e910544c883cf17919b9a0e720a7f13a24e4b8efdd0e9b1fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-xjt4l" Mar 25 01:52:58.607640 kubelet[2914]: E0325 01:52:58.607596 2914 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dbe14f514d4301e910544c883cf17919b9a0e720a7f13a24e4b8efdd0e9b1fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-xjt4l" Mar 25 01:52:58.607711 kubelet[2914]: E0325 01:52:58.607635 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-xjt4l_kube-system(4197106e-bb3c-4d91-a3a1-29d9a63dea11)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-xjt4l_kube-system(4197106e-bb3c-4d91-a3a1-29d9a63dea11)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6dbe14f514d4301e910544c883cf17919b9a0e720a7f13a24e4b8efdd0e9b1fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-7db6d8ff4d-xjt4l" podUID="4197106e-bb3c-4d91-a3a1-29d9a63dea11" Mar 25 01:52:58.905269 systemd[1]: Created slice kubepods-besteffort-poda86ff892_344f_4f10_be9e_1c061509165d.slice - libcontainer container kubepods-besteffort-poda86ff892_344f_4f10_be9e_1c061509165d.slice. Mar 25 01:52:58.907991 containerd[1507]: time="2025-03-25T01:52:58.907912237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zsbnv,Uid:a86ff892-344f-4f10-be9e-1c061509165d,Namespace:calico-system,Attempt:0,}" Mar 25 01:52:58.963169 containerd[1507]: time="2025-03-25T01:52:58.963111406Z" level=error msg="Failed to destroy network for sandbox \"10044f21eb29dc7c671b5af1c75aeb333c8f79120a8b7520e86d049fb5ed71e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.964414 containerd[1507]: time="2025-03-25T01:52:58.964357845Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zsbnv,Uid:a86ff892-344f-4f10-be9e-1c061509165d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"10044f21eb29dc7c671b5af1c75aeb333c8f79120a8b7520e86d049fb5ed71e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.964698 kubelet[2914]: E0325 01:52:58.964582 2914 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10044f21eb29dc7c671b5af1c75aeb333c8f79120a8b7520e86d049fb5ed71e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:52:58.964698 kubelet[2914]: E0325 
01:52:58.964628 2914 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10044f21eb29dc7c671b5af1c75aeb333c8f79120a8b7520e86d049fb5ed71e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zsbnv" Mar 25 01:52:58.964698 kubelet[2914]: E0325 01:52:58.964650 2914 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10044f21eb29dc7c671b5af1c75aeb333c8f79120a8b7520e86d049fb5ed71e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-zsbnv" Mar 25 01:52:58.964902 kubelet[2914]: E0325 01:52:58.964687 2914 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-zsbnv_calico-system(a86ff892-344f-4f10-be9e-1c061509165d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-zsbnv_calico-system(a86ff892-344f-4f10-be9e-1c061509165d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"10044f21eb29dc7c671b5af1c75aeb333c8f79120a8b7520e86d049fb5ed71e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-zsbnv" podUID="a86ff892-344f-4f10-be9e-1c061509165d" Mar 25 01:53:05.356806 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1319995436.mount: Deactivated successfully. 
Mar 25 01:53:05.469437 containerd[1507]: time="2025-03-25T01:53:05.469260787Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:53:05.480958 containerd[1507]: time="2025-03-25T01:53:05.473559448Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445" Mar 25 01:53:05.491181 containerd[1507]: time="2025-03-25T01:53:05.491106860Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:53:05.492640 containerd[1507]: time="2025-03-25T01:53:05.492185702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:53:05.492640 containerd[1507]: time="2025-03-25T01:53:05.492409482Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 7.4486761s" Mar 25 01:53:05.492640 containerd[1507]: time="2025-03-25T01:53:05.492434359Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\"" Mar 25 01:53:05.547074 containerd[1507]: time="2025-03-25T01:53:05.547018809Z" level=info msg="CreateContainer within sandbox \"54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 25 01:53:05.589199 containerd[1507]: time="2025-03-25T01:53:05.589149526Z" level=info msg="Container 
c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:53:05.663476 containerd[1507]: time="2025-03-25T01:53:05.663384079Z" level=info msg="CreateContainer within sandbox \"54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\"" Mar 25 01:53:05.669179 containerd[1507]: time="2025-03-25T01:53:05.669145151Z" level=info msg="StartContainer for \"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\"" Mar 25 01:53:05.675238 containerd[1507]: time="2025-03-25T01:53:05.675209744Z" level=info msg="connecting to shim c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c" address="unix:///run/containerd/s/c9d5e41c3cfb912a4eb86939bfeac9b1fc6988aff31dca8d423b177c26782245" protocol=ttrpc version=3 Mar 25 01:53:05.782426 systemd[1]: Started cri-containerd-c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c.scope - libcontainer container c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c. Mar 25 01:53:05.829737 containerd[1507]: time="2025-03-25T01:53:05.829696174Z" level=info msg="StartContainer for \"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" returns successfully" Mar 25 01:53:05.902460 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 25 01:53:05.906724 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld. All Rights Reserved.
Mar 25 01:53:06.132785 kubelet[2914]: I0325 01:53:06.132731 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-fbkbd" podStartSLOduration=1.857637044 podStartE2EDuration="22.112208394s" podCreationTimestamp="2025-03-25 01:52:44 +0000 UTC" firstStartedPulling="2025-03-25 01:52:45.254546903 +0000 UTC m=+22.437768742" lastFinishedPulling="2025-03-25 01:53:05.509118252 +0000 UTC m=+42.692340092" observedRunningTime="2025-03-25 01:53:06.108095302 +0000 UTC m=+43.291317152" watchObservedRunningTime="2025-03-25 01:53:06.112208394 +0000 UTC m=+43.295430233" Mar 25 01:53:06.228026 containerd[1507]: time="2025-03-25T01:53:06.227634206Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"80eecfaf5623efab5cfb9d467eee0e372766008feeb7a512b13e1fe5b2c9e6af\" pid:3875 exit_status:1 exited_at:{seconds:1742867586 nanos:221527597}" Mar 25 01:53:07.137968 containerd[1507]: time="2025-03-25T01:53:07.137924400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"f428b6c6d1ebb2904019381382a6175d2852944a7f34a588d8cf6b04a0920668\" pid:3913 exit_status:1 exited_at:{seconds:1742867587 nanos:137362346}" Mar 25 01:53:07.466427 kernel: bpftool[4042]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Mar 25 01:53:07.662437 systemd-networkd[1400]: vxlan.calico: Link UP Mar 25 01:53:07.662446 systemd-networkd[1400]: vxlan.calico: Gained carrier Mar 25 01:53:08.163859 containerd[1507]: time="2025-03-25T01:53:08.163810617Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"e1b5d9ee8376825a476ceffa7b5aecee63ecdb50ab57f7ad9ca9a580721dafb1\" pid:4125 exit_status:1 exited_at:{seconds:1742867588 nanos:163394547}" Mar 25 01:53:09.023722 systemd-networkd[1400]: vxlan.calico: 
Gained IPv6LL Mar 25 01:53:10.901849 containerd[1507]: time="2025-03-25T01:53:10.901792412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d6975569f-qrwcl,Uid:9b2cfa8a-0c0c-432c-a1ec-2a55980a855f,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:53:10.902227 containerd[1507]: time="2025-03-25T01:53:10.902185759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qfxw4,Uid:5aafe162-9d82-4750-854d-710eb1f52529,Namespace:kube-system,Attempt:0,}" Mar 25 01:53:11.218393 systemd-networkd[1400]: cali909bc86c256: Link UP Mar 25 01:53:11.218869 systemd-networkd[1400]: cali909bc86c256: Gained carrier Mar 25 01:53:11.241548 containerd[1507]: 2025-03-25 01:53:10.975 [INFO][4147] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0 calico-apiserver-7d6975569f- calico-apiserver 9b2cfa8a-0c0c-432c-a1ec-2a55980a855f 721 0 2025-03-25 01:52:44 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d6975569f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-a-abb47662e0 calico-apiserver-7d6975569f-qrwcl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali909bc86c256 [] []}} ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-" Mar 25 01:53:11.241548 containerd[1507]: 2025-03-25 01:53:10.976 [INFO][4147] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" 
WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" Mar 25 01:53:11.241548 containerd[1507]: 2025-03-25 01:53:11.169 [INFO][4168] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" HandleID="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.180 [INFO][4168] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" HandleID="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000387240), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-a-abb47662e0", "pod":"calico-apiserver-7d6975569f-qrwcl", "timestamp":"2025-03-25 01:53:11.169597645 +0000 UTC"}, Hostname:"ci-4284-0-0-a-abb47662e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.180 [INFO][4168] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.181 [INFO][4168] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.181 [INFO][4168] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-a-abb47662e0' Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.183 [INFO][4168] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.190 [INFO][4168] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.194 [INFO][4168] ipam/ipam.go 489: Trying affinity for 192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.195 [INFO][4168] ipam/ipam.go 155: Attempting to load block cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.241978 containerd[1507]: 2025-03-25 01:53:11.197 [INFO][4168] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.242727 containerd[1507]: 2025-03-25 01:53:11.197 [INFO][4168] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.69.64/26 handle="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.242727 containerd[1507]: 2025-03-25 01:53:11.199 [INFO][4168] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55 Mar 25 01:53:11.242727 containerd[1507]: 2025-03-25 01:53:11.204 [INFO][4168] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.69.64/26 handle="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.242727 containerd[1507]: 2025-03-25 01:53:11.208 [INFO][4168] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.69.65/26] block=192.168.69.64/26 handle="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.242727 containerd[1507]: 2025-03-25 01:53:11.208 [INFO][4168] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.69.65/26] handle="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.242727 containerd[1507]: 2025-03-25 01:53:11.208 [INFO][4168] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:53:11.242727 containerd[1507]: 2025-03-25 01:53:11.208 [INFO][4168] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.69.65/26] IPv6=[] ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" HandleID="k8s-pod-network.30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" Mar 25 01:53:11.242846 containerd[1507]: 2025-03-25 01:53:11.210 [INFO][4147] cni-plugin/k8s.go 386: Populated endpoint ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0", GenerateName:"calico-apiserver-7d6975569f-", Namespace:"calico-apiserver", SelfLink:"", UID:"9b2cfa8a-0c0c-432c-a1ec-2a55980a855f", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d6975569f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"", Pod:"calico-apiserver-7d6975569f-qrwcl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali909bc86c256", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:11.242899 containerd[1507]: 2025-03-25 01:53:11.210 [INFO][4147] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.69.65/32] ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" Mar 25 01:53:11.242899 containerd[1507]: 2025-03-25 01:53:11.211 [INFO][4147] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali909bc86c256 ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" Mar 25 01:53:11.242899 containerd[1507]: 2025-03-25 01:53:11.219 [INFO][4147] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" 
WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" Mar 25 01:53:11.242951 containerd[1507]: 2025-03-25 01:53:11.220 [INFO][4147] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0", GenerateName:"calico-apiserver-7d6975569f-", Namespace:"calico-apiserver", SelfLink:"", UID:"9b2cfa8a-0c0c-432c-a1ec-2a55980a855f", ResourceVersion:"721", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d6975569f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55", Pod:"calico-apiserver-7d6975569f-qrwcl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali909bc86c256", MAC:"3a:04:59:c2:fd:72", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:11.242997 containerd[1507]: 2025-03-25 01:53:11.236 [INFO][4147] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-qrwcl" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--qrwcl-eth0" Mar 25 01:53:11.275264 systemd-networkd[1400]: cali3a40cdb854f: Link UP Mar 25 01:53:11.275941 systemd-networkd[1400]: cali3a40cdb854f: Gained carrier Mar 25 01:53:11.297633 containerd[1507]: 2025-03-25 01:53:10.975 [INFO][4150] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0 coredns-7db6d8ff4d- kube-system 5aafe162-9d82-4750-854d-710eb1f52529 720 0 2025-03-25 01:52:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-a-abb47662e0 coredns-7db6d8ff4d-qfxw4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3a40cdb854f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-" Mar 25 01:53:11.297633 containerd[1507]: 2025-03-25 01:53:10.976 [INFO][4150] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" Mar 25 01:53:11.297633 containerd[1507]: 2025-03-25 01:53:11.169 [INFO][4167] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" HandleID="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Workload="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.180 [INFO][4167] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" HandleID="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Workload="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001031c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-a-abb47662e0", "pod":"coredns-7db6d8ff4d-qfxw4", "timestamp":"2025-03-25 01:53:11.16961149 +0000 UTC"}, Hostname:"ci-4284-0-0-a-abb47662e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.181 [INFO][4167] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.208 [INFO][4167] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.210 [INFO][4167] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-a-abb47662e0' Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.214 [INFO][4167] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.223 [INFO][4167] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.233 [INFO][4167] ipam/ipam.go 489: Trying affinity for 192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.237 [INFO][4167] ipam/ipam.go 155: Attempting to load block cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298357 containerd[1507]: 2025-03-25 01:53:11.241 [INFO][4167] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298833 containerd[1507]: 2025-03-25 01:53:11.241 [INFO][4167] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.69.64/26 handle="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298833 containerd[1507]: 2025-03-25 01:53:11.244 [INFO][4167] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d Mar 25 01:53:11.298833 containerd[1507]: 2025-03-25 01:53:11.253 [INFO][4167] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.69.64/26 handle="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298833 containerd[1507]: 2025-03-25 01:53:11.262 [INFO][4167] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.69.66/26] block=192.168.69.64/26 handle="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298833 containerd[1507]: 2025-03-25 01:53:11.262 [INFO][4167] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.69.66/26] handle="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:11.298833 containerd[1507]: 2025-03-25 01:53:11.262 [INFO][4167] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:53:11.298833 containerd[1507]: 2025-03-25 01:53:11.262 [INFO][4167] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.69.66/26] IPv6=[] ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" HandleID="k8s-pod-network.99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Workload="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" Mar 25 01:53:11.298950 containerd[1507]: 2025-03-25 01:53:11.271 [INFO][4150] cni-plugin/k8s.go 386: Populated endpoint ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"5aafe162-9d82-4750-854d-710eb1f52529", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"", Pod:"coredns-7db6d8ff4d-qfxw4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3a40cdb854f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:11.298950 containerd[1507]: 2025-03-25 01:53:11.272 [INFO][4150] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.69.66/32] ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" Mar 25 01:53:11.298950 containerd[1507]: 2025-03-25 01:53:11.272 [INFO][4150] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3a40cdb854f ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" Mar 25 01:53:11.298950 containerd[1507]: 2025-03-25 01:53:11.276 [INFO][4150] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" Mar 25 01:53:11.298950 containerd[1507]: 2025-03-25 01:53:11.277 [INFO][4150] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"5aafe162-9d82-4750-854d-710eb1f52529", ResourceVersion:"720", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d", Pod:"coredns-7db6d8ff4d-qfxw4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3a40cdb854f", MAC:"72:a1:f1:50:4f:94", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:11.298950 containerd[1507]: 2025-03-25 01:53:11.291 [INFO][4150] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" Namespace="kube-system" Pod="coredns-7db6d8ff4d-qfxw4" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--qfxw4-eth0" Mar 25 01:53:11.300863 containerd[1507]: time="2025-03-25T01:53:11.300586573Z" level=info msg="connecting to shim 30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55" address="unix:///run/containerd/s/b3f7f3be698bdc867783207d509ad9ae1459c42ea409476c2e2956f26d576df1" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:53:11.330475 systemd[1]: Started cri-containerd-30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55.scope - libcontainer container 30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55. Mar 25 01:53:11.341745 containerd[1507]: time="2025-03-25T01:53:11.341422398Z" level=info msg="connecting to shim 99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d" address="unix:///run/containerd/s/cba5688ecf1c7415b1608f84f6265a9d7940eaaac5783ee35917b92e4e021114" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:53:11.368018 systemd[1]: Started cri-containerd-99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d.scope - libcontainer container 99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d. 
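The IPAM entries above repeat one allocation sequence per pod: acquire the host-wide IPAM lock, confirm the host's affinity for block 192.168.69.64/26, claim the next free address (.65 for the apiserver pod, .66 for coredns), write the block, and release the lock, with each endpoint then recorded as a /32. The block/address arithmetic can be sanity-checked with Python's standard `ipaddress` module, using only values taken directly from the log:

```python
import ipaddress

# Block and claimed addresses as reported by the Calico IPAM entries above.
block = ipaddress.ip_network("192.168.69.64/26")
claimed = ["192.168.69.65", "192.168.69.66"]

# Each claimed address must fall inside the host-affine /26 block.
for addr in claimed:
    assert ipaddress.ip_address(addr) in block

# A /26 holds 64 addresses; endpoints are written to the datastore as /32s.
print(block.num_addresses)                                 # 64
print(ipaddress.ip_network("192.168.69.65/32").prefixlen)  # 32
```

This mirrors why the log shows `Auto-assigned 1 out of 1 IPv4s` per request: each pod draws a single address from the same host-affine block, serialized by the host-wide lock.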
Mar 25 01:53:11.417176 containerd[1507]: time="2025-03-25T01:53:11.417135139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d6975569f-qrwcl,Uid:9b2cfa8a-0c0c-432c-a1ec-2a55980a855f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55\"" Mar 25 01:53:11.417422 containerd[1507]: time="2025-03-25T01:53:11.417346524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-qfxw4,Uid:5aafe162-9d82-4750-854d-710eb1f52529,Namespace:kube-system,Attempt:0,} returns sandbox id \"99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d\"" Mar 25 01:53:11.420592 containerd[1507]: time="2025-03-25T01:53:11.420554125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:53:11.423343 containerd[1507]: time="2025-03-25T01:53:11.423302918Z" level=info msg="CreateContainer within sandbox \"99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:53:11.437023 containerd[1507]: time="2025-03-25T01:53:11.436980923Z" level=info msg="Container fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:53:11.441247 containerd[1507]: time="2025-03-25T01:53:11.441213707Z" level=info msg="CreateContainer within sandbox \"99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de\"" Mar 25 01:53:11.441932 containerd[1507]: time="2025-03-25T01:53:11.441646086Z" level=info msg="StartContainer for \"fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de\"" Mar 25 01:53:11.450873 containerd[1507]: time="2025-03-25T01:53:11.450822976Z" level=info msg="connecting to shim fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de" 
address="unix:///run/containerd/s/cba5688ecf1c7415b1608f84f6265a9d7940eaaac5783ee35917b92e4e021114" protocol=ttrpc version=3 Mar 25 01:53:11.466433 systemd[1]: Started cri-containerd-fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de.scope - libcontainer container fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de. Mar 25 01:53:11.494544 containerd[1507]: time="2025-03-25T01:53:11.494468245Z" level=info msg="StartContainer for \"fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de\" returns successfully" Mar 25 01:53:11.899471 containerd[1507]: time="2025-03-25T01:53:11.899424793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54675b8544-42csf,Uid:d5827ecd-bdbb-4e45-81f1-2bbc6444be50,Namespace:calico-system,Attempt:0,}" Mar 25 01:53:12.005034 systemd-networkd[1400]: cali689dbdeca71: Link UP Mar 25 01:53:12.005668 systemd-networkd[1400]: cali689dbdeca71: Gained carrier Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.938 [INFO][4330] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0 calico-kube-controllers-54675b8544- calico-system d5827ecd-bdbb-4e45-81f1-2bbc6444be50 722 0 2025-03-25 01:52:45 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:54675b8544 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-a-abb47662e0 calico-kube-controllers-54675b8544-42csf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali689dbdeca71 [] []}} ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" 
WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.939 [INFO][4330] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.964 [INFO][4343] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" HandleID="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.975 [INFO][4343] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" HandleID="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030d560), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-a-abb47662e0", "pod":"calico-kube-controllers-54675b8544-42csf", "timestamp":"2025-03-25 01:53:11.964634349 +0000 UTC"}, Hostname:"ci-4284-0-0-a-abb47662e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.975 [INFO][4343] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.975 [INFO][4343] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.975 [INFO][4343] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-a-abb47662e0' Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.977 [INFO][4343] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.980 [INFO][4343] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.984 [INFO][4343] ipam/ipam.go 489: Trying affinity for 192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.985 [INFO][4343] ipam/ipam.go 155: Attempting to load block cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.987 [INFO][4343] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.987 [INFO][4343] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.69.64/26 handle="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.989 [INFO][4343] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2 Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.992 [INFO][4343] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.69.64/26 handle="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" 
host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.999 [INFO][4343] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.69.67/26] block=192.168.69.64/26 handle="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.999 [INFO][4343] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.69.67/26] handle="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.999 [INFO][4343] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:53:12.020894 containerd[1507]: 2025-03-25 01:53:11.999 [INFO][4343] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.69.67/26] IPv6=[] ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" HandleID="k8s-pod-network.c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" Mar 25 01:53:12.021588 containerd[1507]: 2025-03-25 01:53:12.002 [INFO][4330] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0", GenerateName:"calico-kube-controllers-54675b8544-", Namespace:"calico-system", SelfLink:"", UID:"d5827ecd-bdbb-4e45-81f1-2bbc6444be50", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 45, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54675b8544", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"", Pod:"calico-kube-controllers-54675b8544-42csf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.69.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali689dbdeca71", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:12.021588 containerd[1507]: 2025-03-25 01:53:12.002 [INFO][4330] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.69.67/32] ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" Mar 25 01:53:12.021588 containerd[1507]: 2025-03-25 01:53:12.002 [INFO][4330] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali689dbdeca71 ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" Mar 25 01:53:12.021588 containerd[1507]: 2025-03-25 01:53:12.006 [INFO][4330] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" Mar 25 01:53:12.021588 containerd[1507]: 2025-03-25 01:53:12.006 [INFO][4330] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0", GenerateName:"calico-kube-controllers-54675b8544-", Namespace:"calico-system", SelfLink:"", UID:"d5827ecd-bdbb-4e45-81f1-2bbc6444be50", ResourceVersion:"722", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 45, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"54675b8544", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2", Pod:"calico-kube-controllers-54675b8544-42csf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.69.67/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali689dbdeca71", MAC:"42:e8:3a:a3:6e:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:12.021588 containerd[1507]: 2025-03-25 01:53:12.018 [INFO][4330] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" Namespace="calico-system" Pod="calico-kube-controllers-54675b8544-42csf" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--kube--controllers--54675b8544--42csf-eth0" Mar 25 01:53:12.051662 containerd[1507]: time="2025-03-25T01:53:12.051303088Z" level=info msg="connecting to shim c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2" address="unix:///run/containerd/s/1a6412383fbf915a7560b0f703ceeaf66470984f96e1d35e901b41ec51420c34" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:53:12.074461 systemd[1]: Started cri-containerd-c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2.scope - libcontainer container c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2. 
Mar 25 01:53:12.130070 containerd[1507]: time="2025-03-25T01:53:12.129904992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-54675b8544-42csf,Uid:d5827ecd-bdbb-4e45-81f1-2bbc6444be50,Namespace:calico-system,Attempt:0,} returns sandbox id \"c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2\"" Mar 25 01:53:12.131888 kubelet[2914]: I0325 01:53:12.131306 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-qfxw4" podStartSLOduration=35.131289617 podStartE2EDuration="35.131289617s" podCreationTimestamp="2025-03-25 01:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:53:12.116194227 +0000 UTC m=+49.299416077" watchObservedRunningTime="2025-03-25 01:53:12.131289617 +0000 UTC m=+49.314511457" Mar 25 01:53:12.901232 containerd[1507]: time="2025-03-25T01:53:12.899688660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zsbnv,Uid:a86ff892-344f-4f10-be9e-1c061509165d,Namespace:calico-system,Attempt:0,}" Mar 25 01:53:12.901444 containerd[1507]: time="2025-03-25T01:53:12.901414053Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-xjt4l,Uid:4197106e-bb3c-4d91-a3a1-29d9a63dea11,Namespace:kube-system,Attempt:0,}" Mar 25 01:53:12.991661 systemd-networkd[1400]: cali909bc86c256: Gained IPv6LL Mar 25 01:53:13.021986 systemd-networkd[1400]: cali5f30a1e603e: Link UP Mar 25 01:53:13.022267 systemd-networkd[1400]: cali5f30a1e603e: Gained carrier Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.947 [INFO][4421] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0 coredns-7db6d8ff4d- kube-system 4197106e-bb3c-4d91-a3a1-29d9a63dea11 717 0 2025-03-25 01:52:37 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-a-abb47662e0 coredns-7db6d8ff4d-xjt4l eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5f30a1e603e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.947 [INFO][4421] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.978 [INFO][4448] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" HandleID="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Workload="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.990 [INFO][4448] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" HandleID="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Workload="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039aaa0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-a-abb47662e0", "pod":"coredns-7db6d8ff4d-xjt4l", "timestamp":"2025-03-25 01:53:12.978076253 +0000 UTC"}, Hostname:"ci-4284-0-0-a-abb47662e0", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.990 [INFO][4448] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.990 [INFO][4448] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.990 [INFO][4448] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-a-abb47662e0' Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.993 [INFO][4448] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:12.998 [INFO][4448] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.002 [INFO][4448] ipam/ipam.go 489: Trying affinity for 192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.003 [INFO][4448] ipam/ipam.go 155: Attempting to load block cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.005 [INFO][4448] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.005 [INFO][4448] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.69.64/26 handle="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.007 [INFO][4448] ipam/ipam.go 1685: Creating new 
handle: k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.012 [INFO][4448] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.69.64/26 handle="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.016 [INFO][4448] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.69.68/26] block=192.168.69.64/26 handle="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.016 [INFO][4448] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.69.68/26] handle="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.016 [INFO][4448] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:53:13.038740 containerd[1507]: 2025-03-25 01:53:13.016 [INFO][4448] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.69.68/26] IPv6=[] ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" HandleID="k8s-pod-network.baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Workload="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" Mar 25 01:53:13.041472 containerd[1507]: 2025-03-25 01:53:13.019 [INFO][4421] cni-plugin/k8s.go 386: Populated endpoint ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"4197106e-bb3c-4d91-a3a1-29d9a63dea11", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"", Pod:"coredns-7db6d8ff4d-xjt4l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali5f30a1e603e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:13.041472 containerd[1507]: 2025-03-25 01:53:13.019 [INFO][4421] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.69.68/32] ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" Mar 25 01:53:13.041472 containerd[1507]: 2025-03-25 01:53:13.019 [INFO][4421] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5f30a1e603e ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" Mar 25 01:53:13.041472 containerd[1507]: 2025-03-25 01:53:13.022 [INFO][4421] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" Mar 25 01:53:13.041472 containerd[1507]: 2025-03-25 01:53:13.022 [INFO][4421] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" 
WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"4197106e-bb3c-4d91-a3a1-29d9a63dea11", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a", Pod:"coredns-7db6d8ff4d-xjt4l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.69.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5f30a1e603e", MAC:"82:1c:d6:a8:57:e2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:13.041472 containerd[1507]: 2025-03-25 01:53:13.036 
[INFO][4421] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" Namespace="kube-system" Pod="coredns-7db6d8ff4d-xjt4l" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-coredns--7db6d8ff4d--xjt4l-eth0" Mar 25 01:53:13.055505 systemd-networkd[1400]: cali3a40cdb854f: Gained IPv6LL Mar 25 01:53:13.089165 systemd-networkd[1400]: cali3032583ccb9: Link UP Mar 25 01:53:13.089388 systemd-networkd[1400]: cali3032583ccb9: Gained carrier Mar 25 01:53:13.093872 containerd[1507]: time="2025-03-25T01:53:13.093449549Z" level=info msg="connecting to shim baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a" address="unix:///run/containerd/s/a9b785bdcf9e14485688d1bd81c3af3b93804aa2f46eceea00caada7bacad674" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:12.950 [INFO][4420] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0 csi-node-driver- calico-system a86ff892-344f-4f10-be9e-1c061509165d 629 0 2025-03-25 01:52:44 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:69ddf5d45d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-a-abb47662e0 csi-node-driver-zsbnv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali3032583ccb9 [] []}} ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:12.950 [INFO][4420] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:12.979 [INFO][4446] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" HandleID="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Workload="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:12.994 [INFO][4446] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" HandleID="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Workload="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b9d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-a-abb47662e0", "pod":"csi-node-driver-zsbnv", "timestamp":"2025-03-25 01:53:12.979920069 +0000 UTC"}, Hostname:"ci-4284-0-0-a-abb47662e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:12.994 [INFO][4446] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.016 [INFO][4446] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.016 [INFO][4446] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-a-abb47662e0' Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.018 [INFO][4446] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.026 [INFO][4446] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.035 [INFO][4446] ipam/ipam.go 489: Trying affinity for 192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.039 [INFO][4446] ipam/ipam.go 155: Attempting to load block cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.043 [INFO][4446] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.043 [INFO][4446] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.69.64/26 handle="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.046 [INFO][4446] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.052 [INFO][4446] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.69.64/26 handle="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.071 [INFO][4446] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.69.69/26] block=192.168.69.64/26 handle="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.071 [INFO][4446] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.69.69/26] handle="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" host="ci-4284-0-0-a-abb47662e0" Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.071 [INFO][4446] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:53:13.115597 containerd[1507]: 2025-03-25 01:53:13.071 [INFO][4446] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.69.69/26] IPv6=[] ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" HandleID="k8s-pod-network.9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Workload="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" Mar 25 01:53:13.116543 containerd[1507]: 2025-03-25 01:53:13.081 [INFO][4420] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a86ff892-344f-4f10-be9e-1c061509165d", ResourceVersion:"629", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"", Pod:"csi-node-driver-zsbnv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.69.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3032583ccb9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:53:13.116543 containerd[1507]: 2025-03-25 01:53:13.081 [INFO][4420] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.69.69/32] ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" Mar 25 01:53:13.116543 containerd[1507]: 2025-03-25 01:53:13.081 [INFO][4420] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3032583ccb9 ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" Mar 25 01:53:13.116543 containerd[1507]: 2025-03-25 01:53:13.091 [INFO][4420] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" Mar 25 01:53:13.116543 containerd[1507]: 2025-03-25 01:53:13.091 
[INFO][4420] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"a86ff892-344f-4f10-be9e-1c061509165d", ResourceVersion:"629", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"69ddf5d45d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae", Pod:"csi-node-driver-zsbnv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.69.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3032583ccb9", MAC:"a6:87:ef:9e:e4:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 01:53:13.116543 containerd[1507]: 2025-03-25 01:53:13.106 [INFO][4420] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" Namespace="calico-system" Pod="csi-node-driver-zsbnv" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-csi--node--driver--zsbnv-eth0"
Mar 25 01:53:13.141778 systemd[1]: Started cri-containerd-baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a.scope - libcontainer container baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a.
Mar 25 01:53:13.152714 containerd[1507]: time="2025-03-25T01:53:13.152608489Z" level=info msg="connecting to shim 9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae" address="unix:///run/containerd/s/d36ead6de0b0ac6f53f87fcf453bf3b3910af7514106dee1f7037c52a4925904" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:53:13.186456 systemd[1]: Started cri-containerd-9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae.scope - libcontainer container 9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae.
Mar 25 01:53:13.201347 containerd[1507]: time="2025-03-25T01:53:13.201220135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-xjt4l,Uid:4197106e-bb3c-4d91-a3a1-29d9a63dea11,Namespace:kube-system,Attempt:0,} returns sandbox id \"baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a\""
Mar 25 01:53:13.205605 containerd[1507]: time="2025-03-25T01:53:13.205571459Z" level=info msg="CreateContainer within sandbox \"baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
Mar 25 01:53:13.216986 containerd[1507]: time="2025-03-25T01:53:13.216956433Z" level=info msg="Container 0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:53:13.222588 containerd[1507]: time="2025-03-25T01:53:13.222552070Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-zsbnv,Uid:a86ff892-344f-4f10-be9e-1c061509165d,Namespace:calico-system,Attempt:0,} returns sandbox id \"9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae\""
Mar 25 01:53:13.225342 containerd[1507]: time="2025-03-25T01:53:13.225291423Z" level=info msg="CreateContainer within sandbox \"baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5\""
Mar 25 01:53:13.225748 containerd[1507]: time="2025-03-25T01:53:13.225720426Z" level=info msg="StartContainer for \"0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5\""
Mar 25 01:53:13.226498 containerd[1507]: time="2025-03-25T01:53:13.226268805Z" level=info msg="connecting to shim 0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5" address="unix:///run/containerd/s/a9b785bdcf9e14485688d1bd81c3af3b93804aa2f46eceea00caada7bacad674" protocol=ttrpc version=3
Mar 25 01:53:13.246244 systemd[1]: Started cri-containerd-0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5.scope - libcontainer container 0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5.
Mar 25 01:53:13.278731 containerd[1507]: time="2025-03-25T01:53:13.278696812Z" level=info msg="StartContainer for \"0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5\" returns successfully"
Mar 25 01:53:13.759948 systemd-networkd[1400]: cali689dbdeca71: Gained IPv6LL
Mar 25 01:53:13.900417 containerd[1507]: time="2025-03-25T01:53:13.900302074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d6975569f-xkvtt,Uid:b03cc102-7dcd-4592-988f-8209a4d3c8d8,Namespace:calico-apiserver,Attempt:0,}"
Mar 25 01:53:14.042261 systemd-networkd[1400]: calie9ad50a1034: Link UP
Mar 25 01:53:14.043086 systemd-networkd[1400]: calie9ad50a1034: Gained carrier
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:13.966 [INFO][4608] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0 calico-apiserver-7d6975569f- calico-apiserver b03cc102-7dcd-4592-988f-8209a4d3c8d8 723 0 2025-03-25 01:52:44 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d6975569f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-a-abb47662e0 calico-apiserver-7d6975569f-xkvtt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie9ad50a1034 [] []}} ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:13.966 [INFO][4608] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.002 [INFO][4620] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" HandleID="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.010 [INFO][4620] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" HandleID="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000265910), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-a-abb47662e0", "pod":"calico-apiserver-7d6975569f-xkvtt", "timestamp":"2025-03-25 01:53:14.002449573 +0000 UTC"}, Hostname:"ci-4284-0-0-a-abb47662e0", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.010 [INFO][4620] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.010 [INFO][4620] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.010 [INFO][4620] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-a-abb47662e0'
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.012 [INFO][4620] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.016 [INFO][4620] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.020 [INFO][4620] ipam/ipam.go 489: Trying affinity for 192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.022 [INFO][4620] ipam/ipam.go 155: Attempting to load block cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.024 [INFO][4620] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.69.64/26 host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.024 [INFO][4620] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.69.64/26 handle="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.025 [INFO][4620] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.029 [INFO][4620] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.69.64/26 handle="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.037 [INFO][4620] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.69.70/26] block=192.168.69.64/26 handle="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.037 [INFO][4620] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.69.70/26] handle="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" host="ci-4284-0-0-a-abb47662e0"
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.037 [INFO][4620] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 01:53:14.059436 containerd[1507]: 2025-03-25 01:53:14.037 [INFO][4620] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.69.70/26] IPv6=[] ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" HandleID="k8s-pod-network.925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Workload="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0"
Mar 25 01:53:14.061197 containerd[1507]: 2025-03-25 01:53:14.039 [INFO][4608] cni-plugin/k8s.go 386: Populated endpoint ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0", GenerateName:"calico-apiserver-7d6975569f-", Namespace:"calico-apiserver", SelfLink:"", UID:"b03cc102-7dcd-4592-988f-8209a4d3c8d8", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d6975569f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"", Pod:"calico-apiserver-7d6975569f-xkvtt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie9ad50a1034", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 01:53:14.061197 containerd[1507]: 2025-03-25 01:53:14.039 [INFO][4608] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.69.70/32] ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0"
Mar 25 01:53:14.061197 containerd[1507]: 2025-03-25 01:53:14.039 [INFO][4608] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie9ad50a1034 ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0"
Mar 25 01:53:14.061197 containerd[1507]: 2025-03-25 01:53:14.041 [INFO][4608] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0"
Mar 25 01:53:14.061197 containerd[1507]: 2025-03-25 01:53:14.045 [INFO][4608] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0", GenerateName:"calico-apiserver-7d6975569f-", Namespace:"calico-apiserver", SelfLink:"", UID:"b03cc102-7dcd-4592-988f-8209a4d3c8d8", ResourceVersion:"723", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 52, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d6975569f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-a-abb47662e0", ContainerID:"925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f", Pod:"calico-apiserver-7d6975569f-xkvtt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.69.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie9ad50a1034", MAC:"fa:97:54:39:31:01", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 01:53:14.061197 containerd[1507]: 2025-03-25 01:53:14.055 [INFO][4608] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" Namespace="calico-apiserver" Pod="calico-apiserver-7d6975569f-xkvtt" WorkloadEndpoint="ci--4284--0--0--a--abb47662e0-k8s-calico--apiserver--7d6975569f--xkvtt-eth0"
Mar 25 01:53:14.084650 containerd[1507]: time="2025-03-25T01:53:14.084609626Z" level=info msg="connecting to shim 925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f" address="unix:///run/containerd/s/d17d868b5d960a8d84bf26727ef5ca925e1484818d4cb5ef796011c6cfb41252" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:53:14.107437 systemd[1]: Started cri-containerd-925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f.scope - libcontainer container 925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f.
Mar 25 01:53:14.152488 kubelet[2914]: I0325 01:53:14.151903 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-xjt4l" podStartSLOduration=37.151883035 podStartE2EDuration="37.151883035s" podCreationTimestamp="2025-03-25 01:52:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:53:14.151723496 +0000 UTC m=+51.334945346" watchObservedRunningTime="2025-03-25 01:53:14.151883035 +0000 UTC m=+51.335104885"
Mar 25 01:53:14.171909 containerd[1507]: time="2025-03-25T01:53:14.170978166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d6975569f-xkvtt,Uid:b03cc102-7dcd-4592-988f-8209a4d3c8d8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f\""
Mar 25 01:53:14.271485 systemd-networkd[1400]: cali5f30a1e603e: Gained IPv6LL
Mar 25 01:53:14.911594 systemd-networkd[1400]: cali3032583ccb9: Gained IPv6LL
Mar 25 01:53:15.231473 systemd-networkd[1400]: calie9ad50a1034: Gained IPv6LL
Mar 25 01:53:15.422956 containerd[1507]: time="2025-03-25T01:53:15.422900597Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:15.424361 containerd[1507]: time="2025-03-25T01:53:15.424288819Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204"
Mar 25 01:53:15.426247 containerd[1507]: time="2025-03-25T01:53:15.426195381Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:15.428424 containerd[1507]: time="2025-03-25T01:53:15.428314882Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:15.428880 containerd[1507]: time="2025-03-25T01:53:15.428693371Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 4.008113628s"
Mar 25 01:53:15.428880 containerd[1507]: time="2025-03-25T01:53:15.428720311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 25 01:53:15.432811 containerd[1507]: time="2025-03-25T01:53:15.432649724Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\""
Mar 25 01:53:15.434161 containerd[1507]: time="2025-03-25T01:53:15.434113747Z" level=info msg="CreateContainer within sandbox \"30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 25 01:53:15.443418 containerd[1507]: time="2025-03-25T01:53:15.442478560Z" level=info msg="Container 680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:53:15.452933 containerd[1507]: time="2025-03-25T01:53:15.452900709Z" level=info msg="CreateContainer within sandbox \"30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9\""
Mar 25 01:53:15.454973 containerd[1507]: time="2025-03-25T01:53:15.453434729Z" level=info msg="StartContainer for \"680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9\""
Mar 25 01:53:15.454973 containerd[1507]: time="2025-03-25T01:53:15.454455963Z" level=info msg="connecting to shim 680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9" address="unix:///run/containerd/s/b3f7f3be698bdc867783207d509ad9ae1459c42ea409476c2e2956f26d576df1" protocol=ttrpc version=3
Mar 25 01:53:15.475477 systemd[1]: Started cri-containerd-680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9.scope - libcontainer container 680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9.
Mar 25 01:53:15.523010 containerd[1507]: time="2025-03-25T01:53:15.522882544Z" level=info msg="StartContainer for \"680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9\" returns successfully"
Mar 25 01:53:16.165956 kubelet[2914]: I0325 01:53:16.165799 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d6975569f-qrwcl" podStartSLOduration=28.153502006 podStartE2EDuration="32.165731735s" podCreationTimestamp="2025-03-25 01:52:44 +0000 UTC" firstStartedPulling="2025-03-25 01:53:11.420048368 +0000 UTC m=+48.603270209" lastFinishedPulling="2025-03-25 01:53:15.432278098 +0000 UTC m=+52.615499938" observedRunningTime="2025-03-25 01:53:16.164613601 +0000 UTC m=+53.347835451" watchObservedRunningTime="2025-03-25 01:53:16.165731735 +0000 UTC m=+53.348953585"
Mar 25 01:53:17.152209 kubelet[2914]: I0325 01:53:17.152133 2914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:53:18.479234 containerd[1507]: time="2025-03-25T01:53:18.479193812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:18.479920 containerd[1507]: time="2025-03-25T01:53:18.479800178Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912"
Mar 25 01:53:18.481550 containerd[1507]: time="2025-03-25T01:53:18.481524088Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:18.483565 containerd[1507]: time="2025-03-25T01:53:18.483545265Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:18.484098 containerd[1507]: time="2025-03-25T01:53:18.484070649Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 3.051394295s"
Mar 25 01:53:18.484241 containerd[1507]: time="2025-03-25T01:53:18.484128087Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\""
Mar 25 01:53:18.488186 containerd[1507]: time="2025-03-25T01:53:18.488143739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\""
Mar 25 01:53:18.508142 containerd[1507]: time="2025-03-25T01:53:18.508111847Z" level=info msg="CreateContainer within sandbox \"c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Mar 25 01:53:18.519391 containerd[1507]: time="2025-03-25T01:53:18.516284127Z" level=info msg="Container 789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:53:18.530314 containerd[1507]: time="2025-03-25T01:53:18.530283653Z" level=info msg="CreateContainer within sandbox \"c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\""
Mar 25 01:53:18.531833 containerd[1507]: time="2025-03-25T01:53:18.531791408Z" level=info msg="StartContainer for \"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\""
Mar 25 01:53:18.532638 containerd[1507]: time="2025-03-25T01:53:18.532572340Z" level=info msg="connecting to shim 789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8" address="unix:///run/containerd/s/1a6412383fbf915a7560b0f703ceeaf66470984f96e1d35e901b41ec51420c34" protocol=ttrpc version=3
Mar 25 01:53:18.561444 systemd[1]: Started cri-containerd-789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8.scope - libcontainer container 789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8.
Mar 25 01:53:18.615160 containerd[1507]: time="2025-03-25T01:53:18.615099917Z" level=info msg="StartContainer for \"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" returns successfully"
Mar 25 01:53:19.179229 kubelet[2914]: I0325 01:53:19.179042 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-54675b8544-42csf" podStartSLOduration=27.825786385 podStartE2EDuration="34.178979618s" podCreationTimestamp="2025-03-25 01:52:45 +0000 UTC" firstStartedPulling="2025-03-25 01:53:12.133832462 +0000 UTC m=+49.317054302" lastFinishedPulling="2025-03-25 01:53:18.487025695 +0000 UTC m=+55.670247535" observedRunningTime="2025-03-25 01:53:19.177995775 +0000 UTC m=+56.361217694" watchObservedRunningTime="2025-03-25 01:53:19.178979618 +0000 UTC m=+56.362239829"
Mar 25 01:53:19.227617 containerd[1507]: time="2025-03-25T01:53:19.227576442Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"17828e083120ad6185849c0512780401309a8596c1f20db51009f40a07eb4143\" pid:4792 exited_at:{seconds:1742867599 nanos:227350149}"
Mar 25 01:53:20.545002 containerd[1507]: time="2025-03-25T01:53:20.544948156Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:20.545899 containerd[1507]: time="2025-03-25T01:53:20.545743516Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887"
Mar 25 01:53:20.546655 containerd[1507]: time="2025-03-25T01:53:20.546612454Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:20.548105 containerd[1507]: time="2025-03-25T01:53:20.548068230Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:20.548764 containerd[1507]: time="2025-03-25T01:53:20.548507855Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.060338407s"
Mar 25 01:53:20.548764 containerd[1507]: time="2025-03-25T01:53:20.548532642Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\""
Mar 25 01:53:20.550178 containerd[1507]: time="2025-03-25T01:53:20.549543194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\""
Mar 25 01:53:20.551042 containerd[1507]: time="2025-03-25T01:53:20.550991306Z" level=info msg="CreateContainer within sandbox \"9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Mar 25 01:53:20.565546 containerd[1507]: time="2025-03-25T01:53:20.565517757Z" level=info msg="Container 3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:53:20.569605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount15713296.mount: Deactivated successfully.
Mar 25 01:53:20.580288 containerd[1507]: time="2025-03-25T01:53:20.580250744Z" level=info msg="CreateContainer within sandbox \"9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25\""
Mar 25 01:53:20.582343 containerd[1507]: time="2025-03-25T01:53:20.580747545Z" level=info msg="StartContainer for \"3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25\""
Mar 25 01:53:20.582343 containerd[1507]: time="2025-03-25T01:53:20.581808082Z" level=info msg="connecting to shim 3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25" address="unix:///run/containerd/s/d36ead6de0b0ac6f53f87fcf453bf3b3910af7514106dee1f7037c52a4925904" protocol=ttrpc version=3
Mar 25 01:53:20.600435 systemd[1]: Started cri-containerd-3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25.scope - libcontainer container 3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25.
Mar 25 01:53:20.629712 containerd[1507]: time="2025-03-25T01:53:20.629651301Z" level=info msg="StartContainer for \"3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25\" returns successfully"
Mar 25 01:53:21.051759 containerd[1507]: time="2025-03-25T01:53:21.051680610Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:21.052911 containerd[1507]: time="2025-03-25T01:53:21.052821367Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77"
Mar 25 01:53:21.055658 containerd[1507]: time="2025-03-25T01:53:21.055607906Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 506.035077ms"
Mar 25 01:53:21.055658 containerd[1507]: time="2025-03-25T01:53:21.055655826Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\""
Mar 25 01:53:21.057463 containerd[1507]: time="2025-03-25T01:53:21.057406454Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\""
Mar 25 01:53:21.061415 containerd[1507]: time="2025-03-25T01:53:21.060494208Z" level=info msg="CreateContainer within sandbox \"925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Mar 25 01:53:21.071449 containerd[1507]: time="2025-03-25T01:53:21.071383415Z" level=info msg="Container 38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:53:21.088169 containerd[1507]: time="2025-03-25T01:53:21.088105997Z" level=info msg="CreateContainer within sandbox \"925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846\""
Mar 25 01:53:21.088872 containerd[1507]: time="2025-03-25T01:53:21.088837629Z" level=info msg="StartContainer for \"38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846\""
Mar 25 01:53:21.091171 containerd[1507]: time="2025-03-25T01:53:21.090976374Z" level=info msg="connecting to shim 38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846" address="unix:///run/containerd/s/d17d868b5d960a8d84bf26727ef5ca925e1484818d4cb5ef796011c6cfb41252" protocol=ttrpc version=3
Mar 25 01:53:21.121604 systemd[1]: Started cri-containerd-38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846.scope - libcontainer container 38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846.
Mar 25 01:53:21.182291 containerd[1507]: time="2025-03-25T01:53:21.182237763Z" level=info msg="StartContainer for \"38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846\" returns successfully"
Mar 25 01:53:22.195823 kubelet[2914]: I0325 01:53:22.195186 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d6975569f-xkvtt" podStartSLOduration=31.3132793 podStartE2EDuration="38.195137146s" podCreationTimestamp="2025-03-25 01:52:44 +0000 UTC" firstStartedPulling="2025-03-25 01:53:14.174954566 +0000 UTC m=+51.358176407" lastFinishedPulling="2025-03-25 01:53:21.056812382 +0000 UTC m=+58.240034253" observedRunningTime="2025-03-25 01:53:22.193991121 +0000 UTC m=+59.377212991" watchObservedRunningTime="2025-03-25 01:53:22.195137146 +0000 UTC m=+59.378359017"
Mar 25 01:53:23.183762 kubelet[2914]: I0325 01:53:23.183314 2914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:53:23.316084 containerd[1507]: time="2025-03-25T01:53:23.316026518Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:23.317155 containerd[1507]: time="2025-03-25T01:53:23.317015681Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843"
Mar 25 01:53:23.318070 containerd[1507]: time="2025-03-25T01:53:23.318018109Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:23.319863 containerd[1507]: time="2025-03-25T01:53:23.319822939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:53:23.320489 containerd[1507]: time="2025-03-25T01:53:23.320302527Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 2.262845818s"
Mar 25 01:53:23.320489 containerd[1507]: time="2025-03-25T01:53:23.320380173Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\""
Mar 25 01:53:23.322236 containerd[1507]: time="2025-03-25T01:53:23.322177288Z" level=info msg="CreateContainer within sandbox \"9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Mar 25 01:53:23.332954 containerd[1507]: time="2025-03-25T01:53:23.332897796Z" level=info msg="Container 24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:53:23.350314 containerd[1507]: time="2025-03-25T01:53:23.350271105Z" level=info msg="CreateContainer within sandbox \"9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d\""
Mar 25 01:53:23.352540 containerd[1507]: time="2025-03-25T01:53:23.352379083Z" level=info msg="StartContainer for \"24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d\""
Mar 25 01:53:23.354625 containerd[1507]: time="2025-03-25T01:53:23.354559246Z" level=info msg="connecting to shim 24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d" address="unix:///run/containerd/s/d36ead6de0b0ac6f53f87fcf453bf3b3910af7514106dee1f7037c52a4925904" protocol=ttrpc version=3
Mar 25 01:53:23.388467 systemd[1]: Started cri-containerd-24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d.scope - libcontainer container 24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d.
Mar 25 01:53:23.423351 containerd[1507]: time="2025-03-25T01:53:23.422109691Z" level=info msg="StartContainer for \"24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d\" returns successfully"
Mar 25 01:53:24.168953 kubelet[2914]: I0325 01:53:24.168878 2914 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 25 01:53:24.170280 kubelet[2914]: I0325 01:53:24.170250 2914 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 25 01:53:28.411885 containerd[1507]: time="2025-03-25T01:53:28.411821988Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"72c53e6c12028dabdb6d3eefffdf8c735b07ef40734162dd998ee85b64ac310b\" pid:4927 exited_at:{seconds:1742867608 nanos:411530172}"
Mar 25 01:53:29.033484 kubelet[2914]: I0325 01:53:29.033340 2914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:53:29.060369 kubelet[2914]: I0325 01:53:29.059380 2914 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-zsbnv" podStartSLOduration=34.96257353 podStartE2EDuration="45.059359679s" podCreationTimestamp="2025-03-25 01:52:44 +0000 UTC" firstStartedPulling="2025-03-25 01:53:13.224233581 +0000 UTC m=+50.407455421" lastFinishedPulling="2025-03-25 01:53:23.321019719 +0000 UTC m=+60.504241570" observedRunningTime="2025-03-25 01:53:24.211803135 +0000 UTC m=+61.395024975" watchObservedRunningTime="2025-03-25 01:53:29.059359679 +0000 UTC m=+66.242581529"
Mar 25 01:53:32.524938 containerd[1507]: time="2025-03-25T01:53:32.524254322Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"b119769bfef8782d394772802c5d03848e3a50a93dd85d9e71a8fac6a902d0e7\" pid:4952 exited_at:{seconds:1742867612 nanos:523227500}"
Mar 25 01:53:32.995807 containerd[1507]: time="2025-03-25T01:53:32.995749265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"86668be770c2ab005e7b8f1ec920aaa64c10acdaab6fded16f77ff187b950a7d\" pid:4976 exited_at:{seconds:1742867612 nanos:995537679}"
Mar 25 01:53:56.668735 kubelet[2914]: I0325 01:53:56.667657 2914 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:53:58.399398 containerd[1507]: time="2025-03-25T01:53:58.399244964Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"6aa7eef92895215e18c3d28448d39cbf57573d4b233393ef79b4805895a67588\" pid:5012 exited_at:{seconds:1742867638 nanos:398786667}"
Mar 25 01:54:02.524365 containerd[1507]: time="2025-03-25T01:54:02.524299381Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"1a1f442692362bd0eb0a001ca1812e2377f835ce7d392e64c625b8d9988b472e\" pid:5035 exited_at:{seconds:1742867642 nanos:523810727}"
Mar 25 01:54:19.864181 systemd[1]: Started sshd@8-95.217.13.107:22-92.255.85.188:55176.service - OpenSSH per-connection server daemon (92.255.85.188:55176).
Mar 25 01:54:20.451572 sshd[5050]: Invalid user oracle from 92.255.85.188 port 55176
Mar 25 01:54:20.506302 sshd[5050]: Connection closed by invalid user oracle 92.255.85.188 port 55176 [preauth]
Mar 25 01:54:20.507968 systemd[1]: sshd@8-95.217.13.107:22-92.255.85.188:55176.service: Deactivated successfully.
Mar 25 01:54:28.398667 containerd[1507]: time="2025-03-25T01:54:28.398590077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"3eff550b6e89cadd81337fc9db40a7a44bb60a6239d98d2026b97c7cf49ba076\" pid:5074 exited_at:{seconds:1742867668 nanos:397916064}"
Mar 25 01:54:32.504395 containerd[1507]: time="2025-03-25T01:54:32.504290573Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"03d63aea9664be66738494ed1c793b63953b3535eb1459ab0308443d72a5eeaa\" pid:5095 exited_at:{seconds:1742867672 nanos:503968734}"
Mar 25 01:54:32.983707 containerd[1507]: time="2025-03-25T01:54:32.983658614Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"7a3045dad19d6bcb257afc6051b69b5e9d792d0fb8713d47884540a083f6210e\" pid:5118 exited_at:{seconds:1742867672 nanos:983029533}"
Mar 25 01:54:58.399756 containerd[1507]: time="2025-03-25T01:54:58.399695887Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"8de91aeb8b4a0a140b3e0dac24800a6fd9f70018017f95d1256cc65c76afce2a\" pid:5161 exited_at:{seconds:1742867698 nanos:399178755}"
Mar 25 01:55:02.484012 containerd[1507]: time="2025-03-25T01:55:02.483978847Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"57534ba922f0c6413e4b50411193b85ff937aa251d9e195c1d5fe2e54512451f\" pid:5182 exited_at:{seconds:1742867702 nanos:483642720}"
Mar 25 01:55:28.397418 containerd[1507]: time="2025-03-25T01:55:28.397311625Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"694171fbe3eaffee9b726e23a895776ae26354bdcc44bfc1f60f2ab20dd0b9c5\" pid:5214 exited_at:{seconds:1742867728 nanos:396481816}"
Mar 25 01:55:32.517692 containerd[1507]: time="2025-03-25T01:55:32.517633881Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"33664c4202399e7924289dc7cbcea7e0273cd4737bd6fa9ad2050acb47283bef\" pid:5235 exited_at:{seconds:1742867732 nanos:516808301}"
Mar 25 01:55:32.984635 containerd[1507]: time="2025-03-25T01:55:32.984491077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"ef7c5fba5c21f6118c7b6add42c5f781e7d17c67547deaa5a10a7a475f01986d\" pid:5258 exited_at:{seconds:1742867732 nanos:984253494}"
Mar 25 01:55:58.393602 containerd[1507]: time="2025-03-25T01:55:58.393521993Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"6324ccb0e0561420c2ec870eeb9cda3d5af04ea376ce5e5268a77ba51504484c\" pid:5291 exited_at:{seconds:1742867758 nanos:393073096}"
Mar 25 01:56:02.524833 containerd[1507]: time="2025-03-25T01:56:02.524560044Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"e3a21dcfdd1ae85f05ae713898a4d5756c2b637e89c7be01938881d69a0a54c6\" pid:5312 exited_at:{seconds:1742867762 nanos:524080439}"
Mar 25 01:56:06.620780 update_engine[1484]: I20250325 01:56:06.620623 1484 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Mar 25 01:56:06.620780 update_engine[1484]: I20250325 01:56:06.620695 1484 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Mar 25 01:56:06.622853 update_engine[1484]: I20250325 01:56:06.622823 1484 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Mar 25 01:56:06.623722 update_engine[1484]: I20250325 01:56:06.623696 1484 omaha_request_params.cc:62] Current group set to alpha
Mar 25 01:56:06.623851 update_engine[1484]: I20250325 01:56:06.623808 1484 update_attempter.cc:499] Already updated boot flags. Skipping.
Mar 25 01:56:06.623851 update_engine[1484]: I20250325 01:56:06.623844 1484 update_attempter.cc:643] Scheduling an action processor start.
Mar 25 01:56:06.624304 update_engine[1484]: I20250325 01:56:06.623863 1484 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 25 01:56:06.624304 update_engine[1484]: I20250325 01:56:06.623891 1484 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Mar 25 01:56:06.624304 update_engine[1484]: I20250325 01:56:06.623934 1484 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 25 01:56:06.624304 update_engine[1484]: I20250325 01:56:06.623940 1484 omaha_request_action.cc:272] Request:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]:
Mar 25 01:56:06.624304 update_engine[1484]: I20250325 01:56:06.623945 1484 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:56:06.637070 locksmithd[1520]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Mar 25 01:56:06.638932 update_engine[1484]: I20250325 01:56:06.638899 1484 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:56:06.639229 update_engine[1484]: I20250325 01:56:06.639191 1484 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:56:06.640201 update_engine[1484]: E20250325 01:56:06.640167 1484 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:56:06.640339 update_engine[1484]: I20250325 01:56:06.640265 1484 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Mar 25 01:56:16.492812 update_engine[1484]: I20250325 01:56:16.492691 1484 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:56:16.493597 update_engine[1484]: I20250325 01:56:16.493069 1484 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:56:16.493667 update_engine[1484]: I20250325 01:56:16.493634 1484 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:56:16.494086 update_engine[1484]: E20250325 01:56:16.493997 1484 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:56:16.494289 update_engine[1484]: I20250325 01:56:16.494113 1484 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Mar 25 01:56:26.492568 update_engine[1484]: I20250325 01:56:26.492442 1484 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:56:26.493567 update_engine[1484]: I20250325 01:56:26.492780 1484 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:56:26.493567 update_engine[1484]: I20250325 01:56:26.493506 1484 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:56:26.493850 update_engine[1484]: E20250325 01:56:26.493790 1484 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:56:26.493946 update_engine[1484]: I20250325 01:56:26.493904 1484 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Mar 25 01:56:28.405525 containerd[1507]: time="2025-03-25T01:56:28.405419238Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"bdccfcae1664577e79c058a4854a0c2c9ec00c677271169efc94ccae8c5d9cb6\" pid:5359 exited_at:{seconds:1742867788 nanos:405013060}"
Mar 25 01:56:32.507384 containerd[1507]: time="2025-03-25T01:56:32.507285147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"8867c9e737de21a6ac7e8c70367fa747880d0d29e52aef67b97bb394530c64ea\" pid:5381 exited_at:{seconds:1742867792 nanos:506836920}"
Mar 25 01:56:33.007629 containerd[1507]: time="2025-03-25T01:56:33.007571400Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"04beb33c82c92adf2a5f8c8e597e48c01b3db15a88a6e36dd81de57c18bd98ca\" pid:5406 exited_at:{seconds:1742867793 nanos:7365955}"
Mar 25 01:56:36.491433 update_engine[1484]: I20250325 01:56:36.491350 1484 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:56:36.492383 update_engine[1484]: I20250325 01:56:36.491568 1484 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:56:36.492383 update_engine[1484]: I20250325 01:56:36.491796 1484 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:56:36.492383 update_engine[1484]: E20250325 01:56:36.492296 1484 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:56:36.492489 update_engine[1484]: I20250325 01:56:36.492389 1484 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 25 01:56:36.492489 update_engine[1484]: I20250325 01:56:36.492400 1484 omaha_request_action.cc:617] Omaha request response:
Mar 25 01:56:36.492489 update_engine[1484]: E20250325 01:56:36.492472 1484 omaha_request_action.cc:636] Omaha request network transfer failed.
Mar 25 01:56:36.494817 update_engine[1484]: I20250325 01:56:36.494653 1484 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Mar 25 01:56:36.494817 update_engine[1484]: I20250325 01:56:36.494747 1484 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 01:56:36.494817 update_engine[1484]: I20250325 01:56:36.494758 1484 update_attempter.cc:306] Processing Done.
Mar 25 01:56:36.494817 update_engine[1484]: E20250325 01:56:36.494781 1484 update_attempter.cc:619] Update failed.
Mar 25 01:56:36.494817 update_engine[1484]: I20250325 01:56:36.494791 1484 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Mar 25 01:56:36.494817 update_engine[1484]: I20250325 01:56:36.494797 1484 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Mar 25 01:56:36.494817 update_engine[1484]: I20250325 01:56:36.494803 1484 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Mar 25 01:56:36.494955 update_engine[1484]: I20250325 01:56:36.494894 1484 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 25 01:56:36.494955 update_engine[1484]: I20250325 01:56:36.494917 1484 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 25 01:56:36.494955 update_engine[1484]: I20250325 01:56:36.494925 1484 omaha_request_action.cc:272] Request:
Mar 25 01:56:36.494955 update_engine[1484]:
Mar 25 01:56:36.494955 update_engine[1484]:
Mar 25 01:56:36.494955 update_engine[1484]:
Mar 25 01:56:36.494955 update_engine[1484]:
Mar 25 01:56:36.494955 update_engine[1484]:
Mar 25 01:56:36.494955 update_engine[1484]:
Mar 25 01:56:36.494955 update_engine[1484]: I20250325 01:56:36.494931 1484 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:56:36.495542 update_engine[1484]: I20250325 01:56:36.495139 1484 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:56:36.495542 update_engine[1484]: I20250325 01:56:36.495436 1484 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:56:36.495582 locksmithd[1520]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Mar 25 01:56:36.495886 update_engine[1484]: E20250325 01:56:36.495847 1484 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:56:36.495926 update_engine[1484]: I20250325 01:56:36.495908 1484 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 25 01:56:36.495948 update_engine[1484]: I20250325 01:56:36.495923 1484 omaha_request_action.cc:617] Omaha request response:
Mar 25 01:56:36.495948 update_engine[1484]: I20250325 01:56:36.495930 1484 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 01:56:36.495948 update_engine[1484]: I20250325 01:56:36.495935 1484 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 01:56:36.495948 update_engine[1484]: I20250325 01:56:36.495940 1484 update_attempter.cc:306] Processing Done.
Mar 25 01:56:36.495948 update_engine[1484]: I20250325 01:56:36.495945 1484 update_attempter.cc:310] Error event sent.
Mar 25 01:56:36.496065 update_engine[1484]: I20250325 01:56:36.495954 1484 update_check_scheduler.cc:74] Next update check in 43m37s
Mar 25 01:56:36.496243 locksmithd[1520]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Mar 25 01:56:58.406226 containerd[1507]: time="2025-03-25T01:56:58.406031815Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"bf3aa484b007800b9664a00359e3f11bdc0bcf54a57042e00d8ac87ff0646f46\" pid:5429 exited_at:{seconds:1742867818 nanos:405831050}"
Mar 25 01:57:02.536119 containerd[1507]: time="2025-03-25T01:57:02.536056680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"b8e6d1c23d0b875a112f1e0e33b154a725715c7d9f7da1d9187816aa1de342ec\" pid:5450 exited_at:{seconds:1742867822 nanos:535681729}"
Mar 25 01:57:02.931214 systemd[1]: Started sshd@9-95.217.13.107:22-139.178.68.195:44932.service - OpenSSH per-connection server daemon (139.178.68.195:44932).
Mar 25 01:57:03.957568 sshd[5469]: Accepted publickey for core from 139.178.68.195 port 44932 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:03.960071 sshd-session[5469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:03.966232 systemd-logind[1483]: New session 8 of user core.
Mar 25 01:57:03.971449 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 25 01:57:05.056290 sshd[5471]: Connection closed by 139.178.68.195 port 44932
Mar 25 01:57:05.057195 sshd-session[5469]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:05.064531 systemd[1]: sshd@9-95.217.13.107:22-139.178.68.195:44932.service: Deactivated successfully.
Mar 25 01:57:05.067553 systemd[1]: session-8.scope: Deactivated successfully.
Mar 25 01:57:05.070642 systemd-logind[1483]: Session 8 logged out. Waiting for processes to exit.
Mar 25 01:57:05.072196 systemd-logind[1483]: Removed session 8.
Mar 25 01:57:10.257529 systemd[1]: Started sshd@10-95.217.13.107:22-139.178.68.195:40900.service - OpenSSH per-connection server daemon (139.178.68.195:40900).
Mar 25 01:57:11.326361 sshd[5485]: Accepted publickey for core from 139.178.68.195 port 40900 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:11.327809 sshd-session[5485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:11.332499 systemd-logind[1483]: New session 9 of user core.
Mar 25 01:57:11.342515 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 25 01:57:12.134284 sshd[5487]: Connection closed by 139.178.68.195 port 40900
Mar 25 01:57:12.135458 sshd-session[5485]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:12.141662 systemd[1]: sshd@10-95.217.13.107:22-139.178.68.195:40900.service: Deactivated successfully.
Mar 25 01:57:12.146088 systemd[1]: session-9.scope: Deactivated successfully.
Mar 25 01:57:12.147813 systemd-logind[1483]: Session 9 logged out. Waiting for processes to exit.
Mar 25 01:57:12.149795 systemd-logind[1483]: Removed session 9.
Mar 25 01:57:17.320518 systemd[1]: Started sshd@11-95.217.13.107:22-139.178.68.195:49078.service - OpenSSH per-connection server daemon (139.178.68.195:49078).
Mar 25 01:57:18.411877 sshd[5501]: Accepted publickey for core from 139.178.68.195 port 49078 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:18.413502 sshd-session[5501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:18.419069 systemd-logind[1483]: New session 10 of user core.
Mar 25 01:57:18.423523 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 25 01:57:18.598062 containerd[1507]: time="2025-03-25T01:57:18.595443910Z" level=warning msg="container event discarded" container=6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:18.610274 containerd[1507]: time="2025-03-25T01:57:18.610187072Z" level=warning msg="container event discarded" container=6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:18.610274 containerd[1507]: time="2025-03-25T01:57:18.610236084Z" level=warning msg="container event discarded" container=b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c type=CONTAINER_CREATED_EVENT
Mar 25 01:57:18.610274 containerd[1507]: time="2025-03-25T01:57:18.610253615Z" level=warning msg="container event discarded" container=b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c type=CONTAINER_STARTED_EVENT
Mar 25 01:57:18.610274 containerd[1507]: time="2025-03-25T01:57:18.610268544Z" level=warning msg="container event discarded" container=27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:18.610274 containerd[1507]: time="2025-03-25T01:57:18.610280296Z" level=warning msg="container event discarded" container=27efbd170daddf7a684fc1a58da2d326e761132d4cf2bdf7e556526ab97d8035 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:18.637624 containerd[1507]: time="2025-03-25T01:57:18.637539156Z" level=warning msg="container event discarded" container=fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:18.637624 containerd[1507]: time="2025-03-25T01:57:18.637599900Z" level=warning msg="container event discarded" container=4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:18.637833 containerd[1507]: time="2025-03-25T01:57:18.637635597Z" level=warning msg="container event discarded" container=3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:18.712150 containerd[1507]: time="2025-03-25T01:57:18.711969545Z" level=warning msg="container event discarded" container=fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:18.743615 containerd[1507]: time="2025-03-25T01:57:18.743556461Z" level=warning msg="container event discarded" container=3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:18.743615 containerd[1507]: time="2025-03-25T01:57:18.743602366Z" level=warning msg="container event discarded" container=4bab651f38320a9625b803f73347c106b4ba04903b4df16a7be544ec913c2c24 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:19.229116 sshd[5503]: Connection closed by 139.178.68.195 port 49078
Mar 25 01:57:19.230267 sshd-session[5501]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:19.235501 systemd[1]: sshd@11-95.217.13.107:22-139.178.68.195:49078.service: Deactivated successfully.
Mar 25 01:57:19.239594 systemd[1]: session-10.scope: Deactivated successfully.
Mar 25 01:57:19.241820 systemd-logind[1483]: Session 10 logged out. Waiting for processes to exit.
Mar 25 01:57:19.244145 systemd-logind[1483]: Removed session 10.
Mar 25 01:57:19.381016 systemd[1]: Started sshd@12-95.217.13.107:22-139.178.68.195:49092.service - OpenSSH per-connection server daemon (139.178.68.195:49092).
Mar 25 01:57:20.362287 sshd[5517]: Accepted publickey for core from 139.178.68.195 port 49092 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:20.363721 sshd-session[5517]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:20.369219 systemd-logind[1483]: New session 11 of user core.
Mar 25 01:57:20.373611 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 25 01:57:21.158369 sshd[5519]: Connection closed by 139.178.68.195 port 49092
Mar 25 01:57:21.159777 sshd-session[5517]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:21.165697 systemd[1]: sshd@12-95.217.13.107:22-139.178.68.195:49092.service: Deactivated successfully.
Mar 25 01:57:21.168480 systemd[1]: session-11.scope: Deactivated successfully.
Mar 25 01:57:21.169832 systemd-logind[1483]: Session 11 logged out. Waiting for processes to exit.
Mar 25 01:57:21.171312 systemd-logind[1483]: Removed session 11.
Mar 25 01:57:21.328776 systemd[1]: Started sshd@13-95.217.13.107:22-139.178.68.195:49108.service - OpenSSH per-connection server daemon (139.178.68.195:49108).
Mar 25 01:57:22.331049 sshd[5529]: Accepted publickey for core from 139.178.68.195 port 49108 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:22.333580 sshd-session[5529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:22.341119 systemd-logind[1483]: New session 12 of user core.
Mar 25 01:57:22.346595 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 25 01:57:23.082621 sshd[5531]: Connection closed by 139.178.68.195 port 49108
Mar 25 01:57:23.083257 sshd-session[5529]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:23.086024 systemd[1]: sshd@13-95.217.13.107:22-139.178.68.195:49108.service: Deactivated successfully.
Mar 25 01:57:23.087698 systemd[1]: session-12.scope: Deactivated successfully.
Mar 25 01:57:23.089242 systemd-logind[1483]: Session 12 logged out. Waiting for processes to exit.
Mar 25 01:57:23.090241 systemd-logind[1483]: Removed session 12.
Mar 25 01:57:28.251561 systemd[1]: Started sshd@14-95.217.13.107:22-139.178.68.195:50728.service - OpenSSH per-connection server daemon (139.178.68.195:50728).
Mar 25 01:57:28.380445 containerd[1507]: time="2025-03-25T01:57:28.380401927Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"44bf8d24173e32bd7338419d6abad876ead085ef27b91cae459634e713d0e0e4\" pid:5563 exited_at:{seconds:1742867848 nanos:379877357}"
Mar 25 01:57:29.227510 sshd[5549]: Accepted publickey for core from 139.178.68.195 port 50728 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:29.231169 sshd-session[5549]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:29.238580 systemd-logind[1483]: New session 13 of user core.
Mar 25 01:57:29.244521 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 25 01:57:29.958682 sshd[5572]: Connection closed by 139.178.68.195 port 50728
Mar 25 01:57:29.959269 sshd-session[5549]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:29.963263 systemd[1]: sshd@14-95.217.13.107:22-139.178.68.195:50728.service: Deactivated successfully.
Mar 25 01:57:29.964942 systemd[1]: session-13.scope: Deactivated successfully.
Mar 25 01:57:29.965714 systemd-logind[1483]: Session 13 logged out. Waiting for processes to exit.
Mar 25 01:57:29.967912 systemd-logind[1483]: Removed session 13.
Mar 25 01:57:32.523769 containerd[1507]: time="2025-03-25T01:57:32.523668265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"a73047ceefe849ed82d0df71152a3f35d0b2b5ebd98df7a22765740f07d2e1b9\" pid:5596 exited_at:{seconds:1742867852 nanos:523217953}"
Mar 25 01:57:32.993441 containerd[1507]: time="2025-03-25T01:57:32.993298174Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"16c8139dfb20677a05a00e0e564a0a85f3374441f4f5f666e6dfcf1ca8992bf5\" pid:5619 exited_at:{seconds:1742867852 nanos:992780527}"
Mar 25 01:57:35.127499 systemd[1]: Started sshd@15-95.217.13.107:22-139.178.68.195:50732.service - OpenSSH per-connection server daemon (139.178.68.195:50732).
Mar 25 01:57:36.117837 sshd[5629]: Accepted publickey for core from 139.178.68.195 port 50732 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:36.119168 sshd-session[5629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:36.124943 systemd-logind[1483]: New session 14 of user core.
Mar 25 01:57:36.127491 systemd[1]: Started session-14.scope - Session 14 of User core.
Mar 25 01:57:36.857108 sshd[5631]: Connection closed by 139.178.68.195 port 50732
Mar 25 01:57:36.858072 sshd-session[5629]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:36.863736 systemd[1]: sshd@15-95.217.13.107:22-139.178.68.195:50732.service: Deactivated successfully.
Mar 25 01:57:36.867255 systemd[1]: session-14.scope: Deactivated successfully.
Mar 25 01:57:36.868665 systemd-logind[1483]: Session 14 logged out. Waiting for processes to exit.
Mar 25 01:57:36.870691 systemd-logind[1483]: Removed session 14.
Mar 25 01:57:38.234112 containerd[1507]: time="2025-03-25T01:57:38.233881286Z" level=warning msg="container event discarded" container=b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:38.234112 containerd[1507]: time="2025-03-25T01:57:38.234047196Z" level=warning msg="container event discarded" container=b7a78a41e87e327792e717e673a24a0049c73586bcfa7212d64e5f80653e43f0 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:38.257590 containerd[1507]: time="2025-03-25T01:57:38.257487998Z" level=warning msg="container event discarded" container=c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:38.291185 containerd[1507]: time="2025-03-25T01:57:38.291113809Z" level=warning msg="container event discarded" container=ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:38.291383 containerd[1507]: time="2025-03-25T01:57:38.291186635Z" level=warning msg="container event discarded" container=ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:38.320591 containerd[1507]: time="2025-03-25T01:57:38.320503325Z" level=warning msg="container event discarded" container=c4c2139afd29b90abd473c71017368025e2af2d13044f4e716577be932a4f717 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:41.810967 containerd[1507]: time="2025-03-25T01:57:41.810893523Z" level=warning msg="container event discarded" container=e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed type=CONTAINER_CREATED_EVENT
Mar 25 01:57:41.866715 containerd[1507]: time="2025-03-25T01:57:41.866645933Z" level=warning msg="container event discarded" container=e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed type=CONTAINER_STARTED_EVENT
Mar 25 01:57:42.031148 systemd[1]: Started sshd@16-95.217.13.107:22-139.178.68.195:59396.service - OpenSSH per-connection server daemon (139.178.68.195:59396).
Mar 25 01:57:43.013182 sshd[5645]: Accepted publickey for core from 139.178.68.195 port 59396 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:43.015527 sshd-session[5645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:43.020630 systemd-logind[1483]: New session 15 of user core.
Mar 25 01:57:43.022525 systemd[1]: Started session-15.scope - Session 15 of User core.
Mar 25 01:57:43.765837 sshd[5647]: Connection closed by 139.178.68.195 port 59396
Mar 25 01:57:43.766692 sshd-session[5645]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:43.770546 systemd-logind[1483]: Session 15 logged out. Waiting for processes to exit.
Mar 25 01:57:43.771197 systemd[1]: sshd@16-95.217.13.107:22-139.178.68.195:59396.service: Deactivated successfully.
Mar 25 01:57:43.773588 systemd[1]: session-15.scope: Deactivated successfully.
Mar 25 01:57:43.774832 systemd-logind[1483]: Removed session 15.
Mar 25 01:57:43.939037 systemd[1]: Started sshd@17-95.217.13.107:22-139.178.68.195:59406.service - OpenSSH per-connection server daemon (139.178.68.195:59406).
Mar 25 01:57:44.924038 sshd[5660]: Accepted publickey for core from 139.178.68.195 port 59406 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:44.925884 sshd-session[5660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:44.931239 systemd-logind[1483]: New session 16 of user core.
Mar 25 01:57:44.936727 systemd[1]: Started session-16.scope - Session 16 of User core.
Mar 25 01:57:45.218213 containerd[1507]: time="2025-03-25T01:57:45.217980686Z" level=warning msg="container event discarded" container=d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:45.218213 containerd[1507]: time="2025-03-25T01:57:45.218053422Z" level=warning msg="container event discarded" container=d1849027fad62c5560b4722cfab5adb81844726191c81b2c2f432129e019e5e6 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:45.263345 containerd[1507]: time="2025-03-25T01:57:45.263252075Z" level=warning msg="container event discarded" container=54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd type=CONTAINER_CREATED_EVENT
Mar 25 01:57:45.263345 containerd[1507]: time="2025-03-25T01:57:45.263297970Z" level=warning msg="container event discarded" container=54f6638a6f3a8352b869afcf72ccdaa9c8e7f0dccd7d7e68e252de80cf8692cd type=CONTAINER_STARTED_EVENT
Mar 25 01:57:45.936699 sshd[5662]: Connection closed by 139.178.68.195 port 59406
Mar 25 01:57:45.939209 sshd-session[5660]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:45.945903 systemd[1]: sshd@17-95.217.13.107:22-139.178.68.195:59406.service: Deactivated successfully.
Mar 25 01:57:45.949097 systemd[1]: session-16.scope: Deactivated successfully.
Mar 25 01:57:45.950807 systemd-logind[1483]: Session 16 logged out. Waiting for processes to exit.
Mar 25 01:57:45.952704 systemd-logind[1483]: Removed session 16.
Mar 25 01:57:46.106887 systemd[1]: Started sshd@18-95.217.13.107:22-139.178.68.195:45950.service - OpenSSH per-connection server daemon (139.178.68.195:45950).
Mar 25 01:57:47.110081 sshd[5673]: Accepted publickey for core from 139.178.68.195 port 45950 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:47.112682 sshd-session[5673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:47.119985 systemd-logind[1483]: New session 17 of user core.
Mar 25 01:57:47.125560 systemd[1]: Started session-17.scope - Session 17 of User core.
Mar 25 01:57:48.196687 containerd[1507]: time="2025-03-25T01:57:48.196581480Z" level=warning msg="container event discarded" container=5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:48.267649 containerd[1507]: time="2025-03-25T01:57:48.267554888Z" level=warning msg="container event discarded" container=5963fd0dcc94b4ffc490f88285aaf51e8bf0e6ca9b46b0c265d2a263fec459d7 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:49.547651 sshd[5675]: Connection closed by 139.178.68.195 port 45950
Mar 25 01:57:49.549459 sshd-session[5673]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:49.555359 systemd[1]: sshd@18-95.217.13.107:22-139.178.68.195:45950.service: Deactivated successfully.
Mar 25 01:57:49.557093 systemd[1]: session-17.scope: Deactivated successfully.
Mar 25 01:57:49.557261 systemd[1]: session-17.scope: Consumed 480ms CPU time, 67.7M memory peak.
Mar 25 01:57:49.558579 systemd-logind[1483]: Session 17 logged out. Waiting for processes to exit.
Mar 25 01:57:49.560022 systemd-logind[1483]: Removed session 17.
Mar 25 01:57:49.721756 systemd[1]: Started sshd@19-95.217.13.107:22-139.178.68.195:45962.service - OpenSSH per-connection server daemon (139.178.68.195:45962).
Mar 25 01:57:50.208771 containerd[1507]: time="2025-03-25T01:57:50.208714607Z" level=warning msg="container event discarded" container=f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4 type=CONTAINER_CREATED_EVENT
Mar 25 01:57:50.263502 containerd[1507]: time="2025-03-25T01:57:50.263315568Z" level=warning msg="container event discarded" container=f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4 type=CONTAINER_STARTED_EVENT
Mar 25 01:57:50.378117 containerd[1507]: time="2025-03-25T01:57:50.378018367Z" level=warning msg="container event discarded" container=f1867468c01be0d213e39cf0a0e5d5c46f9bd602e1024e45055e1dad8cd459e4 type=CONTAINER_STOPPED_EVENT
Mar 25 01:57:50.753950 sshd[5694]: Accepted publickey for core from 139.178.68.195 port 45962 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:50.756809 sshd-session[5694]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:50.765110 systemd-logind[1483]: New session 18 of user core.
Mar 25 01:57:50.769621 systemd[1]: Started session-18.scope - Session 18 of User core.
Mar 25 01:57:51.661119 sshd[5696]: Connection closed by 139.178.68.195 port 45962
Mar 25 01:57:51.662046 sshd-session[5694]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:51.668377 systemd[1]: sshd@19-95.217.13.107:22-139.178.68.195:45962.service: Deactivated successfully.
Mar 25 01:57:51.671184 systemd[1]: session-18.scope: Deactivated successfully.
Mar 25 01:57:51.672794 systemd-logind[1483]: Session 18 logged out. Waiting for processes to exit.
Mar 25 01:57:51.674448 systemd-logind[1483]: Removed session 18.
Mar 25 01:57:51.835859 systemd[1]: Started sshd@20-95.217.13.107:22-139.178.68.195:45966.service - OpenSSH per-connection server daemon (139.178.68.195:45966).
Mar 25 01:57:52.838176 sshd[5718]: Accepted publickey for core from 139.178.68.195 port 45966 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:52.840719 sshd-session[5718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:52.848080 systemd-logind[1483]: New session 19 of user core.
Mar 25 01:57:52.856657 systemd[1]: Started session-19.scope - Session 19 of User core.
Mar 25 01:57:53.592401 sshd[5720]: Connection closed by 139.178.68.195 port 45966
Mar 25 01:57:53.593130 sshd-session[5718]: pam_unix(sshd:session): session closed for user core
Mar 25 01:57:53.595665 systemd[1]: sshd@20-95.217.13.107:22-139.178.68.195:45966.service: Deactivated successfully.
Mar 25 01:57:53.597846 systemd[1]: session-19.scope: Deactivated successfully.
Mar 25 01:57:53.600159 systemd-logind[1483]: Session 19 logged out. Waiting for processes to exit.
Mar 25 01:57:53.602424 systemd-logind[1483]: Removed session 19.
Mar 25 01:57:57.456936 containerd[1507]: time="2025-03-25T01:57:57.456845910Z" level=warning msg="container event discarded" container=87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b type=CONTAINER_CREATED_EVENT
Mar 25 01:57:57.533225 containerd[1507]: time="2025-03-25T01:57:57.533157817Z" level=warning msg="container event discarded" container=87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b type=CONTAINER_STARTED_EVENT
Mar 25 01:57:57.982043 containerd[1507]: time="2025-03-25T01:57:57.981935479Z" level=warning msg="container event discarded" container=87956effa6eecffce2081d479a04afcce6701026d4c29707770ced79aa02be2b type=CONTAINER_STOPPED_EVENT
Mar 25 01:57:58.425942 containerd[1507]: time="2025-03-25T01:57:58.425715665Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"62ed99ab0475f1124ef636bdcf14d794b1fda20f7c7181b88014cb26ee6c1625\" pid:5746 exited_at:{seconds:1742867878 nanos:425463914}"
Mar 25 01:57:58.763500 systemd[1]: Started sshd@21-95.217.13.107:22-139.178.68.195:40988.service - OpenSSH per-connection server daemon (139.178.68.195:40988).
Mar 25 01:57:59.764160 sshd[5756]: Accepted publickey for core from 139.178.68.195 port 40988 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:57:59.766265 sshd-session[5756]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:57:59.774466 systemd-logind[1483]: New session 20 of user core.
Mar 25 01:57:59.778665 systemd[1]: Started session-20.scope - Session 20 of User core.
Mar 25 01:58:00.505022 sshd[5758]: Connection closed by 139.178.68.195 port 40988
Mar 25 01:58:00.505530 sshd-session[5756]: pam_unix(sshd:session): session closed for user core
Mar 25 01:58:00.509221 systemd-logind[1483]: Session 20 logged out. Waiting for processes to exit.
Mar 25 01:58:00.509531 systemd[1]: sshd@21-95.217.13.107:22-139.178.68.195:40988.service: Deactivated successfully.
Mar 25 01:58:00.511636 systemd[1]: session-20.scope: Deactivated successfully.
Mar 25 01:58:00.512922 systemd-logind[1483]: Removed session 20.
Mar 25 01:58:02.486514 containerd[1507]: time="2025-03-25T01:58:02.486441829Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"2b2b1538bda4ffe393f7fdb1a65c1ca57da74cbc5fcca13f7f978753c93aaa5d\" pid:5786 exited_at:{seconds:1742867882 nanos:486083790}"
Mar 25 01:58:05.671573 containerd[1507]: time="2025-03-25T01:58:05.671471874Z" level=warning msg="container event discarded" container=c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c type=CONTAINER_CREATED_EVENT
Mar 25 01:58:05.674510 systemd[1]: Started sshd@22-95.217.13.107:22-139.178.68.195:58338.service - OpenSSH per-connection server daemon (139.178.68.195:58338).
Mar 25 01:58:05.838782 containerd[1507]: time="2025-03-25T01:58:05.838712493Z" level=warning msg="container event discarded" container=c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c type=CONTAINER_STARTED_EVENT
Mar 25 01:58:06.658425 sshd[5800]: Accepted publickey for core from 139.178.68.195 port 58338 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:58:06.660770 sshd-session[5800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:58:06.667425 systemd-logind[1483]: New session 21 of user core.
Mar 25 01:58:06.674604 systemd[1]: Started session-21.scope - Session 21 of User core.
Mar 25 01:58:07.436736 sshd[5802]: Connection closed by 139.178.68.195 port 58338
Mar 25 01:58:07.437386 sshd-session[5800]: pam_unix(sshd:session): session closed for user core
Mar 25 01:58:07.441166 systemd[1]: sshd@22-95.217.13.107:22-139.178.68.195:58338.service: Deactivated successfully.
Mar 25 01:58:07.444309 systemd[1]: session-21.scope: Deactivated successfully.
Mar 25 01:58:07.444979 systemd-logind[1483]: Session 21 logged out. Waiting for processes to exit.
Mar 25 01:58:07.446149 systemd-logind[1483]: Removed session 21.
Mar 25 01:58:11.427851 containerd[1507]: time="2025-03-25T01:58:11.427728585Z" level=warning msg="container event discarded" container=30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55 type=CONTAINER_CREATED_EVENT
Mar 25 01:58:11.427851 containerd[1507]: time="2025-03-25T01:58:11.427797343Z" level=warning msg="container event discarded" container=30c8adb2b7c8e8d0aa178d7f43dadcf8fdc428143c634a6c27208cc2845fbd55 type=CONTAINER_STARTED_EVENT
Mar 25 01:58:11.427851 containerd[1507]: time="2025-03-25T01:58:11.427807533Z" level=warning msg="container event discarded" container=99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d type=CONTAINER_CREATED_EVENT
Mar 25 01:58:11.427851 containerd[1507]: time="2025-03-25T01:58:11.427814827Z" level=warning msg="container event discarded" container=99a287620e4fc7c522f12729237e1a7633d8e36431ce91883634787fb1cb0c2d type=CONTAINER_STARTED_EVENT
Mar 25 01:58:11.451281 containerd[1507]: time="2025-03-25T01:58:11.451184170Z" level=warning msg="container event discarded" container=fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de type=CONTAINER_CREATED_EVENT
Mar 25 01:58:11.504536 containerd[1507]: time="2025-03-25T01:58:11.504441186Z" level=warning msg="container event discarded" container=fe4e363ba0682799885099307f294c8bc3a0ca5c82f08f958989a23d06e799de type=CONTAINER_STARTED_EVENT
Mar 25 01:58:12.140645 containerd[1507]: time="2025-03-25T01:58:12.140577153Z" level=warning msg="container event discarded" container=c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2 type=CONTAINER_CREATED_EVENT
Mar 25 01:58:12.140645 containerd[1507]: time="2025-03-25T01:58:12.140634450Z" level=warning msg="container event discarded" container=c2beac8be50cb7c9320cb3f4af22888c8499fddea9e454e52820bc5946092ec2 type=CONTAINER_STARTED_EVENT
Mar 25 01:58:13.212124 containerd[1507]: time="2025-03-25T01:58:13.212052098Z" level=warning msg="container event discarded" container=baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a type=CONTAINER_CREATED_EVENT
Mar 25 01:58:13.212124 containerd[1507]: time="2025-03-25T01:58:13.212115096Z" level=warning msg="container event discarded" container=baa8fbda303e3570b0e5dc43d87fae78c43fd504eb6afae34d58654af392ac1a type=CONTAINER_STARTED_EVENT
Mar 25 01:58:13.232853 containerd[1507]: time="2025-03-25T01:58:13.232779325Z" level=warning msg="container event discarded" container=9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae type=CONTAINER_CREATED_EVENT
Mar 25 01:58:13.232853 containerd[1507]: time="2025-03-25T01:58:13.232828797Z" level=warning msg="container event discarded" container=9a54ca7dae51d747b25feb04af04fbb5c7cb020a7bba61703d7eb73bf4dbdbae type=CONTAINER_STARTED_EVENT
Mar 25 01:58:13.232853 containerd[1507]: time="2025-03-25T01:58:13.232844005Z" level=warning msg="container event discarded" container=0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5 type=CONTAINER_CREATED_EVENT
Mar 25 01:58:13.288162 containerd[1507]: time="2025-03-25T01:58:13.288074049Z" level=warning msg="container event discarded" container=0008382f2d5bc54813ed9c03933abe7cde094932b24ecb75d485a8c1cd81aed5 type=CONTAINER_STARTED_EVENT
Mar 25 01:58:14.181895 containerd[1507]: time="2025-03-25T01:58:14.181846506Z" level=warning msg="container event discarded" container=925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f type=CONTAINER_CREATED_EVENT
Mar 25 01:58:14.181895 containerd[1507]: time="2025-03-25T01:58:14.181880440Z" level=warning msg="container event discarded" container=925d4bf627c65f3f7762c69dfb6fcb794abdb2e6a06893dc178d5082692d935f type=CONTAINER_STARTED_EVENT
Mar 25 01:58:15.463416 containerd[1507]: time="2025-03-25T01:58:15.463320021Z" level=warning msg="container event discarded" container=680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9 type=CONTAINER_CREATED_EVENT
Mar 25 01:58:15.532620 containerd[1507]: time="2025-03-25T01:58:15.532545875Z" level=warning msg="container event discarded" container=680f2db56ee4b6f59e696cd7e2d27699e41346c7b257f4fa0bad184513c33ee9 type=CONTAINER_STARTED_EVENT
Mar 25 01:58:18.540751 containerd[1507]: time="2025-03-25T01:58:18.540700019Z" level=warning msg="container event discarded" container=789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8 type=CONTAINER_CREATED_EVENT
Mar 25 01:58:18.625189 containerd[1507]: time="2025-03-25T01:58:18.625084056Z" level=warning msg="container event discarded" container=789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8 type=CONTAINER_STARTED_EVENT
Mar 25 01:58:20.590177 containerd[1507]: time="2025-03-25T01:58:20.590064954Z" level=warning msg="container event discarded" container=3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25 type=CONTAINER_CREATED_EVENT
Mar 25 01:58:20.639534 containerd[1507]: time="2025-03-25T01:58:20.639427727Z" level=warning msg="container event discarded" container=3690a2aa1800361d9fdce0d53ce18ceaa2be2c3b836cdd9e61065fc3e9b27c25 type=CONTAINER_STARTED_EVENT
Mar 25 01:58:21.097004 containerd[1507]: time="2025-03-25T01:58:21.096926762Z" level=warning msg="container event discarded" container=38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846 type=CONTAINER_CREATED_EVENT
Mar 25 01:58:21.192052 containerd[1507]: time="2025-03-25T01:58:21.191985967Z" level=warning msg="container event discarded" container=38593d5d640746397950e8ee5177611e19728834c6a3800807d407e80bf06846 type=CONTAINER_STARTED_EVENT
Mar 25 01:58:23.359404 containerd[1507]: time="2025-03-25T01:58:23.359281401Z" level=warning msg="container event discarded" container=24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d type=CONTAINER_CREATED_EVENT
Mar 25 01:58:23.431716 containerd[1507]: time="2025-03-25T01:58:23.431655260Z" level=warning msg="container event discarded" container=24f5cc78646acae7cf0a5ca69f4da6a2f266ce5876f43827b405714c4662266d type=CONTAINER_STARTED_EVENT
Mar 25 01:58:28.402825 containerd[1507]: time="2025-03-25T01:58:28.402767043Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"c0eb274ff66347a81cee419c11e04dc58e63bb26e665c0f0f3399dea0d661aa3\" pid:5836 exited_at:{seconds:1742867908 nanos:402513469}"
Mar 25 01:58:32.482160 containerd[1507]: time="2025-03-25T01:58:32.482111456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c1a93d09d0789e286491d05e330664b44ac3c0e1be7045deae34a0fdabf57c3c\" id:\"1694df78af0d5d072d2274e704c4a4c1b9edf62824b5b233b234e4e48733a604\" pid:5857 exited_at:{seconds:1742867912 nanos:481778052}"
Mar 25 01:58:32.983214 containerd[1507]: time="2025-03-25T01:58:32.983147909Z" level=info msg="TaskExit event in podsandbox handler container_id:\"789364782bc84056bee05e88ed1ded32a06a9e39ecda823fc70a29f4eab03dd8\" id:\"bb9f8121cd65ba5bed409717cb8d22cebc3cfd90e97df18fca2f86a78b2a5c6e\" pid:5882 exited_at:{seconds:1742867912 nanos:982674936}"
Mar 25 01:58:43.569837 systemd[1]: cri-containerd-e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed.scope: Deactivated successfully.
Mar 25 01:58:43.570091 systemd[1]: cri-containerd-e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed.scope: Consumed 4.441s CPU time, 63.3M memory peak, 36.8M read from disk.
Mar 25 01:58:43.582774 containerd[1507]: time="2025-03-25T01:58:43.581231780Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed\" id:\"e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed\" pid:3257 exit_status:1 exited_at:{seconds:1742867923 nanos:580765537}"
Mar 25 01:58:43.582774 containerd[1507]: time="2025-03-25T01:58:43.582655424Z" level=info msg="received exit event container_id:\"e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed\" id:\"e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed\" pid:3257 exit_status:1 exited_at:{seconds:1742867923 nanos:580765537}"
Mar 25 01:58:43.642115 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed-rootfs.mount: Deactivated successfully.
Mar 25 01:58:43.814791 systemd[1]: cri-containerd-3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5.scope: Deactivated successfully.
Mar 25 01:58:43.815036 systemd[1]: cri-containerd-3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5.scope: Consumed 1.764s CPU time, 40.9M memory peak, 34.3M read from disk.
Mar 25 01:58:43.820114 containerd[1507]: time="2025-03-25T01:58:43.820081299Z" level=info msg="received exit event container_id:\"3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5\" id:\"3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5\" pid:2761 exit_status:1 exited_at:{seconds:1742867923 nanos:818001657}"
Mar 25 01:58:43.820983 containerd[1507]: time="2025-03-25T01:58:43.820888980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5\" id:\"3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5\" pid:2761 exit_status:1 exited_at:{seconds:1742867923 nanos:818001657}"
Mar 25 01:58:43.823364 kubelet[2914]: E0325 01:58:43.823293 2914 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:35362->10.0.0.2:2379: read: connection timed out"
Mar 25 01:58:43.844710 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5-rootfs.mount: Deactivated successfully.
Mar 25 01:58:44.182659 kubelet[2914]: I0325 01:58:44.182571 2914 scope.go:117] "RemoveContainer" containerID="e7f97ec9cede0677a6edbfbaf700661da251254bb07c3b2ab1ffda9f3980fbed"
Mar 25 01:58:44.183535 kubelet[2914]: I0325 01:58:44.183057 2914 scope.go:117] "RemoveContainer" containerID="3bb9d638c02e4da8a99821df5c06807ff3faadcbf5014d7e1eeef267c5af0eb5"
Mar 25 01:58:44.209146 containerd[1507]: time="2025-03-25T01:58:44.209102104Z" level=info msg="CreateContainer within sandbox \"6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Mar 25 01:58:44.217727 containerd[1507]: time="2025-03-25T01:58:44.217666680Z" level=info msg="CreateContainer within sandbox \"ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Mar 25 01:58:44.247355 containerd[1507]: time="2025-03-25T01:58:44.246090457Z" level=info msg="Container d3e276a0d6b257f6f50e64795722906ac5f9547771422cef2ce56d70bc3bde5e: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:58:44.247944 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1129603878.mount: Deactivated successfully.
Mar 25 01:58:44.251147 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1337137252.mount: Deactivated successfully.
Mar 25 01:58:44.272449 containerd[1507]: time="2025-03-25T01:58:44.272416589Z" level=info msg="Container 65b533e32f67857be02610a4a87625186d1b9b37eb1f3fcfeeea07be7d8b3e0f: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:58:44.277723 containerd[1507]: time="2025-03-25T01:58:44.277686950Z" level=info msg="CreateContainer within sandbox \"6bc52c5edd64f5f0201f8362908dc1ab705434616527cacdffb4e4822da73982\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"d3e276a0d6b257f6f50e64795722906ac5f9547771422cef2ce56d70bc3bde5e\""
Mar 25 01:58:44.282959 containerd[1507]: time="2025-03-25T01:58:44.282724125Z" level=info msg="StartContainer for \"d3e276a0d6b257f6f50e64795722906ac5f9547771422cef2ce56d70bc3bde5e\""
Mar 25 01:58:44.283574 containerd[1507]: time="2025-03-25T01:58:44.283552155Z" level=info msg="connecting to shim d3e276a0d6b257f6f50e64795722906ac5f9547771422cef2ce56d70bc3bde5e" address="unix:///run/containerd/s/c96e7abfe5d88154cb0de11c1014c27792a0a47d985d0be97adff7be8584abb3" protocol=ttrpc version=3
Mar 25 01:58:44.287832 systemd[1]: cri-containerd-fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441.scope: Deactivated successfully.
Mar 25 01:58:44.288053 systemd[1]: cri-containerd-fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441.scope: Consumed 5.731s CPU time, 91M memory peak, 64M read from disk.
Mar 25 01:58:44.297266 containerd[1507]: time="2025-03-25T01:58:44.297105777Z" level=info msg="received exit event container_id:\"fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441\" id:\"fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441\" pid:2754 exit_status:1 exited_at:{seconds:1742867924 nanos:293953227}"
Mar 25 01:58:44.297390 containerd[1507]: time="2025-03-25T01:58:44.297373718Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441\" id:\"fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441\" pid:2754 exit_status:1 exited_at:{seconds:1742867924 nanos:293953227}"
Mar 25 01:58:44.299172 containerd[1507]: time="2025-03-25T01:58:44.299155252Z" level=info msg="CreateContainer within sandbox \"ee7655b9156195a149adf1b4da997c9af8dc2f3fe721bb0f04d64ca34c28cbe3\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"65b533e32f67857be02610a4a87625186d1b9b37eb1f3fcfeeea07be7d8b3e0f\""
Mar 25 01:58:44.301479 containerd[1507]: time="2025-03-25T01:58:44.301320312Z" level=info msg="StartContainer for \"65b533e32f67857be02610a4a87625186d1b9b37eb1f3fcfeeea07be7d8b3e0f\""
Mar 25 01:58:44.311935 containerd[1507]: time="2025-03-25T01:58:44.311910409Z" level=info msg="connecting to shim 65b533e32f67857be02610a4a87625186d1b9b37eb1f3fcfeeea07be7d8b3e0f" address="unix:///run/containerd/s/c37122c0610307e9fc954239dda132903f94f2ac475ca6be7486ac3c8709b73d" protocol=ttrpc version=3
Mar 25 01:58:44.319577 systemd[1]: Started cri-containerd-d3e276a0d6b257f6f50e64795722906ac5f9547771422cef2ce56d70bc3bde5e.scope - libcontainer container d3e276a0d6b257f6f50e64795722906ac5f9547771422cef2ce56d70bc3bde5e.
Mar 25 01:58:44.334465 systemd[1]: Started cri-containerd-65b533e32f67857be02610a4a87625186d1b9b37eb1f3fcfeeea07be7d8b3e0f.scope - libcontainer container 65b533e32f67857be02610a4a87625186d1b9b37eb1f3fcfeeea07be7d8b3e0f.
Mar 25 01:58:44.398364 containerd[1507]: time="2025-03-25T01:58:44.398002304Z" level=error msg="collecting metrics for fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441" error="ttrpc: closed"
Mar 25 01:58:44.398364 containerd[1507]: time="2025-03-25T01:58:44.398297035Z" level=info msg="StartContainer for \"65b533e32f67857be02610a4a87625186d1b9b37eb1f3fcfeeea07be7d8b3e0f\" returns successfully"
Mar 25 01:58:44.414075 containerd[1507]: time="2025-03-25T01:58:44.413767413Z" level=info msg="StartContainer for \"d3e276a0d6b257f6f50e64795722906ac5f9547771422cef2ce56d70bc3bde5e\" returns successfully"
Mar 25 01:58:44.643911 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441-rootfs.mount: Deactivated successfully.
Mar 25 01:58:45.168424 kubelet[2914]: I0325 01:58:45.168388 2914 scope.go:117] "RemoveContainer" containerID="fb18c04002e30d6f75e6492c51d5ef348e67a2c46c3fee6fd454bcea84945441"
Mar 25 01:58:45.174362 containerd[1507]: time="2025-03-25T01:58:45.174269487Z" level=info msg="CreateContainer within sandbox \"b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Mar 25 01:58:45.184371 containerd[1507]: time="2025-03-25T01:58:45.182629771Z" level=info msg="Container 9e49867b1cd2bd47f2c8f056156cb261647593ae0d073a7a76ca46e0cd5d4092: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:58:45.196275 containerd[1507]: time="2025-03-25T01:58:45.196239558Z" level=info msg="CreateContainer within sandbox \"b1aef21d8edb847a99d9d4f5bf2143e7a9a3d083e26d04143c1f9b4093c1456c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"9e49867b1cd2bd47f2c8f056156cb261647593ae0d073a7a76ca46e0cd5d4092\""
Mar 25 01:58:45.196690 containerd[1507]: time="2025-03-25T01:58:45.196667337Z" level=info msg="StartContainer for \"9e49867b1cd2bd47f2c8f056156cb261647593ae0d073a7a76ca46e0cd5d4092\""
Mar 25 01:58:45.197668 containerd[1507]: time="2025-03-25T01:58:45.197625009Z" level=info msg="connecting to shim 9e49867b1cd2bd47f2c8f056156cb261647593ae0d073a7a76ca46e0cd5d4092" address="unix:///run/containerd/s/b84647a3b3ee4c7d3fac4c94c653f136bcd596a7a8cf7579644f69574baaf7c1" protocol=ttrpc version=3
Mar 25 01:58:45.216449 systemd[1]: Started cri-containerd-9e49867b1cd2bd47f2c8f056156cb261647593ae0d073a7a76ca46e0cd5d4092.scope - libcontainer container 9e49867b1cd2bd47f2c8f056156cb261647593ae0d073a7a76ca46e0cd5d4092.
Mar 25 01:58:45.260491 containerd[1507]: time="2025-03-25T01:58:45.260452160Z" level=info msg="StartContainer for \"9e49867b1cd2bd47f2c8f056156cb261647593ae0d073a7a76ca46e0cd5d4092\" returns successfully"
Mar 25 01:58:46.464733 kubelet[2914]: E0325 01:58:46.453821 2914 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout" event="&Event{ObjectMeta:{kube-apiserver-ci-4284-0-0-a-abb47662e0.182fe922c86ad591 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4284-0-0-a-abb47662e0,UID:7416dec70bd659a5a030a2a5de8e0c81,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-a-abb47662e0,},FirstTimestamp:2025-03-25 01:58:36.416275857 +0000 UTC m=+373.599497707,LastTimestamp:2025-03-25 01:58:36.416275857 +0000 UTC m=+373.599497707,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-a-abb47662e0,}"
Mar 25 01:58:47.280198 kubelet[2914]: I0325 01:58:47.280096 2914 status_manager.go:853] "Failed to get status for pod" podUID="ea3b72b2-f324-4485-bdaf-f2b0ef8e196b" pod="tigera-operator/tigera-operator-6479d6dc54-9jtdf" err="rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout"