Mar 25 01:40:07.998837 kernel: Linux version 6.6.83-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Mon Mar 24 23:38:35 -00 2025
Mar 25 01:40:07.998860 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:40:07.998874 kernel: BIOS-provided physical RAM map:
Mar 25 01:40:07.998882 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Mar 25 01:40:07.998888 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Mar 25 01:40:07.998895 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Mar 25 01:40:07.998902 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Mar 25 01:40:07.998908 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Mar 25 01:40:07.998915 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Mar 25 01:40:07.998921 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Mar 25 01:40:07.998927 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Mar 25 01:40:07.998933 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Mar 25 01:40:07.998939 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
Mar 25 01:40:07.998946 kernel: NX (Execute Disable) protection: active
Mar 25 01:40:07.998953 kernel: APIC: Static calls initialized
Mar 25 01:40:07.998961 kernel: SMBIOS 3.0.0 present.
Mar 25 01:40:07.998968 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Mar 25 01:40:07.998974 kernel: Hypervisor detected: KVM
Mar 25 01:40:07.998981 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Mar 25 01:40:07.998987 kernel: kvm-clock: using sched offset of 3394936353 cycles
Mar 25 01:40:07.998994 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Mar 25 01:40:07.999003 kernel: tsc: Detected 2495.310 MHz processor
Mar 25 01:40:07.999012 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Mar 25 01:40:07.999021 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Mar 25 01:40:07.999031 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Mar 25 01:40:07.999038 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Mar 25 01:40:07.999045 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Mar 25 01:40:07.999051 kernel: Using GB pages for direct mapping
Mar 25 01:40:07.999058 kernel: ACPI: Early table checksum verification disabled
Mar 25 01:40:07.999066 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Mar 25 01:40:07.999075 kernel: ACPI: RSDT 0x000000007CFE265D 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:40:07.999084 kernel: ACPI: FACP 0x000000007CFE244D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:40:07.999091 kernel: ACPI: DSDT 0x000000007CFE0040 00240D (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:40:07.999100 kernel: ACPI: FACS 0x000000007CFE0000 000040
Mar 25 01:40:07.999108 kernel: ACPI: APIC 0x000000007CFE2541 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:40:07.999117 kernel: ACPI: HPET 0x000000007CFE25C1 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:40:07.999125 kernel: ACPI: MCFG 0x000000007CFE25F9 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:40:07.999133 kernel: ACPI: WAET 0x000000007CFE2635 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Mar 25 01:40:07.999142 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe244d-0x7cfe2540]
Mar 25 01:40:07.999152 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe244c]
Mar 25 01:40:07.999165 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Mar 25 01:40:07.999174 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2541-0x7cfe25c0]
Mar 25 01:40:07.999181 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25c1-0x7cfe25f8]
Mar 25 01:40:07.999188 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe25f9-0x7cfe2634]
Mar 25 01:40:07.999195 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe2635-0x7cfe265c]
Mar 25 01:40:07.999203 kernel: No NUMA configuration found
Mar 25 01:40:07.999212 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Mar 25 01:40:07.999223 kernel: NODE_DATA(0) allocated [mem 0x7cfd6000-0x7cfdbfff]
Mar 25 01:40:07.999233 kernel: Zone ranges:
Mar 25 01:40:07.999243 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Mar 25 01:40:07.999253 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Mar 25 01:40:07.999262 kernel: Normal empty
Mar 25 01:40:07.999271 kernel: Movable zone start for each node
Mar 25 01:40:07.999281 kernel: Early memory node ranges
Mar 25 01:40:07.999290 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Mar 25 01:40:07.999300 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Mar 25 01:40:07.999309 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Mar 25 01:40:07.999961 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Mar 25 01:40:07.999972 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Mar 25 01:40:07.999979 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Mar 25 01:40:07.999986 kernel: ACPI: PM-Timer IO Port: 0x608
Mar 25 01:40:07.999993 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Mar 25 01:40:08.000000 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Mar 25 01:40:08.000007 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Mar 25 01:40:08.000015 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Mar 25 01:40:08.000022 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Mar 25 01:40:08.000032 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Mar 25 01:40:08.000039 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Mar 25 01:40:08.000046 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Mar 25 01:40:08.000053 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Mar 25 01:40:08.000060 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
Mar 25 01:40:08.000067 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Mar 25 01:40:08.000074 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Mar 25 01:40:08.000081 kernel: Booting paravirtualized kernel on KVM
Mar 25 01:40:08.000088 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Mar 25 01:40:08.000097 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Mar 25 01:40:08.000104 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
Mar 25 01:40:08.000111 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
Mar 25 01:40:08.000118 kernel: pcpu-alloc: [0] 0 1
Mar 25 01:40:08.000125 kernel: kvm-guest: PV spinlocks disabled, no host support
Mar 25 01:40:08.000133 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:40:08.000141 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Mar 25 01:40:08.000148 kernel: random: crng init done
Mar 25 01:40:08.000156 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Mar 25 01:40:08.000163 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Mar 25 01:40:08.000170 kernel: Fallback order for Node 0: 0
Mar 25 01:40:08.000177 kernel: Built 1 zonelists, mobility grouping on. Total pages: 503708
Mar 25 01:40:08.000184 kernel: Policy zone: DMA32
Mar 25 01:40:08.000191 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Mar 25 01:40:08.000199 kernel: Memory: 1917956K/2047464K available (14336K kernel code, 2304K rwdata, 25060K rodata, 43592K init, 1472K bss, 129248K reserved, 0K cma-reserved)
Mar 25 01:40:08.000206 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Mar 25 01:40:08.000213 kernel: ftrace: allocating 37985 entries in 149 pages
Mar 25 01:40:08.000221 kernel: ftrace: allocated 149 pages with 4 groups
Mar 25 01:40:08.000228 kernel: Dynamic Preempt: voluntary
Mar 25 01:40:08.000234 kernel: rcu: Preemptible hierarchical RCU implementation.
Mar 25 01:40:08.000242 kernel: rcu: RCU event tracing is enabled.
Mar 25 01:40:08.000249 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Mar 25 01:40:08.000256 kernel: Trampoline variant of Tasks RCU enabled.
Mar 25 01:40:08.000292 kernel: Rude variant of Tasks RCU enabled.
Mar 25 01:40:08.000299 kernel: Tracing variant of Tasks RCU enabled.
Mar 25 01:40:08.000306 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Mar 25 01:40:08.000315 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Mar 25 01:40:08.000322 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Mar 25 01:40:08.000329 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Mar 25 01:40:08.000336 kernel: Console: colour VGA+ 80x25
Mar 25 01:40:08.000355 kernel: printk: console [tty0] enabled
Mar 25 01:40:08.000365 kernel: printk: console [ttyS0] enabled
Mar 25 01:40:08.000374 kernel: ACPI: Core revision 20230628
Mar 25 01:40:08.000384 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Mar 25 01:40:08.000393 kernel: APIC: Switch to symmetric I/O mode setup
Mar 25 01:40:08.000403 kernel: x2apic enabled
Mar 25 01:40:08.000416 kernel: APIC: Switched APIC routing to: physical x2apic
Mar 25 01:40:08.000425 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Mar 25 01:40:08.000435 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Mar 25 01:40:08.000445 kernel: Calibrating delay loop (skipped) preset value.. 4990.62 BogoMIPS (lpj=2495310)
Mar 25 01:40:08.000454 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Mar 25 01:40:08.000475 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Mar 25 01:40:08.000550 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Mar 25 01:40:08.000628 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Mar 25 01:40:08.000645 kernel: Spectre V2 : Mitigation: Retpolines
Mar 25 01:40:08.000663 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Mar 25 01:40:08.000679 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Mar 25 01:40:08.000698 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Mar 25 01:40:08.000714 kernel: RETBleed: Mitigation: untrained return thunk
Mar 25 01:40:08.000730 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Mar 25 01:40:08.000747 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Mar 25 01:40:08.000764 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Mar 25 01:40:08.000783 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Mar 25 01:40:08.000799 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Mar 25 01:40:08.000815 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Mar 25 01:40:08.000832 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Mar 25 01:40:08.000849 kernel: Freeing SMP alternatives memory: 32K
Mar 25 01:40:08.000865 kernel: pid_max: default: 32768 minimum: 301
Mar 25 01:40:08.000882 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Mar 25 01:40:08.000898 kernel: landlock: Up and running.
Mar 25 01:40:08.000914 kernel: SELinux: Initializing.
Mar 25 01:40:08.000933 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 25 01:40:08.000950 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Mar 25 01:40:08.000966 kernel: smpboot: CPU0: AMD EPYC-Rome-v4 Processor (no XSAVES) (family: 0x17, model: 0x31, stepping: 0x0)
Mar 25 01:40:08.000983 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:40:08.000999 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:40:08.001016 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Mar 25 01:40:08.001032 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Mar 25 01:40:08.001048 kernel: ... version:                0
Mar 25 01:40:08.001067 kernel: ... bit width:              48
Mar 25 01:40:08.001083 kernel: ... generic registers:      6
Mar 25 01:40:08.001099 kernel: ... value mask:             0000ffffffffffff
Mar 25 01:40:08.001115 kernel: ... max period:             00007fffffffffff
Mar 25 01:40:08.001131 kernel: ... fixed-purpose events:   0
Mar 25 01:40:08.001147 kernel: ... event mask:             000000000000003f
Mar 25 01:40:08.001163 kernel: signal: max sigframe size: 1776
Mar 25 01:40:08.001215 kernel: rcu: Hierarchical SRCU implementation.
Mar 25 01:40:08.001233 kernel: rcu: Max phase no-delay instances is 400.
Mar 25 01:40:08.001249 kernel: smp: Bringing up secondary CPUs ...
Mar 25 01:40:08.001268 kernel: smpboot: x86: Booting SMP configuration:
Mar 25 01:40:08.001285 kernel: .... node #0, CPUs: #1
Mar 25 01:40:08.001301 kernel: smp: Brought up 1 node, 2 CPUs
Mar 25 01:40:08.001317 kernel: smpboot: Max logical packages: 1
Mar 25 01:40:08.001333 kernel: smpboot: Total of 2 processors activated (9981.24 BogoMIPS)
Mar 25 01:40:08.002420 kernel: devtmpfs: initialized
Mar 25 01:40:08.002438 kernel: x86/mm: Memory block size: 128MB
Mar 25 01:40:08.002455 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Mar 25 01:40:08.002505 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Mar 25 01:40:08.002527 kernel: pinctrl core: initialized pinctrl subsystem
Mar 25 01:40:08.002544 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Mar 25 01:40:08.002560 kernel: audit: initializing netlink subsys (disabled)
Mar 25 01:40:08.002576 kernel: audit: type=2000 audit(1742866807.087:1): state=initialized audit_enabled=0 res=1
Mar 25 01:40:08.002593 kernel: thermal_sys: Registered thermal governor 'step_wise'
Mar 25 01:40:08.002609 kernel: thermal_sys: Registered thermal governor 'user_space'
Mar 25 01:40:08.002625 kernel: cpuidle: using governor menu
Mar 25 01:40:08.002641 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Mar 25 01:40:08.002657 kernel: dca service started, version 1.12.1
Mar 25 01:40:08.002677 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
Mar 25 01:40:08.002693 kernel: PCI: Using configuration type 1 for base access
Mar 25 01:40:08.002710 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Mar 25 01:40:08.002726 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Mar 25 01:40:08.002742 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Mar 25 01:40:08.002759 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Mar 25 01:40:08.002775 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Mar 25 01:40:08.002791 kernel: ACPI: Added _OSI(Module Device)
Mar 25 01:40:08.002808 kernel: ACPI: Added _OSI(Processor Device)
Mar 25 01:40:08.002859 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Mar 25 01:40:08.002876 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Mar 25 01:40:08.002892 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Mar 25 01:40:08.002909 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Mar 25 01:40:08.002925 kernel: ACPI: Interpreter enabled
Mar 25 01:40:08.002947 kernel: ACPI: PM: (supports S0 S5)
Mar 25 01:40:08.002969 kernel: ACPI: Using IOAPIC for interrupt routing
Mar 25 01:40:08.002992 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Mar 25 01:40:08.003014 kernel: PCI: Using E820 reservations for host bridge windows
Mar 25 01:40:08.003042 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Mar 25 01:40:08.003058 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Mar 25 01:40:08.003315 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Mar 25 01:40:08.003537 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Mar 25 01:40:08.003682 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Mar 25 01:40:08.003692 kernel: PCI host bridge to bus 0000:00
Mar 25 01:40:08.003770 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Mar 25 01:40:08.003838 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Mar 25 01:40:08.003901 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Mar 25 01:40:08.003964 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Mar 25 01:40:08.004028 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Mar 25 01:40:08.004093 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Mar 25 01:40:08.004157 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Mar 25 01:40:08.004245 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
Mar 25 01:40:08.005184 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000
Mar 25 01:40:08.005278 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfb800000-0xfbffffff pref]
Mar 25 01:40:08.005374 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfd200000-0xfd203fff 64bit pref]
Mar 25 01:40:08.005449 kernel: pci 0000:00:01.0: reg 0x20: [mem 0xfea10000-0xfea10fff]
Mar 25 01:40:08.005532 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfea00000-0xfea0ffff pref]
Mar 25 01:40:08.005603 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Mar 25 01:40:08.005690 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.005765 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfea11000-0xfea11fff]
Mar 25 01:40:08.005845 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.005918 kernel: pci 0000:00:02.1: reg 0x10: [mem 0xfea12000-0xfea12fff]
Mar 25 01:40:08.005999 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.006074 kernel: pci 0000:00:02.2: reg 0x10: [mem 0xfea13000-0xfea13fff]
Mar 25 01:40:08.006158 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.006234 kernel: pci 0000:00:02.3: reg 0x10: [mem 0xfea14000-0xfea14fff]
Mar 25 01:40:08.006315 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.006407 kernel: pci 0000:00:02.4: reg 0x10: [mem 0xfea15000-0xfea15fff]
Mar 25 01:40:08.006497 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.006572 kernel: pci 0000:00:02.5: reg 0x10: [mem 0xfea16000-0xfea16fff]
Mar 25 01:40:08.006655 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.006730 kernel: pci 0000:00:02.6: reg 0x10: [mem 0xfea17000-0xfea17fff]
Mar 25 01:40:08.006809 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.006882 kernel: pci 0000:00:02.7: reg 0x10: [mem 0xfea18000-0xfea18fff]
Mar 25 01:40:08.006966 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
Mar 25 01:40:08.007040 kernel: pci 0000:00:03.0: reg 0x10: [mem 0xfea19000-0xfea19fff]
Mar 25 01:40:08.007117 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
Mar 25 01:40:08.007193 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Mar 25 01:40:08.007297 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
Mar 25 01:40:08.007411 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc040-0xc05f]
Mar 25 01:40:08.007505 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfea1a000-0xfea1afff]
Mar 25 01:40:08.007585 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
Mar 25 01:40:08.007658 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
Mar 25 01:40:08.007747 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
Mar 25 01:40:08.007825 kernel: pci 0000:01:00.0: reg 0x14: [mem 0xfe880000-0xfe880fff]
Mar 25 01:40:08.007901 kernel: pci 0000:01:00.0: reg 0x20: [mem 0xfd000000-0xfd003fff 64bit pref]
Mar 25 01:40:08.007999 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfe800000-0xfe87ffff pref]
Mar 25 01:40:08.008092 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 25 01:40:08.008167 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 25 01:40:08.008240 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Mar 25 01:40:08.008327 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
Mar 25 01:40:08.008441 kernel: pci 0000:02:00.0: reg 0x10: [mem 0xfe600000-0xfe603fff 64bit]
Mar 25 01:40:08.008601 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 25 01:40:08.008678 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 25 01:40:08.008748 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:40:08.008829 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
Mar 25 01:40:08.008906 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfe400000-0xfe400fff]
Mar 25 01:40:08.008980 kernel: pci 0000:03:00.0: reg 0x20: [mem 0xfcc00000-0xfcc03fff 64bit pref]
Mar 25 01:40:08.009050 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 25 01:40:08.009121 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 25 01:40:08.009191 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:40:08.009272 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
Mar 25 01:40:08.009376 kernel: pci 0000:04:00.0: reg 0x20: [mem 0xfca00000-0xfca03fff 64bit pref]
Mar 25 01:40:08.009454 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 25 01:40:08.009537 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 25 01:40:08.009618 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:40:08.009730 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
Mar 25 01:40:08.009827 kernel: pci 0000:05:00.0: reg 0x14: [mem 0xfe000000-0xfe000fff]
Mar 25 01:40:08.009905 kernel: pci 0000:05:00.0: reg 0x20: [mem 0xfc800000-0xfc803fff 64bit pref]
Mar 25 01:40:08.009978 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 25 01:40:08.010052 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 25 01:40:08.010129 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:40:08.010214 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
Mar 25 01:40:08.010313 kernel: pci 0000:06:00.0: reg 0x14: [mem 0xfde00000-0xfde00fff]
Mar 25 01:40:08.010436 kernel: pci 0000:06:00.0: reg 0x20: [mem 0xfc600000-0xfc603fff 64bit pref]
Mar 25 01:40:08.010517 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 25 01:40:08.010589 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Mar 25 01:40:08.010663 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:40:08.010676 kernel: acpiphp: Slot [0] registered
Mar 25 01:40:08.010754 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
Mar 25 01:40:08.010828 kernel: pci 0000:07:00.0: reg 0x14: [mem 0xfdc80000-0xfdc80fff]
Mar 25 01:40:08.010925 kernel: pci 0000:07:00.0: reg 0x20: [mem 0xfc400000-0xfc403fff 64bit pref]
Mar 25 01:40:08.010998 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfdc00000-0xfdc7ffff pref]
Mar 25 01:40:08.011069 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 25 01:40:08.011139 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 25 01:40:08.011230 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:40:08.011243 kernel: acpiphp: Slot [0-2] registered
Mar 25 01:40:08.011333 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 25 01:40:08.011589 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Mar 25 01:40:08.011691 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:40:08.011702 kernel: acpiphp: Slot [0-3] registered
Mar 25 01:40:08.011815 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 25 01:40:08.011957 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 25 01:40:08.012033 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:40:08.012047 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Mar 25 01:40:08.012054 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Mar 25 01:40:08.012061 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Mar 25 01:40:08.012069 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Mar 25 01:40:08.012076 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Mar 25 01:40:08.012083 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Mar 25 01:40:08.012091 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Mar 25 01:40:08.012098 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Mar 25 01:40:08.012105 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Mar 25 01:40:08.012114 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Mar 25 01:40:08.012121 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Mar 25 01:40:08.012128 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Mar 25 01:40:08.012135 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Mar 25 01:40:08.012143 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Mar 25 01:40:08.012150 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Mar 25 01:40:08.012157 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Mar 25 01:40:08.012164 kernel: iommu: Default domain type: Translated
Mar 25 01:40:08.012172 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Mar 25 01:40:08.012181 kernel: PCI: Using ACPI for IRQ routing
Mar 25 01:40:08.012188 kernel: PCI: pci_cache_line_size set to 64 bytes
Mar 25 01:40:08.012195 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Mar 25 01:40:08.012223 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Mar 25 01:40:08.012298 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Mar 25 01:40:08.012422 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Mar 25 01:40:08.012506 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Mar 25 01:40:08.012516 kernel: vgaarb: loaded
Mar 25 01:40:08.012523 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Mar 25 01:40:08.012561 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Mar 25 01:40:08.012569 kernel: clocksource: Switched to clocksource kvm-clock
Mar 25 01:40:08.012576 kernel: VFS: Disk quotas dquot_6.6.0
Mar 25 01:40:08.012584 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Mar 25 01:40:08.012591 kernel: pnp: PnP ACPI init
Mar 25 01:40:08.012671 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Mar 25 01:40:08.012682 kernel: pnp: PnP ACPI: found 5 devices
Mar 25 01:40:08.012690 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Mar 25 01:40:08.012699 kernel: NET: Registered PF_INET protocol family
Mar 25 01:40:08.012707 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Mar 25 01:40:08.012714 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Mar 25 01:40:08.012722 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Mar 25 01:40:08.012729 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Mar 25 01:40:08.012736 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Mar 25 01:40:08.012744 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Mar 25 01:40:08.012751 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 25 01:40:08.012759 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Mar 25 01:40:08.012768 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Mar 25 01:40:08.012775 kernel: NET: Registered PF_XDP protocol family
Mar 25 01:40:08.012847 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Mar 25 01:40:08.012919 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Mar 25 01:40:08.013011 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Mar 25 01:40:08.013111 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x1000-0x1fff]
Mar 25 01:40:08.014496 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x2000-0x2fff]
Mar 25 01:40:08.014602 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x3000-0x3fff]
Mar 25 01:40:08.014694 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Mar 25 01:40:08.014769 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Mar 25 01:40:08.014843 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Mar 25 01:40:08.014916 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Mar 25 01:40:08.014988 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Mar 25 01:40:08.015060 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:40:08.015131 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Mar 25 01:40:08.015208 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Mar 25 01:40:08.015282 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:40:08.015371 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Mar 25 01:40:08.015446 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Mar 25 01:40:08.015531 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:40:08.015605 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Mar 25 01:40:08.015706 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Mar 25 01:40:08.015792 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:40:08.015864 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Mar 25 01:40:08.015939 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Mar 25 01:40:08.016012 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:40:08.016085 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Mar 25 01:40:08.016158 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Mar 25 01:40:08.016231 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Mar 25 01:40:08.016305 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:40:08.017425 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Mar 25 01:40:08.017511 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Mar 25 01:40:08.017585 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Mar 25 01:40:08.017655 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:40:08.017724 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Mar 25 01:40:08.017805 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Mar 25 01:40:08.017877 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Mar 25 01:40:08.017951 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:40:08.018015 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Mar 25 01:40:08.018079 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Mar 25 01:40:08.018146 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Mar 25 01:40:08.018209 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
Mar 25 01:40:08.018307 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
Mar 25 01:40:08.019430 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
Mar 25 01:40:08.019520 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
Mar 25 01:40:08.019587 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
Mar 25 01:40:08.019682 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
Mar 25 01:40:08.019750 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
Mar 25 01:40:08.019822 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
Mar 25 01:40:08.019892 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
Mar 25 01:40:08.019968 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
Mar 25 01:40:08.020033 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
Mar 25 01:40:08.020104 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
Mar 25 01:40:08.020170 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
Mar 25 01:40:08.020244 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
Mar 25 01:40:08.020313 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
Mar 25 01:40:08.021422 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
Mar 25 01:40:08.021528 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
Mar 25 01:40:08.021598 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
Mar 25 01:40:08.021695 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
Mar 25 01:40:08.021763 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
Mar 25 01:40:08.021833 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
Mar 25 01:40:08.021903 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
Mar 25 01:40:08.021969 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
Mar 25 01:40:08.022034 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
Mar 25 01:40:08.022045 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
Mar 25 01:40:08.022053 kernel: PCI: CLS 0 bytes, default 64
Mar 25 01:40:08.022061 kernel: Initialise system trusted keyrings
Mar 25 01:40:08.022069 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Mar 25 01:40:08.022079 kernel: Key type asymmetric registered
Mar 25 01:40:08.022086 kernel: Asymmetric key parser 'x509' registered
Mar 25 01:40:08.022094 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Mar 25 01:40:08.022102 kernel: io scheduler mq-deadline registered
Mar 25 01:40:08.022110 kernel: io scheduler kyber registered
Mar 25 01:40:08.022117 kernel: io scheduler bfq registered
Mar 25 01:40:08.022192 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
Mar 25 01:40:08.022265 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
Mar 25 01:40:08.022335 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
Mar 25 01:40:08.022425 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
Mar 25 01:40:08.022505 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
Mar 25 01:40:08.022575 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
Mar 25 01:40:08.022647 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
Mar 25 01:40:08.022720 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
Mar 25 01:40:08.022791 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
Mar 25 01:40:08.022862 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
Mar 25 01:40:08.022932 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
Mar 25 01:40:08.023005 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
Mar 25 01:40:08.023076 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
Mar 25 01:40:08.023146 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
Mar 25 01:40:08.023250 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
Mar 25 01:40:08.023323 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
Mar 25 01:40:08.023333 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
Mar 25 01:40:08.025485 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32
Mar 25 01:40:08.025560 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32
Mar 25 01:40:08.025571 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
Mar 25 01:40:08.025583 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21
Mar 25 01:40:08.025590 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Mar 25 01:40:08.025599 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Mar 25 01:40:08.025606 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Mar 25 01:40:08.025614 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Mar 25 01:40:08.025622 kernel: serio: i8042 AUX
port at 0x60,0x64 irq 12 Mar 25 01:40:08.025629 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 25 01:40:08.025711 kernel: rtc_cmos 00:03: RTC can wake from S4 Mar 25 01:40:08.025781 kernel: rtc_cmos 00:03: registered as rtc0 Mar 25 01:40:08.025846 kernel: rtc_cmos 00:03: setting system clock to 2025-03-25T01:40:07 UTC (1742866807) Mar 25 01:40:08.025911 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 25 01:40:08.025921 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Mar 25 01:40:08.025929 kernel: NET: Registered PF_INET6 protocol family Mar 25 01:40:08.025937 kernel: Segment Routing with IPv6 Mar 25 01:40:08.025944 kernel: In-situ OAM (IOAM) with IPv6 Mar 25 01:40:08.025952 kernel: NET: Registered PF_PACKET protocol family Mar 25 01:40:08.025962 kernel: Key type dns_resolver registered Mar 25 01:40:08.025970 kernel: IPI shorthand broadcast: enabled Mar 25 01:40:08.025978 kernel: sched_clock: Marking stable (1308010043, 145128370)->(1463511141, -10372728) Mar 25 01:40:08.025985 kernel: registered taskstats version 1 Mar 25 01:40:08.025993 kernel: Loading compiled-in X.509 certificates Mar 25 01:40:08.026001 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.83-flatcar: eff01054e94a599f8e404b9a9482f4e2220f5386' Mar 25 01:40:08.026008 kernel: Key type .fscrypt registered Mar 25 01:40:08.026016 kernel: Key type fscrypt-provisioning registered Mar 25 01:40:08.026024 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 25 01:40:08.026033 kernel: ima: Allocated hash algorithm: sha1
Mar 25 01:40:08.026040 kernel: ima: No architecture policies found
Mar 25 01:40:08.026048 kernel: clk: Disabling unused clocks
Mar 25 01:40:08.026056 kernel: Freeing unused kernel image (initmem) memory: 43592K
Mar 25 01:40:08.026063 kernel: Write protecting the kernel read-only data: 40960k
Mar 25 01:40:08.026071 kernel: Freeing unused kernel image (rodata/data gap) memory: 1564K
Mar 25 01:40:08.026079 kernel: Run /init as init process
Mar 25 01:40:08.026086 kernel: with arguments:
Mar 25 01:40:08.026094 kernel: /init
Mar 25 01:40:08.026103 kernel: with environment:
Mar 25 01:40:08.026110 kernel: HOME=/
Mar 25 01:40:08.026118 kernel: TERM=linux
Mar 25 01:40:08.026125 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Mar 25 01:40:08.026134 systemd[1]: Successfully made /usr/ read-only.
Mar 25 01:40:08.026145 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Mar 25 01:40:08.026154 systemd[1]: Detected virtualization kvm.
Mar 25 01:40:08.026162 systemd[1]: Detected architecture x86-64.
Mar 25 01:40:08.026171 systemd[1]: Running in initrd.
Mar 25 01:40:08.026179 systemd[1]: No hostname configured, using default hostname.
Mar 25 01:40:08.026188 systemd[1]: Hostname set to .
Mar 25 01:40:08.026196 systemd[1]: Initializing machine ID from VM UUID.
Mar 25 01:40:08.026204 systemd[1]: Queued start job for default target initrd.target.
Mar 25 01:40:08.026212 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Mar 25 01:40:08.026221 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Mar 25 01:40:08.026229 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Mar 25 01:40:08.026239 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Mar 25 01:40:08.026247 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Mar 25 01:40:08.026256 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Mar 25 01:40:08.026267 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Mar 25 01:40:08.026275 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Mar 25 01:40:08.026284 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Mar 25 01:40:08.026292 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:40:08.026301 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:40:08.026310 systemd[1]: Reached target slices.target - Slice Units.
Mar 25 01:40:08.026318 systemd[1]: Reached target swap.target - Swaps.
Mar 25 01:40:08.026326 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:40:08.026334 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Mar 25 01:40:08.027415 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Mar 25 01:40:08.027429 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Mar 25 01:40:08.027437 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Mar 25 01:40:08.027449 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Mar 25 01:40:08.027457 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Mar 25 01:40:08.027473 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Mar 25 01:40:08.027482 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:40:08.027491 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Mar 25 01:40:08.027500 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Mar 25 01:40:08.027508 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Mar 25 01:40:08.027516 systemd[1]: Starting systemd-fsck-usr.service...
Mar 25 01:40:08.027524 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 25 01:40:08.027534 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 25 01:40:08.027556 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:40:08.027565 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Mar 25 01:40:08.027573 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:40:08.027582 systemd[1]: Finished systemd-fsck-usr.service.
Mar 25 01:40:08.027592 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 25 01:40:08.027622 systemd-journald[186]: Collecting audit messages is disabled.
Mar 25 01:40:08.027642 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 25 01:40:08.027652 systemd-journald[186]: Journal started
Mar 25 01:40:08.027678 systemd-journald[186]: Runtime Journal (/run/log/journal/4e3d6412854648d8af6f59d755643ef8) is 4.7M, max 38.3M, 33.5M free.
Mar 25 01:40:08.011916 systemd-modules-load[189]: Inserted module 'overlay'
Mar 25 01:40:08.061144 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Mar 25 01:40:08.061168 kernel: Bridge firewalling registered
Mar 25 01:40:08.061179 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 25 01:40:08.042677 systemd-modules-load[189]: Inserted module 'br_netfilter'
Mar 25 01:40:08.061767 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:40:08.062682 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:40:08.064946 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:40:08.067492 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 25 01:40:08.077293 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 25 01:40:08.082509 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 25 01:40:08.092857 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:40:08.099998 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:40:08.102139 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:40:08.104457 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Mar 25 01:40:08.105702 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:40:08.108450 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:40:08.123549 dracut-cmdline[221]: dracut-dracut-053
Mar 25 01:40:08.127199 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=e7a00b7ee8d97e8d255663e9d3fa92277da8316702fb7f6d664fd7b137c307e9
Mar 25 01:40:08.141934 systemd-resolved[223]: Positive Trust Anchors:
Mar 25 01:40:08.142637 systemd-resolved[223]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:40:08.142670 systemd-resolved[223]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:40:08.150948 systemd-resolved[223]: Defaulting to hostname 'linux'.
Mar 25 01:40:08.151811 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:40:08.152485 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:40:08.202417 kernel: SCSI subsystem initialized
Mar 25 01:40:08.212370 kernel: Loading iSCSI transport class v2.0-870.
Mar 25 01:40:08.222371 kernel: iscsi: registered transport (tcp)
Mar 25 01:40:08.242752 kernel: iscsi: registered transport (qla4xxx)
Mar 25 01:40:08.242802 kernel: QLogic iSCSI HBA Driver
Mar 25 01:40:08.282101 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Mar 25 01:40:08.283969 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Mar 25 01:40:08.320388 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Mar 25 01:40:08.320478 kernel: device-mapper: uevent: version 1.0.3
Mar 25 01:40:08.323369 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Mar 25 01:40:08.374412 kernel: raid6: avx2x4 gen() 12275 MB/s
Mar 25 01:40:08.392380 kernel: raid6: avx2x2 gen() 17036 MB/s
Mar 25 01:40:08.409644 kernel: raid6: avx2x1 gen() 25445 MB/s
Mar 25 01:40:08.409727 kernel: raid6: using algorithm avx2x1 gen() 25445 MB/s
Mar 25 01:40:08.428408 kernel: raid6: .... xor() 15974 MB/s, rmw enabled
Mar 25 01:40:08.428535 kernel: raid6: using avx2x2 recovery algorithm
Mar 25 01:40:08.447389 kernel: xor: automatically using best checksumming function avx
Mar 25 01:40:08.581383 kernel: Btrfs loaded, zoned=no, fsverity=no
Mar 25 01:40:08.592546 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Mar 25 01:40:08.594318 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:40:08.617002 systemd-udevd[408]: Using default interface naming scheme 'v255'.
Mar 25 01:40:08.622149 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:40:08.629014 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Mar 25 01:40:08.656190 dracut-pre-trigger[416]: rd.md=0: removing MD RAID activation
Mar 25 01:40:08.688823 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Mar 25 01:40:08.690756 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 25 01:40:08.750769 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:40:08.758447 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Mar 25 01:40:08.791893 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Mar 25 01:40:08.794886 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Mar 25 01:40:08.797137 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Mar 25 01:40:08.797885 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Mar 25 01:40:08.803112 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Mar 25 01:40:08.828130 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Mar 25 01:40:08.852438 kernel: libata version 3.00 loaded.
Mar 25 01:40:08.862374 kernel: cryptd: max_cpu_qlen set to 1000
Mar 25 01:40:08.865374 kernel: scsi host0: Virtio SCSI HBA
Mar 25 01:40:08.880386 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Mar 25 01:40:08.883390 kernel: AVX2 version of gcm_enc/dec engaged.
Mar 25 01:40:08.885413 kernel: ahci 0000:00:1f.2: version 3.0
Mar 25 01:40:08.987388 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Mar 25 01:40:08.987410 kernel: AES CTR mode by8 optimization enabled
Mar 25 01:40:08.987420 kernel: ACPI: bus type USB registered
Mar 25 01:40:08.987430 kernel: usbcore: registered new interface driver usbfs
Mar 25 01:40:08.987439 kernel: usbcore: registered new interface driver hub
Mar 25 01:40:08.987450 kernel: usbcore: registered new device driver usb
Mar 25 01:40:08.987459 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode
Mar 25 01:40:08.987593 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Mar 25 01:40:08.987680 kernel: scsi host1: ahci
Mar 25 01:40:08.987800 kernel: scsi host2: ahci
Mar 25 01:40:08.987895 kernel: scsi host3: ahci
Mar 25 01:40:08.987984 kernel: scsi host4: ahci
Mar 25 01:40:08.988081 kernel: scsi host5: ahci
Mar 25 01:40:08.988174 kernel: scsi host6: ahci
Mar 25 01:40:08.988264 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 48
Mar 25 01:40:08.988278 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 48
Mar 25 01:40:08.988288 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 48
Mar 25 01:40:08.988298 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 48
Mar 25 01:40:08.988307 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 48
Mar 25 01:40:08.988316 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 48
Mar 25 01:40:08.934619 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Mar 25 01:40:08.934750 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:40:08.937302 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:40:08.939325 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:40:08.939632 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:40:08.941851 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:40:08.954492 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:40:09.034988 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:40:09.038451 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Mar 25 01:40:09.061108 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Mar 25 01:40:09.301362 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Mar 25 01:40:09.301474 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Mar 25 01:40:09.301486 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Mar 25 01:40:09.301496 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Mar 25 01:40:09.303699 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Mar 25 01:40:09.303718 kernel: ata1.00: applying bridge limits
Mar 25 01:40:09.304437 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Mar 25 01:40:09.307410 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Mar 25 01:40:09.308383 kernel: ata1.00: configured for UDMA/100
Mar 25 01:40:09.313366 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Mar 25 01:40:09.341292 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 25 01:40:09.357847 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Mar 25 01:40:09.357962 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Mar 25 01:40:09.358054 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Mar 25 01:40:09.358146 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Mar 25 01:40:09.358239 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Mar 25 01:40:09.358328 kernel: hub 1-0:1.0: USB hub found
Mar 25 01:40:09.358518 kernel: hub 1-0:1.0: 4 ports detected
Mar 25 01:40:09.358646 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Mar 25 01:40:09.358750 kernel: hub 2-0:1.0: USB hub found
Mar 25 01:40:09.358921 kernel: hub 2-0:1.0: 4 ports detected
Mar 25 01:40:09.367726 kernel: sd 0:0:0:0: Power-on or device reset occurred
Mar 25 01:40:09.391554 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Mar 25 01:40:09.391676 kernel: sd 0:0:0:0: [sda] Write Protect is off
Mar 25 01:40:09.391766 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Mar 25 01:40:09.391854 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Mar 25 01:40:09.391941 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Mar 25 01:40:09.402967 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Mar 25 01:40:09.402979 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Mar 25 01:40:09.402988 kernel: GPT:17805311 != 80003071
Mar 25 01:40:09.403003 kernel: GPT:Alternate GPT header not at the end of the disk.
Mar 25 01:40:09.403012 kernel: GPT:17805311 != 80003071
Mar 25 01:40:09.403021 kernel: GPT: Use GNU Parted to correct GPT errors.
Mar 25 01:40:09.403030 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:40:09.403038 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Mar 25 01:40:09.403140 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Mar 25 01:40:09.459387 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by (udev-worker) (465)
Mar 25 01:40:09.467393 kernel: BTRFS: device fsid 6d9424cd-1432-492b-b006-b311869817e2 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (455)
Mar 25 01:40:09.482604 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Mar 25 01:40:09.493035 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Mar 25 01:40:09.503643 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 25 01:40:09.512653 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Mar 25 01:40:09.513218 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Mar 25 01:40:09.518435 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Mar 25 01:40:09.534111 disk-uuid[581]: Primary Header is updated.
Mar 25 01:40:09.534111 disk-uuid[581]: Secondary Entries is updated.
Mar 25 01:40:09.534111 disk-uuid[581]: Secondary Header is updated.
Mar 25 01:40:09.544375 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:40:09.598258 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Mar 25 01:40:09.736374 kernel: hid: raw HID events driver (C) Jiri Kosina
Mar 25 01:40:09.741472 kernel: usbcore: registered new interface driver usbhid
Mar 25 01:40:09.741516 kernel: usbhid: USB HID core driver
Mar 25 01:40:09.746674 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input2
Mar 25 01:40:09.746696 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Mar 25 01:40:10.564665 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Mar 25 01:40:10.567332 disk-uuid[583]: The operation has completed successfully.
Mar 25 01:40:10.665356 systemd[1]: disk-uuid.service: Deactivated successfully.
Mar 25 01:40:10.665531 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Mar 25 01:40:10.712486 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Mar 25 01:40:10.725147 sh[598]: Success
Mar 25 01:40:10.737415 kernel: device-mapper: verity: sha256 using implementation "sha256-ni"
Mar 25 01:40:10.809317 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Mar 25 01:40:10.815504 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Mar 25 01:40:10.831661 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Mar 25 01:40:10.855370 kernel: BTRFS info (device dm-0): first mount of filesystem 6d9424cd-1432-492b-b006-b311869817e2
Mar 25 01:40:10.855430 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:40:10.859377 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Mar 25 01:40:10.862969 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Mar 25 01:40:10.865945 kernel: BTRFS info (device dm-0): using free space tree
Mar 25 01:40:10.882395 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Mar 25 01:40:10.886442 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Mar 25 01:40:10.887598 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Mar 25 01:40:10.890453 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Mar 25 01:40:10.891897 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Mar 25 01:40:10.921706 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:40:10.921754 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Mar 25 01:40:10.921764 kernel: BTRFS info (device sda6): using free space tree
Mar 25 01:40:10.927621 kernel: BTRFS info (device sda6): enabling ssd optimizations
Mar 25 01:40:10.927647 kernel: BTRFS info (device sda6): auto enabling async discard
Mar 25 01:40:10.933369 kernel: BTRFS info (device sda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67
Mar 25 01:40:10.937315 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Mar 25 01:40:10.940110 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Mar 25 01:40:11.041077 ignition[689]: Ignition 2.20.0
Mar 25 01:40:11.042109 ignition[689]: Stage: fetch-offline
Mar 25 01:40:11.042663 ignition[689]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:40:11.043173 ignition[689]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:40:11.043309 ignition[689]: parsed url from cmdline: ""
Mar 25 01:40:11.043314 ignition[689]: no config URL provided
Mar 25 01:40:11.043320 ignition[689]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:40:11.043328 ignition[689]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:40:11.045755 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Mar 25 01:40:11.043335 ignition[689]: failed to fetch config: resource requires networking
Mar 25 01:40:11.043612 ignition[689]: Ignition finished successfully
Mar 25 01:40:11.057984 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Mar 25 01:40:11.061812 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:40:11.101196 systemd-networkd[781]: lo: Link UP
Mar 25 01:40:11.101210 systemd-networkd[781]: lo: Gained carrier
Mar 25 01:40:11.103259 systemd-networkd[781]: Enumeration completed
Mar 25 01:40:11.104308 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:40:11.104945 systemd[1]: Reached target network.target - Network.
Mar 25 01:40:11.105152 systemd-networkd[781]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:11.105158 systemd-networkd[781]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:40:11.107164 systemd-networkd[781]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:11.107170 systemd-networkd[781]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:40:11.108095 systemd-networkd[781]: eth0: Link UP
Mar 25 01:40:11.108100 systemd-networkd[781]: eth0: Gained carrier
Mar 25 01:40:11.108109 systemd-networkd[781]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:11.111654 systemd-networkd[781]: eth1: Link UP
Mar 25 01:40:11.111659 systemd-networkd[781]: eth1: Gained carrier
Mar 25 01:40:11.111669 systemd-networkd[781]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:11.112561 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Mar 25 01:40:11.138552 systemd-networkd[781]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 25 01:40:11.139966 ignition[784]: Ignition 2.20.0
Mar 25 01:40:11.139974 ignition[784]: Stage: fetch
Mar 25 01:40:11.140178 ignition[784]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:40:11.140187 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:40:11.140286 ignition[784]: parsed url from cmdline: ""
Mar 25 01:40:11.140289 ignition[784]: no config URL provided
Mar 25 01:40:11.140296 ignition[784]: reading system config file "/usr/lib/ignition/user.ign"
Mar 25 01:40:11.140302 ignition[784]: no config at "/usr/lib/ignition/user.ign"
Mar 25 01:40:11.140325 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Mar 25 01:40:11.140923 ignition[784]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Mar 25 01:40:11.182488 systemd-networkd[781]: eth0: DHCPv4 address 37.27.205.216/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 25 01:40:11.341299 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Mar 25 01:40:11.351867 ignition[784]: GET result: OK
Mar 25 01:40:11.351993 ignition[784]: parsing config with SHA512: 3b5924a29fa52c967633f12a4b90044087e74419d254771b3e7d978812ab80b49fb33d7cb2d363a7c1721cdeec7a536190bbd9090bc40af302b2a1d3b1ac90bb
Mar 25 01:40:11.360326 unknown[784]: fetched base config from "system"
Mar 25 01:40:11.360394 unknown[784]: fetched base config from "system"
Mar 25 01:40:11.361078 ignition[784]: fetch: fetch complete
Mar 25 01:40:11.360407 unknown[784]: fetched user config from "hetzner"
Mar 25 01:40:11.361087 ignition[784]: fetch: fetch passed
Mar 25 01:40:11.364909 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Mar 25 01:40:11.361153 ignition[784]: Ignition finished successfully
Mar 25 01:40:11.368552 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Mar 25 01:40:11.409538 ignition[791]: Ignition 2.20.0
Mar 25 01:40:11.409562 ignition[791]: Stage: kargs
Mar 25 01:40:11.409875 ignition[791]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:40:11.409895 ignition[791]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:40:11.414791 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Mar 25 01:40:11.412123 ignition[791]: kargs: kargs passed
Mar 25 01:40:11.412201 ignition[791]: Ignition finished successfully
Mar 25 01:40:11.420525 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Mar 25 01:40:11.460426 ignition[798]: Ignition 2.20.0
Mar 25 01:40:11.460442 ignition[798]: Stage: disks
Mar 25 01:40:11.460775 ignition[798]: no configs at "/usr/lib/ignition/base.d"
Mar 25 01:40:11.460794 ignition[798]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:40:11.464513 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Mar 25 01:40:11.462496 ignition[798]: disks: disks passed
Mar 25 01:40:11.466601 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Mar 25 01:40:11.462556 ignition[798]: Ignition finished successfully Mar 25 01:40:11.467966 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Mar 25 01:40:11.469871 systemd[1]: Reached target local-fs.target - Local File Systems. Mar 25 01:40:11.471576 systemd[1]: Reached target sysinit.target - System Initialization. Mar 25 01:40:11.473618 systemd[1]: Reached target basic.target - Basic System. Mar 25 01:40:11.478542 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Mar 25 01:40:11.513317 systemd-fsck[807]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Mar 25 01:40:11.518540 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Mar 25 01:40:11.522316 systemd[1]: Mounting sysroot.mount - /sysroot... Mar 25 01:40:11.657378 kernel: EXT4-fs (sda9): mounted filesystem 4e6dca82-2e50-453c-be25-61f944b72008 r/w with ordered data mode. Quota mode: none. Mar 25 01:40:11.658114 systemd[1]: Mounted sysroot.mount - /sysroot. Mar 25 01:40:11.659286 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Mar 25 01:40:11.662644 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:40:11.676170 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Mar 25 01:40:11.678467 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Mar 25 01:40:11.681726 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 25 01:40:11.682488 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:40:11.691242 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Mar 25 01:40:11.693450 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Mar 25 01:40:11.701389 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (815) Mar 25 01:40:11.715668 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:40:11.715757 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:40:11.715779 kernel: BTRFS info (device sda6): using free space tree Mar 25 01:40:11.733056 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 25 01:40:11.733176 kernel: BTRFS info (device sda6): auto enabling async discard Mar 25 01:40:11.746602 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Mar 25 01:40:11.778771 coreos-metadata[817]: Mar 25 01:40:11.778 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Mar 25 01:40:11.780838 coreos-metadata[817]: Mar 25 01:40:11.780 INFO Fetch successful Mar 25 01:40:11.782708 coreos-metadata[817]: Mar 25 01:40:11.781 INFO wrote hostname ci-4284-0-0-2-22d395eace to /sysroot/etc/hostname Mar 25 01:40:11.784304 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 25 01:40:11.789749 initrd-setup-root[843]: cut: /sysroot/etc/passwd: No such file or directory Mar 25 01:40:11.796319 initrd-setup-root[850]: cut: /sysroot/etc/group: No such file or directory Mar 25 01:40:11.801503 initrd-setup-root[857]: cut: /sysroot/etc/shadow: No such file or directory Mar 25 01:40:11.806587 initrd-setup-root[864]: cut: /sysroot/etc/gshadow: No such file or directory Mar 25 01:40:11.927859 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Mar 25 01:40:11.932021 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Mar 25 01:40:11.936505 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Mar 25 01:40:11.951063 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Mar 25 01:40:11.955736 kernel: BTRFS info (device sda6): last unmount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:40:11.980418 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Mar 25 01:40:11.992382 ignition[932]: INFO : Ignition 2.20.0 Mar 25 01:40:11.992382 ignition[932]: INFO : Stage: mount Mar 25 01:40:11.995493 ignition[932]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:40:11.995493 ignition[932]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:40:11.995493 ignition[932]: INFO : mount: mount passed Mar 25 01:40:11.995493 ignition[932]: INFO : Ignition finished successfully Mar 25 01:40:11.996829 systemd[1]: Finished ignition-mount.service - Ignition (mount). Mar 25 01:40:12.000948 systemd[1]: Starting ignition-files.service - Ignition (files)... Mar 25 01:40:12.025982 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Mar 25 01:40:12.056392 kernel: BTRFS: device label OEM devid 1 transid 17 /dev/sda6 scanned by mount (943) Mar 25 01:40:12.060032 kernel: BTRFS info (device sda6): first mount of filesystem a72930ba-1354-475c-94df-b83a66efea67 Mar 25 01:40:12.060082 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 25 01:40:12.060104 kernel: BTRFS info (device sda6): using free space tree Mar 25 01:40:12.073710 kernel: BTRFS info (device sda6): enabling ssd optimizations Mar 25 01:40:12.073767 kernel: BTRFS info (device sda6): auto enabling async discard Mar 25 01:40:12.079670 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Mar 25 01:40:12.113703 ignition[959]: INFO : Ignition 2.20.0
Mar 25 01:40:12.113703 ignition[959]: INFO : Stage: files
Mar 25 01:40:12.114937 ignition[959]: INFO : no configs at "/usr/lib/ignition/base.d"
Mar 25 01:40:12.114937 ignition[959]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Mar 25 01:40:12.116615 ignition[959]: DEBUG : files: compiled without relabeling support, skipping
Mar 25 01:40:12.117551 ignition[959]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Mar 25 01:40:12.117551 ignition[959]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Mar 25 01:40:12.121843 ignition[959]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Mar 25 01:40:12.122517 ignition[959]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Mar 25 01:40:12.123474 ignition[959]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Mar 25 01:40:12.122643 unknown[959]: wrote ssh authorized keys file for user: core
Mar 25 01:40:12.125265 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 25 01:40:12.126172 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Mar 25 01:40:12.317847 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Mar 25 01:40:12.558586 systemd-networkd[781]: eth0: Gained IPv6LL
Mar 25 01:40:12.678772 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:40:12.680004 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Mar 25 01:40:12.686291 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:40:12.686291 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Mar 25 01:40:12.686291 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 25 01:40:12.686291 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 25 01:40:12.686291 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 25 01:40:12.686291 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
Mar 25 01:40:12.687405 systemd-networkd[781]: eth1: Gained IPv6LL
Mar 25 01:40:13.467991 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Mar 25 01:40:15.031219 ignition[959]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Mar 25 01:40:15.031219 ignition[959]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Mar 25 01:40:15.036029 ignition[959]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:40:15.036029 ignition[959]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Mar 25 01:40:15.036029 ignition[959]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Mar 25 01:40:15.036029 ignition[959]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Mar 25 01:40:15.048134 ignition[959]: INFO : files: files passed
Mar 25 01:40:15.048134 ignition[959]: INFO : Ignition finished successfully
Mar 25 01:40:15.039966 systemd[1]: Finished ignition-files.service - Ignition (files).
Mar 25 01:40:15.048530 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Mar 25 01:40:15.063537 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Mar 25 01:40:15.070768 systemd[1]: ignition-quench.service: Deactivated successfully.
Mar 25 01:40:15.080579 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Mar 25 01:40:15.094692 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:40:15.098128 initrd-setup-root-after-ignition[990]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:40:15.099739 initrd-setup-root-after-ignition[994]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Mar 25 01:40:15.099955 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Mar 25 01:40:15.103183 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Mar 25 01:40:15.107555 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Mar 25 01:40:15.168101 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Mar 25 01:40:15.168274 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Mar 25 01:40:15.170966 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Mar 25 01:40:15.172831 systemd[1]: Reached target initrd.target - Initrd Default Target. Mar 25 01:40:15.175077 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Mar 25 01:40:15.176556 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Mar 25 01:40:15.207632 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:40:15.212562 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Mar 25 01:40:15.239189 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Mar 25 01:40:15.240534 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:40:15.242934 systemd[1]: Stopped target timers.target - Timer Units. Mar 25 01:40:15.245139 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 25 01:40:15.245335 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Mar 25 01:40:15.249103 systemd[1]: Stopped target initrd.target - Initrd Default Target. Mar 25 01:40:15.250531 systemd[1]: Stopped target basic.target - Basic System. Mar 25 01:40:15.252491 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Mar 25 01:40:15.254422 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Mar 25 01:40:15.256742 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Mar 25 01:40:15.259098 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Mar 25 01:40:15.261319 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Mar 25 01:40:15.264514 systemd[1]: Stopped target sysinit.target - System Initialization. Mar 25 01:40:15.266794 systemd[1]: Stopped target local-fs.target - Local File Systems. Mar 25 01:40:15.268961 systemd[1]: Stopped target swap.target - Swaps. 
Mar 25 01:40:15.270779 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 25 01:40:15.270966 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Mar 25 01:40:15.273641 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Mar 25 01:40:15.275141 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:40:15.277311 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Mar 25 01:40:15.279489 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:40:15.281044 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 25 01:40:15.281234 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Mar 25 01:40:15.284129 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 25 01:40:15.284375 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Mar 25 01:40:15.285606 systemd[1]: ignition-files.service: Deactivated successfully. Mar 25 01:40:15.285781 systemd[1]: Stopped ignition-files.service - Ignition (files). Mar 25 01:40:15.287682 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Mar 25 01:40:15.287936 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Mar 25 01:40:15.293685 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Mar 25 01:40:15.305646 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Mar 25 01:40:15.309490 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 25 01:40:15.309762 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Mar 25 01:40:15.315255 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 25 01:40:15.316700 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. 
Mar 25 01:40:15.327369 ignition[1014]: INFO : Ignition 2.20.0 Mar 25 01:40:15.327369 ignition[1014]: INFO : Stage: umount Mar 25 01:40:15.327369 ignition[1014]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 25 01:40:15.327369 ignition[1014]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Mar 25 01:40:15.334672 ignition[1014]: INFO : umount: umount passed Mar 25 01:40:15.334672 ignition[1014]: INFO : Ignition finished successfully Mar 25 01:40:15.332167 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 25 01:40:15.332291 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Mar 25 01:40:15.340892 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 25 01:40:15.349213 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Mar 25 01:40:15.352507 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 25 01:40:15.353105 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 25 01:40:15.353159 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Mar 25 01:40:15.359869 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 25 01:40:15.359941 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Mar 25 01:40:15.361822 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 25 01:40:15.361876 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Mar 25 01:40:15.363263 systemd[1]: Stopped target network.target - Network. Mar 25 01:40:15.364752 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 25 01:40:15.364813 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Mar 25 01:40:15.366372 systemd[1]: Stopped target paths.target - Path Units. Mar 25 01:40:15.368820 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 25 01:40:15.372492 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Mar 25 01:40:15.380869 systemd[1]: Stopped target slices.target - Slice Units. Mar 25 01:40:15.382832 systemd[1]: Stopped target sockets.target - Socket Units. Mar 25 01:40:15.384572 systemd[1]: iscsid.socket: Deactivated successfully. Mar 25 01:40:15.384634 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Mar 25 01:40:15.386586 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 25 01:40:15.386644 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Mar 25 01:40:15.389055 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 25 01:40:15.389133 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Mar 25 01:40:15.395615 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Mar 25 01:40:15.395676 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Mar 25 01:40:15.396652 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Mar 25 01:40:15.399573 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Mar 25 01:40:15.403047 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 25 01:40:15.403186 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Mar 25 01:40:15.405944 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 25 01:40:15.406132 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Mar 25 01:40:15.412740 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Mar 25 01:40:15.413134 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 25 01:40:15.413303 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Mar 25 01:40:15.416227 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Mar 25 01:40:15.417275 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 25 01:40:15.417337 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Mar 25 01:40:15.418938 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 25 01:40:15.419010 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Mar 25 01:40:15.422493 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Mar 25 01:40:15.423544 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 25 01:40:15.423609 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Mar 25 01:40:15.426833 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 25 01:40:15.426892 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Mar 25 01:40:15.428538 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 25 01:40:15.428609 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Mar 25 01:40:15.430775 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Mar 25 01:40:15.430835 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Mar 25 01:40:15.433066 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Mar 25 01:40:15.435947 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 25 01:40:15.436027 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Mar 25 01:40:15.452594 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 25 01:40:15.452804 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Mar 25 01:40:15.456090 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 25 01:40:15.456217 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Mar 25 01:40:15.458563 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 25 01:40:15.458633 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. 
Mar 25 01:40:15.460639 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 25 01:40:15.460680 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:40:15.462325 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 25 01:40:15.462421 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Mar 25 01:40:15.464898 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 25 01:40:15.464954 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Mar 25 01:40:15.466493 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 25 01:40:15.466549 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Mar 25 01:40:15.471491 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Mar 25 01:40:15.473185 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 25 01:40:15.473250 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Mar 25 01:40:15.477280 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Mar 25 01:40:15.477366 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Mar 25 01:40:15.478189 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 25 01:40:15.478244 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Mar 25 01:40:15.479158 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 25 01:40:15.479215 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Mar 25 01:40:15.485795 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 25 01:40:15.485878 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Mar 25 01:40:15.487185 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 25 01:40:15.487288 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Mar 25 01:40:15.488804 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Mar 25 01:40:15.493487 systemd[1]: Starting initrd-switch-root.service - Switch Root... Mar 25 01:40:15.515432 systemd[1]: Switching root. Mar 25 01:40:15.574614 systemd-journald[186]: Journal stopped Mar 25 01:40:16.858152 systemd-journald[186]: Received SIGTERM from PID 1 (systemd). Mar 25 01:40:16.858221 kernel: SELinux: policy capability network_peer_controls=1 Mar 25 01:40:16.858239 kernel: SELinux: policy capability open_perms=1 Mar 25 01:40:16.858258 kernel: SELinux: policy capability extended_socket_class=1 Mar 25 01:40:16.858274 kernel: SELinux: policy capability always_check_network=0 Mar 25 01:40:16.858287 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 25 01:40:16.858300 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 25 01:40:16.858314 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 25 01:40:16.858331 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 25 01:40:16.859180 kernel: audit: type=1403 audit(1742866815.725:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 25 01:40:16.859199 systemd[1]: Successfully loaded SELinux policy in 58.671ms. Mar 25 01:40:16.859222 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 18.351ms. Mar 25 01:40:16.859233 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Mar 25 01:40:16.859243 systemd[1]: Detected virtualization kvm. 
Mar 25 01:40:16.859254 systemd[1]: Detected architecture x86-64. Mar 25 01:40:16.859263 systemd[1]: Detected first boot. Mar 25 01:40:16.859276 systemd[1]: Hostname set to . Mar 25 01:40:16.859285 systemd[1]: Initializing machine ID from VM UUID. Mar 25 01:40:16.859296 zram_generator::config[1059]: No configuration found. Mar 25 01:40:16.859307 kernel: Guest personality initialized and is inactive Mar 25 01:40:16.859316 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Mar 25 01:40:16.859325 kernel: Initialized host personality Mar 25 01:40:16.859336 kernel: NET: Registered PF_VSOCK protocol family Mar 25 01:40:16.860793 systemd[1]: Populated /etc with preset unit settings. Mar 25 01:40:16.860810 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Mar 25 01:40:16.860824 systemd[1]: initrd-switch-root.service: Deactivated successfully. Mar 25 01:40:16.860834 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Mar 25 01:40:16.860847 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Mar 25 01:40:16.860857 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Mar 25 01:40:16.860867 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Mar 25 01:40:16.860877 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Mar 25 01:40:16.860887 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Mar 25 01:40:16.860897 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Mar 25 01:40:16.860908 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Mar 25 01:40:16.860918 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Mar 25 01:40:16.860928 systemd[1]: Created slice user.slice - User and Session Slice. 
Mar 25 01:40:16.860938 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Mar 25 01:40:16.860948 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Mar 25 01:40:16.860960 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Mar 25 01:40:16.860970 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Mar 25 01:40:16.860982 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Mar 25 01:40:16.860993 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Mar 25 01:40:16.861003 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Mar 25 01:40:16.861014 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Mar 25 01:40:16.861024 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Mar 25 01:40:16.861039 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Mar 25 01:40:16.861057 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Mar 25 01:40:16.861071 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Mar 25 01:40:16.861085 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Mar 25 01:40:16.861098 systemd[1]: Reached target remote-fs.target - Remote File Systems. Mar 25 01:40:16.861112 systemd[1]: Reached target slices.target - Slice Units. Mar 25 01:40:16.861125 systemd[1]: Reached target swap.target - Swaps. Mar 25 01:40:16.861139 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Mar 25 01:40:16.861153 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Mar 25 01:40:16.861174 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. 
Mar 25 01:40:16.861186 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Mar 25 01:40:16.861197 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Mar 25 01:40:16.861208 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Mar 25 01:40:16.861220 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Mar 25 01:40:16.861229 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Mar 25 01:40:16.861240 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Mar 25 01:40:16.861250 systemd[1]: Mounting media.mount - External Media Directory... Mar 25 01:40:16.861261 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 25 01:40:16.861273 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Mar 25 01:40:16.861283 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Mar 25 01:40:16.861293 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Mar 25 01:40:16.861304 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 25 01:40:16.861314 systemd[1]: Reached target machines.target - Containers. Mar 25 01:40:16.861324 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Mar 25 01:40:16.861335 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Mar 25 01:40:16.861368 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Mar 25 01:40:16.861382 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Mar 25 01:40:16.861392 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Mar 25 01:40:16.861402 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 01:40:16.861413 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:40:16.861423 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Mar 25 01:40:16.861433 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:40:16.861443 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Mar 25 01:40:16.861469 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Mar 25 01:40:16.861479 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Mar 25 01:40:16.861491 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Mar 25 01:40:16.861502 systemd[1]: Stopped systemd-fsck-usr.service.
Mar 25 01:40:16.861513 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:40:16.861524 systemd[1]: Starting systemd-journald.service - Journal Service...
Mar 25 01:40:16.861534 kernel: fuse: init (API version 7.39)
Mar 25 01:40:16.861543 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Mar 25 01:40:16.861554 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Mar 25 01:40:16.861565 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Mar 25 01:40:16.861577 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Mar 25 01:40:16.861588 kernel: loop: module loaded
Mar 25 01:40:16.861598 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Mar 25 01:40:16.861608 systemd[1]: verity-setup.service: Deactivated successfully.
Mar 25 01:40:16.861620 systemd[1]: Stopped verity-setup.service.
Mar 25 01:40:16.861635 kernel: ACPI: bus type drm_connector registered
Mar 25 01:40:16.861648 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:16.861662 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Mar 25 01:40:16.861678 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Mar 25 01:40:16.861692 systemd[1]: Mounted media.mount - External Media Directory.
Mar 25 01:40:16.861706 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Mar 25 01:40:16.861717 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Mar 25 01:40:16.861730 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Mar 25 01:40:16.861744 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Mar 25 01:40:16.861756 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Mar 25 01:40:16.861766 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Mar 25 01:40:16.861777 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Mar 25 01:40:16.861787 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:40:16.861797 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:40:16.861808 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 01:40:16.861819 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 01:40:16.861831 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:40:16.861845 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:40:16.861856 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Mar 25 01:40:16.861890 systemd-journald[1150]: Collecting audit messages is disabled.
Mar 25 01:40:16.861922 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Mar 25 01:40:16.861942 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:40:16.861959 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:40:16.861974 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Mar 25 01:40:16.861990 systemd-journald[1150]: Journal started
Mar 25 01:40:16.862022 systemd-journald[1150]: Runtime Journal (/run/log/journal/4e3d6412854648d8af6f59d755643ef8) is 4.7M, max 38.3M, 33.5M free.
Mar 25 01:40:16.446228 systemd[1]: Queued start job for default target multi-user.target.
Mar 25 01:40:16.456412 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Mar 25 01:40:16.865406 systemd[1]: Started systemd-journald.service - Journal Service.
Mar 25 01:40:16.457139 systemd[1]: systemd-journald.service: Deactivated successfully.
Mar 25 01:40:16.864832 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Mar 25 01:40:16.866980 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Mar 25 01:40:16.867894 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Mar 25 01:40:16.877128 systemd[1]: Reached target network-pre.target - Preparation for Network.
Mar 25 01:40:16.879472 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Mar 25 01:40:16.883437 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Mar 25 01:40:16.884962 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Mar 25 01:40:16.885403 systemd[1]: Reached target local-fs.target - Local File Systems.
Mar 25 01:40:16.887469 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Mar 25 01:40:16.891333 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Mar 25 01:40:16.895784 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Mar 25 01:40:16.897497 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:40:16.905497 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Mar 25 01:40:16.907666 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Mar 25 01:40:16.908253 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:40:16.910491 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Mar 25 01:40:16.911213 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:40:16.914493 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Mar 25 01:40:16.918385 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Mar 25 01:40:16.923783 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Mar 25 01:40:16.927841 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Mar 25 01:40:16.933933 systemd-journald[1150]: Time spent on flushing to /var/log/journal/4e3d6412854648d8af6f59d755643ef8 is 31.052ms for 1144 entries.
Mar 25 01:40:16.933933 systemd-journald[1150]: System Journal (/var/log/journal/4e3d6412854648d8af6f59d755643ef8) is 8M, max 584.8M, 576.8M free.
Mar 25 01:40:16.984942 systemd-journald[1150]: Received client request to flush runtime journal.
Mar 25 01:40:16.984996 kernel: loop0: detected capacity change from 0 to 151640
Mar 25 01:40:16.932603 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Mar 25 01:40:16.933165 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Mar 25 01:40:16.938752 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Mar 25 01:40:16.940630 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Mar 25 01:40:16.945175 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Mar 25 01:40:16.950935 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Mar 25 01:40:16.956581 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Mar 25 01:40:16.988131 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Mar 25 01:40:16.991232 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Mar 25 01:40:17.004873 udevadm[1194]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Mar 25 01:40:17.018374 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Mar 25 01:40:17.022180 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Mar 25 01:40:17.025109 systemd-tmpfiles[1186]: ACLs are not supported, ignoring.
Mar 25 01:40:17.025124 systemd-tmpfiles[1186]: ACLs are not supported, ignoring.
Mar 25 01:40:17.031211 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Mar 25 01:40:17.033839 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Mar 25 01:40:17.040382 kernel: loop1: detected capacity change from 0 to 109808
Mar 25 01:40:17.080223 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Mar 25 01:40:17.081493 kernel: loop2: detected capacity change from 0 to 205544
Mar 25 01:40:17.082874 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Mar 25 01:40:17.113026 systemd-tmpfiles[1208]: ACLs are not supported, ignoring.
Mar 25 01:40:17.113044 systemd-tmpfiles[1208]: ACLs are not supported, ignoring.
Mar 25 01:40:17.116898 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Mar 25 01:40:17.147375 kernel: loop3: detected capacity change from 0 to 8
Mar 25 01:40:17.169392 kernel: loop4: detected capacity change from 0 to 151640
Mar 25 01:40:17.193390 kernel: loop5: detected capacity change from 0 to 109808
Mar 25 01:40:17.224384 kernel: loop6: detected capacity change from 0 to 205544
Mar 25 01:40:17.259378 kernel: loop7: detected capacity change from 0 to 8
Mar 25 01:40:17.260167 (sd-merge)[1213]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Mar 25 01:40:17.260620 (sd-merge)[1213]: Merged extensions into '/usr'.
Mar 25 01:40:17.267639 systemd[1]: Reload requested from client PID 1185 ('systemd-sysext') (unit systemd-sysext.service)...
Mar 25 01:40:17.267787 systemd[1]: Reloading...
Mar 25 01:40:17.337401 zram_generator::config[1239]: No configuration found.
Mar 25 01:40:17.433375 ldconfig[1180]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Mar 25 01:40:17.485679 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:40:17.563092 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Mar 25 01:40:17.563528 systemd[1]: Reloading finished in 295 ms.
Mar 25 01:40:17.578745 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Mar 25 01:40:17.579723 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Mar 25 01:40:17.589479 systemd[1]: Starting ensure-sysext.service...
Mar 25 01:40:17.594486 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Mar 25 01:40:17.611205 systemd[1]: Reload requested from client PID 1284 ('systemctl') (unit ensure-sysext.service)...
Mar 25 01:40:17.611424 systemd[1]: Reloading...
Mar 25 01:40:17.625838 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Mar 25 01:40:17.626050 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Mar 25 01:40:17.629604 systemd-tmpfiles[1285]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Mar 25 01:40:17.629824 systemd-tmpfiles[1285]: ACLs are not supported, ignoring.
Mar 25 01:40:17.629867 systemd-tmpfiles[1285]: ACLs are not supported, ignoring.
Mar 25 01:40:17.636932 systemd-tmpfiles[1285]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 01:40:17.636945 systemd-tmpfiles[1285]: Skipping /boot
Mar 25 01:40:17.651607 systemd-tmpfiles[1285]: Detected autofs mount point /boot during canonicalization of boot.
Mar 25 01:40:17.652641 systemd-tmpfiles[1285]: Skipping /boot
Mar 25 01:40:17.668369 zram_generator::config[1310]: No configuration found.
Mar 25 01:40:17.776713 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:40:17.846698 systemd[1]: Reloading finished in 234 ms.
Mar 25 01:40:17.857901 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Mar 25 01:40:17.858787 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Mar 25 01:40:17.875442 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 01:40:17.878665 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Mar 25 01:40:17.885505 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Mar 25 01:40:17.891512 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Mar 25 01:40:17.895567 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Mar 25 01:40:17.901887 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Mar 25 01:40:17.920932 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Mar 25 01:40:17.927781 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:17.928547 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:40:17.937624 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:40:17.940615 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:40:17.949964 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:40:17.950693 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:40:17.950798 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:40:17.950897 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:17.958301 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:17.959575 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:40:17.959792 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:40:17.959927 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:40:17.960056 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:17.966067 systemd-udevd[1363]: Using default interface naming scheme 'v255'.
Mar 25 01:40:17.971627 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:40:17.972167 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:40:17.973940 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:40:17.974700 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:40:17.976200 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:40:17.976825 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:40:17.981984 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:17.982258 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:40:17.985547 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Mar 25 01:40:17.986663 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:40:17.986760 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:40:17.986879 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:40:17.986984 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:40:17.987082 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:17.990395 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Mar 25 01:40:17.994435 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Mar 25 01:40:17.997393 systemd[1]: Finished ensure-sysext.service.
Mar 25 01:40:18.006983 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Mar 25 01:40:18.010577 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Mar 25 01:40:18.011500 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Mar 25 01:40:18.012308 systemd[1]: modprobe@drm.service: Deactivated successfully.
Mar 25 01:40:18.012500 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Mar 25 01:40:18.038515 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Mar 25 01:40:18.042648 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 25 01:40:18.048662 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Mar 25 01:40:18.051358 augenrules[1404]: No rules
Mar 25 01:40:18.049887 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Mar 25 01:40:18.054113 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Mar 25 01:40:18.056616 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 01:40:18.056801 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 01:40:18.114382 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Mar 25 01:40:18.165148 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Mar 25 01:40:18.166062 systemd[1]: Reached target time-set.target - System Time Set.
Mar 25 01:40:18.176727 systemd-resolved[1362]: Positive Trust Anchors:
Mar 25 01:40:18.176745 systemd-resolved[1362]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Mar 25 01:40:18.176776 systemd-resolved[1362]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Mar 25 01:40:18.187608 systemd-resolved[1362]: Using system hostname 'ci-4284-0-0-2-22d395eace'.
Mar 25 01:40:18.189543 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Mar 25 01:40:18.190100 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Mar 25 01:40:18.211880 systemd-networkd[1415]: lo: Link UP
Mar 25 01:40:18.211891 systemd-networkd[1415]: lo: Gained carrier
Mar 25 01:40:18.213296 systemd-networkd[1415]: Enumeration completed
Mar 25 01:40:18.213398 systemd[1]: Started systemd-networkd.service - Network Configuration.
Mar 25 01:40:18.214799 systemd[1]: Reached target network.target - Network.
Mar 25 01:40:18.217275 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Mar 25 01:40:18.220220 systemd-networkd[1415]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:18.220226 systemd-networkd[1415]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:40:18.220238 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Mar 25 01:40:18.222445 systemd-networkd[1415]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:18.222884 systemd-networkd[1415]: eth0: Link UP
Mar 25 01:40:18.222890 systemd-networkd[1415]: eth0: Gained carrier
Mar 25 01:40:18.222902 systemd-networkd[1415]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:18.235203 systemd-networkd[1415]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:18.235326 systemd-networkd[1415]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Mar 25 01:40:18.235923 systemd-networkd[1415]: eth1: Link UP
Mar 25 01:40:18.236041 systemd-networkd[1415]: eth1: Gained carrier
Mar 25 01:40:18.236093 systemd-networkd[1415]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Mar 25 01:40:18.245571 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Mar 25 01:40:18.248361 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Mar 25 01:40:18.258382 kernel: ACPI: button: Power Button [PWRF]
Mar 25 01:40:18.260366 kernel: mousedev: PS/2 mouse device common for all mice
Mar 25 01:40:18.262470 systemd-networkd[1415]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Mar 25 01:40:18.265913 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Mar 25 01:40:18.275436 systemd-networkd[1415]: eth0: DHCPv4 address 37.27.205.216/32, gateway 172.31.1.1 acquired from 172.31.1.1
Mar 25 01:40:18.275844 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Mar 25 01:40:18.276805 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Mar 25 01:40:18.279288 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Mar 25 01:40:18.279333 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:18.279445 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Mar 25 01:40:18.282302 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Mar 25 01:40:18.286953 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Mar 25 01:40:18.288431 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1428)
Mar 25 01:40:18.292121 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Mar 25 01:40:18.292704 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Mar 25 01:40:18.292730 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Mar 25 01:40:18.292753 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Mar 25 01:40:18.292765 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Mar 25 01:40:18.308742 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Mar 25 01:40:18.309155 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Mar 25 01:40:18.310700 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Mar 25 01:40:18.311106 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Mar 25 01:40:18.312303 systemd[1]: modprobe@loop.service: Deactivated successfully.
Mar 25 01:40:18.312939 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Mar 25 01:40:18.316524 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Mar 25 01:40:18.317496 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Mar 25 01:40:18.342437 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Mar 25 01:40:18.345543 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Mar 25 01:40:18.371806 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Mar 25 01:40:18.372154 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Mar 25 01:40:18.382687 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input4
Mar 25 01:40:18.386537 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:40:18.389087 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Mar 25 01:40:18.391646 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Mar 25 01:40:18.391793 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Mar 25 01:40:18.400365 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Mar 25 01:40:18.400611 kernel: EDAC MC: Ver: 3.0.0
Mar 25 01:40:18.402685 kernel: Console: switching to colour dummy device 80x25
Mar 25 01:40:18.408050 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Mar 25 01:40:18.408105 kernel: [drm] features: -context_init
Mar 25 01:40:18.412362 kernel: [drm] number of scanouts: 1
Mar 25 01:40:18.412421 kernel: [drm] number of cap sets: 0
Mar 25 01:40:18.415377 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
Mar 25 01:40:18.418393 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Mar 25 01:40:18.419367 kernel: Console: switching to colour frame buffer device 160x50
Mar 25 01:40:18.431379 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Mar 25 01:40:18.432559 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Mar 25 01:40:18.432767 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:40:18.436694 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Mar 25 01:40:18.512230 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Mar 25 01:40:18.571631 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Mar 25 01:40:18.573472 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Mar 25 01:40:18.593436 lvm[1475]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:40:18.619320 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Mar 25 01:40:18.620636 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Mar 25 01:40:18.620742 systemd[1]: Reached target sysinit.target - System Initialization.
Mar 25 01:40:18.620890 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Mar 25 01:40:18.620980 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Mar 25 01:40:18.621253 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Mar 25 01:40:18.621408 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Mar 25 01:40:18.621479 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Mar 25 01:40:18.621577 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Mar 25 01:40:18.621601 systemd[1]: Reached target paths.target - Path Units.
Mar 25 01:40:18.621696 systemd[1]: Reached target timers.target - Timer Units.
Mar 25 01:40:18.623943 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Mar 25 01:40:18.625646 systemd[1]: Starting docker.socket - Docker Socket for the API...
Mar 25 01:40:18.629718 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Mar 25 01:40:18.631177 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Mar 25 01:40:18.632433 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Mar 25 01:40:18.641028 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Mar 25 01:40:18.642403 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Mar 25 01:40:18.643770 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Mar 25 01:40:18.646297 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Mar 25 01:40:18.648960 systemd[1]: Reached target sockets.target - Socket Units.
Mar 25 01:40:18.649440 systemd[1]: Reached target basic.target - Basic System.
Mar 25 01:40:18.649960 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:40:18.649982 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Mar 25 01:40:18.660013 lvm[1479]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Mar 25 01:40:18.656461 systemd[1]: Starting containerd.service - containerd container runtime...
Mar 25 01:40:18.660656 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Mar 25 01:40:18.665514 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Mar 25 01:40:18.670359 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Mar 25 01:40:18.678040 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Mar 25 01:40:18.679279 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Mar 25 01:40:18.683511 jq[1483]: false
Mar 25 01:40:18.686330 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Mar 25 01:40:18.691577 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Mar 25 01:40:18.698688 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Mar 25 01:40:18.706610 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Mar 25 01:40:18.714679 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Mar 25 01:40:18.722545 coreos-metadata[1481]: Mar 25 01:40:18.721 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Mar 25 01:40:18.740027 coreos-metadata[1481]: Mar 25 01:40:18.726 INFO Fetch successful
Mar 25 01:40:18.740027 coreos-metadata[1481]: Mar 25 01:40:18.726 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Mar 25 01:40:18.740027 coreos-metadata[1481]: Mar 25 01:40:18.728 INFO Fetch successful
Mar 25 01:40:18.724127 systemd[1]: Starting systemd-logind.service - User Login Management...
Mar 25 01:40:18.727005 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Mar 25 01:40:18.731832 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Mar 25 01:40:18.735542 systemd[1]: Starting update-engine.service - Update Engine...
Mar 25 01:40:18.743049 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Mar 25 01:40:18.750416 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Mar 25 01:40:18.764872 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Mar 25 01:40:18.765063 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Mar 25 01:40:18.765290 systemd[1]: motdgen.service: Deactivated successfully.
Mar 25 01:40:18.766653 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Mar 25 01:40:18.768420 jq[1501]: true
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found loop4
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found loop5
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found loop6
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found loop7
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found sda
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found sda1
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found sda2
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found sda3
Mar 25 01:40:18.774546 extend-filesystems[1484]: Found usr
Mar 25 01:40:18.848994 extend-filesystems[1484]: Found sda4
Mar 25 01:40:18.848994 extend-filesystems[1484]: Found sda6
Mar 25 01:40:18.848994 extend-filesystems[1484]: Found sda7
Mar 25 01:40:18.848994 extend-filesystems[1484]: Found sda9
Mar 25 01:40:18.848994 extend-filesystems[1484]: Checking size of /dev/sda9
Mar 25 01:40:18.776713 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Mar 25 01:40:18.808768 dbus-daemon[1482]: [system] SELinux support is enabled
Mar 25 01:40:18.873832 update_engine[1500]: I20250325 01:40:18.793942 1500 main.cc:92] Flatcar Update Engine starting
Mar 25 01:40:18.873832 update_engine[1500]: I20250325 01:40:18.812606 1500 update_check_scheduler.cc:74] Next update check in 3m53s
Mar 25 01:40:18.874061 extend-filesystems[1484]: Resized partition /dev/sda9
Mar 25 01:40:18.904360 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Mar 25 01:40:18.776947 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Mar 25 01:40:18.904612 tar[1506]: linux-amd64/helm
Mar 25 01:40:18.904878 extend-filesystems[1527]: resize2fs 1.47.2 (1-Jan-2025)
Mar 25 01:40:18.812525 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Mar 25 01:40:18.823755 (ntainerd)[1510]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Mar 25 01:40:18.906542 jq[1509]: true
Mar 25 01:40:18.828592 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Mar 25 01:40:18.828648 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Mar 25 01:40:18.845976 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Mar 25 01:40:18.846028 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Mar 25 01:40:18.868443 systemd[1]: Started update-engine.service - Update Engine.
Mar 25 01:40:18.886479 systemd-logind[1499]: New seat seat0.
Mar 25 01:40:18.912813 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Mar 25 01:40:18.928806 systemd-logind[1499]: Watching system buttons on /dev/input/event2 (Power Button)
Mar 25 01:40:18.928828 systemd-logind[1499]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Mar 25 01:40:18.929234 systemd[1]: Started systemd-logind.service - User Login Management.
Mar 25 01:40:19.016369 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1414)
Mar 25 01:40:19.061458 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Mar 25 01:40:19.064810 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Mar 25 01:40:19.096401 bash[1550]: Updated "/home/core/.ssh/authorized_keys"
Mar 25 01:40:19.091716 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Mar 25 01:40:19.096924 systemd[1]: Starting sshkeys.service...
Mar 25 01:40:19.126196 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Mar 25 01:40:19.134784 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Mar 25 01:40:19.149118 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Mar 25 01:40:19.154972 locksmithd[1529]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Mar 25 01:40:19.192301 extend-filesystems[1527]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Mar 25 01:40:19.192301 extend-filesystems[1527]: old_desc_blocks = 1, new_desc_blocks = 5
Mar 25 01:40:19.192301 extend-filesystems[1527]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Mar 25 01:40:19.200017 extend-filesystems[1484]: Resized filesystem in /dev/sda9
Mar 25 01:40:19.200017 extend-filesystems[1484]: Found sr0
Mar 25 01:40:19.192631 systemd[1]: extend-filesystems.service: Deactivated successfully.
Mar 25 01:40:19.192816 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Mar 25 01:40:19.217091 coreos-metadata[1562]: Mar 25 01:40:19.216 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Mar 25 01:40:19.220413 coreos-metadata[1562]: Mar 25 01:40:19.219 INFO Fetch successful
Mar 25 01:40:19.225432 unknown[1562]: wrote ssh authorized keys file for user: core
Mar 25 01:40:19.265923 update-ssh-keys[1570]: Updated "/home/core/.ssh/authorized_keys"
Mar 25 01:40:19.266847 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Mar 25 01:40:19.272821 systemd[1]: Finished sshkeys.service.
Mar 25 01:40:19.281210 containerd[1510]: time="2025-03-25T01:40:19Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Mar 25 01:40:19.283113 containerd[1510]: time="2025-03-25T01:40:19.282826458Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
Mar 25 01:40:19.305780 containerd[1510]: time="2025-03-25T01:40:19.305598389Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.384µs"
Mar 25 01:40:19.306224 containerd[1510]: time="2025-03-25T01:40:19.305778928Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Mar 25 01:40:19.306224 containerd[1510]: time="2025-03-25T01:40:19.306057630Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Mar 25 01:40:19.306303 containerd[1510]: time="2025-03-25T01:40:19.306224172Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Mar 25 01:40:19.306303 containerd[1510]: time="2025-03-25T01:40:19.306237818Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Mar 25 01:40:19.306303 containerd[1510]: time="2025-03-25T01:40:19.306259749Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 01:40:19.306397 containerd[1510]: time="2025-03-25T01:40:19.306308470Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Mar 25 01:40:19.306397 containerd[1510]: time="2025-03-25T01:40:19.306317748Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306594678Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306610708Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306619935Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306627308Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306688694Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306918976Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306946968Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Mar 25 01:40:19.307029 containerd[1510]: time="2025-03-25T01:40:19.306956396Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Mar 25 01:40:19.309384 containerd[1510]: time="2025-03-25T01:40:19.309366306Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Mar 25 01:40:19.309659 containerd[1510]: time="2025-03-25T01:40:19.309580768Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Mar 25 01:40:19.309659 containerd[1510]: time="2025-03-25T01:40:19.309631574Z" level=info msg="metadata content store policy set" policy=shared
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316127457Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316177191Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316194132Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316208238Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316221163Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316233817Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316249716Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316266087Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316279082Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316291475Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316302295Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316316251Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316437789Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Mar 25 01:40:19.318774 containerd[1510]: time="2025-03-25T01:40:19.316469158Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316486060Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316497100Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316506408Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316515956Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316526415Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316543668Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316554107Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316564146Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316572922Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316624179Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316636462Z" level=info msg="Start snapshots syncer"
Mar 25 01:40:19.319067 containerd[1510]: time="2025-03-25T01:40:19.316660317Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Mar 25 01:40:19.319319 containerd[1510]: time="2025-03-25T01:40:19.316873827Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Mar 25 01:40:19.319319 containerd[1510]: time="2025-03-25T01:40:19.316914042Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.316976319Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317047873Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317068211Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317078762Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317089101Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317098799Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317107746Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317116933Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317136960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317147350Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317156517Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317186854Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317198787Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Mar 25 01:40:19.319486 containerd[1510]: time="2025-03-25T01:40:19.317206541Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317216460Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317224435Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317233662Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317242649Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317256435Z" level=info msg="runtime interface created"
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317260943Z" level=info msg="created NRI interface"
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317277875Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317287563Z" level=info msg="Connect containerd service"
Mar 25 01:40:19.319721 containerd[1510]: time="2025-03-25T01:40:19.317311868Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Mar 25 01:40:19.321519 sshd_keygen[1519]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Mar 25 01:40:19.326229 containerd[1510]: time="2025-03-25T01:40:19.326195450Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Mar 25 01:40:19.344480 systemd-networkd[1415]: eth1: Gained IPv6LL
Mar 25 01:40:19.347790 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Mar 25 01:40:19.349561 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Mar 25 01:40:19.353885 systemd[1]: Reached target network-online.target - Network is Online.
Mar 25 01:40:19.359767 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:40:19.365010 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Mar 25 01:40:19.374752 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Mar 25 01:40:19.390150 systemd[1]: Starting issuegen.service - Generate /run/issue...
Mar 25 01:40:19.425491 systemd[1]: issuegen.service: Deactivated successfully.
Mar 25 01:40:19.425736 systemd[1]: Finished issuegen.service - Generate /run/issue.
Mar 25 01:40:19.431422 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Mar 25 01:40:19.451389 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Mar 25 01:40:19.470733 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Mar 25 01:40:19.478594 systemd[1]: Started getty@tty1.service - Getty on tty1.
Mar 25 01:40:19.490088 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Mar 25 01:40:19.491768 systemd[1]: Reached target getty.target - Login Prompts.
Mar 25 01:40:19.512698 containerd[1510]: time="2025-03-25T01:40:19.512652660Z" level=info msg="Start subscribing containerd event"
Mar 25 01:40:19.512882 containerd[1510]: time="2025-03-25T01:40:19.512856803Z" level=info msg="Start recovering state"
Mar 25 01:40:19.512999 containerd[1510]: time="2025-03-25T01:40:19.512989592Z" level=info msg="Start event monitor"
Mar 25 01:40:19.513427 containerd[1510]: time="2025-03-25T01:40:19.513417374Z" level=info msg="Start cni network conf syncer for default"
Mar 25 01:40:19.513837 containerd[1510]: time="2025-03-25T01:40:19.513486624Z" level=info msg="Start streaming server"
Mar 25 01:40:19.513837 containerd[1510]: time="2025-03-25T01:40:19.513500259Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Mar 25 01:40:19.513837 containerd[1510]: time="2025-03-25T01:40:19.513507553Z" level=info msg="runtime interface starting up..."
Mar 25 01:40:19.513837 containerd[1510]: time="2025-03-25T01:40:19.513512903Z" level=info msg="starting plugins..."
Mar 25 01:40:19.513837 containerd[1510]: time="2025-03-25T01:40:19.513528132Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Mar 25 01:40:19.514695 containerd[1510]: time="2025-03-25T01:40:19.514676857Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Mar 25 01:40:19.514741 containerd[1510]: time="2025-03-25T01:40:19.514719526Z" level=info msg=serving... address=/run/containerd/containerd.sock
Mar 25 01:40:19.514873 systemd[1]: Started containerd.service - containerd container runtime.
Mar 25 01:40:19.516682 containerd[1510]: time="2025-03-25T01:40:19.516655367Z" level=info msg="containerd successfully booted in 0.235928s"
Mar 25 01:40:19.535063 systemd-networkd[1415]: eth0: Gained IPv6LL
Mar 25 01:40:19.535709 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Mar 25 01:40:19.623480 tar[1506]: linux-amd64/LICENSE
Mar 25 01:40:19.623641 tar[1506]: linux-amd64/README.md
Mar 25 01:40:19.640066 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Mar 25 01:40:20.503033 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:40:20.507738 systemd[1]: Reached target multi-user.target - Multi-User System.
Mar 25 01:40:20.514952 systemd[1]: Startup finished in 1.464s (kernel) + 7.975s (initrd) + 4.846s (userspace) = 14.286s.
Mar 25 01:40:20.515721 (kubelet)[1624]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:40:21.443418 kubelet[1624]: E0325 01:40:21.443262 1624 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:40:21.447623 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:40:21.447866 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:40:21.448588 systemd[1]: kubelet.service: Consumed 1.382s CPU time, 235M memory peak.
Mar 25 01:40:31.635112 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Mar 25 01:40:31.639379 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:40:31.802905 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:40:31.814696 (kubelet)[1643]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:40:31.865874 kubelet[1643]: E0325 01:40:31.865690 1643 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:40:31.868986 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:40:31.869145 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:40:31.869479 systemd[1]: kubelet.service: Consumed 189ms CPU time, 96.1M memory peak.
Mar 25 01:40:41.883730 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Mar 25 01:40:41.886025 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:40:42.037788 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:40:42.041596 (kubelet)[1658]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:40:42.094128 kubelet[1658]: E0325 01:40:42.094049 1658 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:40:42.097837 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:40:42.098039 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:40:42.098486 systemd[1]: kubelet.service: Consumed 175ms CPU time, 96M memory peak.
Mar 25 01:40:50.719928 systemd-timesyncd[1391]: Contacted time server 176.9.42.91:123 (2.flatcar.pool.ntp.org).
Mar 25 01:40:50.720013 systemd-timesyncd[1391]: Initial clock synchronization to Tue 2025-03-25 01:40:50.719699 UTC.
Mar 25 01:40:50.720860 systemd-resolved[1362]: Clock change detected. Flushing caches.
Mar 25 01:40:52.934122 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Mar 25 01:40:52.936891 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:40:53.108704 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:40:53.118599 (kubelet)[1672]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:40:53.154316 kubelet[1672]: E0325 01:40:53.154147 1672 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:40:53.157431 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:40:53.157555 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:40:53.157830 systemd[1]: kubelet.service: Consumed 186ms CPU time, 95.7M memory peak.
Mar 25 01:41:03.184143 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Mar 25 01:41:03.187100 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:41:03.350188 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:41:03.353223 (kubelet)[1688]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:41:03.415023 kubelet[1688]: E0325 01:41:03.414930 1688 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:41:03.418818 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:41:03.419058 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:41:03.419625 systemd[1]: kubelet.service: Consumed 205ms CPU time, 95.7M memory peak.
Mar 25 01:41:05.073671 update_engine[1500]: I20250325 01:41:05.073472 1500 update_attempter.cc:509] Updating boot flags...
Mar 25 01:41:05.131387 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1704)
Mar 25 01:41:05.204817 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1704)
Mar 25 01:41:05.250331 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1704)
Mar 25 01:41:13.434740 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Mar 25 01:41:13.437433 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:41:13.613642 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:41:13.619877 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:41:13.688912 kubelet[1723]: E0325 01:41:13.688728 1723 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:41:13.691418 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:41:13.691812 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:41:13.692559 systemd[1]: kubelet.service: Consumed 221ms CPU time, 95.7M memory peak. Mar 25 01:41:23.933940 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Mar 25 01:41:23.936495 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:41:24.066873 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:41:24.075484 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:41:24.113323 kubelet[1739]: E0325 01:41:24.113232 1739 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:41:24.116396 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:41:24.116563 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:41:24.117154 systemd[1]: kubelet.service: Consumed 139ms CPU time, 95.4M memory peak. 
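The repeating kubelet failures above all have one root cause: `/var/lib/kubelet/config.yaml` does not exist yet. That file is normally written when the node is bootstrapped into a cluster (e.g. by `kubeadm init` or `kubeadm join`), so the unit keeps crashing until that happens. A minimal sketch of the same existence check the kubelet's failing `open()` performs (the `KUBELET_CONFIG` override is hypothetical, added here only so the snippet is parameterizable):

```shell
# Sketch: reproduce the check behind the "no such file or directory" error above.
# KUBELET_CONFIG is a hypothetical override for illustration; the kubelet itself
# takes the path from its --config flag.
cfg="${KUBELET_CONFIG:-/var/lib/kubelet/config.yaml}"
if [ -f "$cfg" ]; then
    echo "kubelet config present: $cfg"
else
    echo "kubelet config missing: $cfg"
fi
```

On a host in this state the snippet prints the "missing" branch, matching the error in the journal entries.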
Mar 25 01:41:34.184380 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
Mar 25 01:41:34.187540 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:41:34.328474 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:41:34.343810 (kubelet)[1754]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:41:34.411512 kubelet[1754]: E0325 01:41:34.411386 1754 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:41:34.413767 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:41:34.413937 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:41:34.414303 systemd[1]: kubelet.service: Consumed 187ms CPU time, 97.4M memory peak.
Mar 25 01:41:44.434159 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
Mar 25 01:41:44.436747 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:41:44.608482 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:41:44.611719 (kubelet)[1768]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:41:44.663705 kubelet[1768]: E0325 01:41:44.663519 1768 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:41:44.667295 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:41:44.667518 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:41:44.667992 systemd[1]: kubelet.service: Consumed 193ms CPU time, 95.3M memory peak.
Mar 25 01:41:54.684337 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
Mar 25 01:41:54.687311 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:41:54.832112 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:41:54.843476 (kubelet)[1783]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:41:54.893051 kubelet[1783]: E0325 01:41:54.892906 1783 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:41:54.894883 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:41:54.895110 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:41:54.895604 systemd[1]: kubelet.service: Consumed 164ms CPU time, 97.4M memory peak.
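Note the cadence of the crash loop: each "Scheduled restart job" entry lands almost exactly 10 s after the previous failure, consistent with a `Restart=` policy with a 10-second `RestartSec=` on the unit (an assumption; the unit file itself is not shown in this log). The interval can be read straight off the timestamps, e.g. between restart counters 8 and 9:

```shell
# Interval between two "Scheduled restart job" timestamps from the log above.
# GNU date is assumed for the -d option (as on Flatcar).
t1="01:41:44"   # restart counter is at 8
t2="01:41:54"   # restart counter is at 9
s1=$(date -u -d "$t1" +%s)
s2=$(date -u -d "$t2" +%s)
echo "restart interval: $((s2 - s1))s"
```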
Mar 25 01:42:01.618541 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Mar 25 01:42:01.620742 systemd[1]: Started sshd@0-37.27.205.216:22-139.178.68.195:45410.service - OpenSSH per-connection server daemon (139.178.68.195:45410).
Mar 25 01:42:02.645400 sshd[1792]: Accepted publickey for core from 139.178.68.195 port 45410 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:42:02.648902 sshd-session[1792]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:42:02.657288 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Mar 25 01:42:02.658680 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Mar 25 01:42:02.672571 systemd-logind[1499]: New session 1 of user core.
Mar 25 01:42:02.681633 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Mar 25 01:42:02.686537 systemd[1]: Starting user@500.service - User Manager for UID 500...
Mar 25 01:42:02.700307 (systemd)[1796]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Mar 25 01:42:02.703004 systemd-logind[1499]: New session c1 of user core.
Mar 25 01:42:02.851783 systemd[1796]: Queued start job for default target default.target.
Mar 25 01:42:02.858453 systemd[1796]: Created slice app.slice - User Application Slice.
Mar 25 01:42:02.858482 systemd[1796]: Reached target paths.target - Paths.
Mar 25 01:42:02.858518 systemd[1796]: Reached target timers.target - Timers.
Mar 25 01:42:02.859972 systemd[1796]: Starting dbus.socket - D-Bus User Message Bus Socket...
Mar 25 01:42:02.876892 systemd[1796]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Mar 25 01:42:02.876995 systemd[1796]: Reached target sockets.target - Sockets.
Mar 25 01:42:02.877026 systemd[1796]: Reached target basic.target - Basic System.
Mar 25 01:42:02.877054 systemd[1796]: Reached target default.target - Main User Target.
Mar 25 01:42:02.877076 systemd[1796]: Startup finished in 167ms.
Mar 25 01:42:02.878091 systemd[1]: Started user@500.service - User Manager for UID 500.
Mar 25 01:42:02.886764 systemd[1]: Started session-1.scope - Session 1 of User core.
Mar 25 01:42:03.582132 systemd[1]: Started sshd@1-37.27.205.216:22-139.178.68.195:45414.service - OpenSSH per-connection server daemon (139.178.68.195:45414).
Mar 25 01:42:04.573960 sshd[1807]: Accepted publickey for core from 139.178.68.195 port 45414 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:42:04.575704 sshd-session[1807]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:42:04.583210 systemd-logind[1499]: New session 2 of user core.
Mar 25 01:42:04.592455 systemd[1]: Started session-2.scope - Session 2 of User core.
Mar 25 01:42:04.933421 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10.
Mar 25 01:42:04.935334 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:42:05.109746 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:42:05.119518 (kubelet)[1819]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:42:05.151530 kubelet[1819]: E0325 01:42:05.151468 1819 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:42:05.155773 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:42:05.155954 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:42:05.156408 systemd[1]: kubelet.service: Consumed 182ms CPU time, 95.4M memory peak.
Mar 25 01:42:05.250800 sshd[1809]: Connection closed by 139.178.68.195 port 45414
Mar 25 01:42:05.251558 sshd-session[1807]: pam_unix(sshd:session): session closed for user core
Mar 25 01:42:05.255711 systemd[1]: sshd@1-37.27.205.216:22-139.178.68.195:45414.service: Deactivated successfully.
Mar 25 01:42:05.258426 systemd[1]: session-2.scope: Deactivated successfully.
Mar 25 01:42:05.260445 systemd-logind[1499]: Session 2 logged out. Waiting for processes to exit.
Mar 25 01:42:05.262184 systemd-logind[1499]: Removed session 2.
Mar 25 01:42:05.422592 systemd[1]: Started sshd@2-37.27.205.216:22-139.178.68.195:55208.service - OpenSSH per-connection server daemon (139.178.68.195:55208).
Mar 25 01:42:06.419817 sshd[1831]: Accepted publickey for core from 139.178.68.195 port 55208 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:42:06.421902 sshd-session[1831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:42:06.429919 systemd-logind[1499]: New session 3 of user core.
Mar 25 01:42:06.439518 systemd[1]: Started session-3.scope - Session 3 of User core.
Mar 25 01:42:07.090138 sshd[1833]: Connection closed by 139.178.68.195 port 55208
Mar 25 01:42:07.091132 sshd-session[1831]: pam_unix(sshd:session): session closed for user core
Mar 25 01:42:07.096511 systemd[1]: sshd@2-37.27.205.216:22-139.178.68.195:55208.service: Deactivated successfully.
Mar 25 01:42:07.099164 systemd[1]: session-3.scope: Deactivated successfully.
Mar 25 01:42:07.100297 systemd-logind[1499]: Session 3 logged out. Waiting for processes to exit.
Mar 25 01:42:07.101963 systemd-logind[1499]: Removed session 3.
Mar 25 01:42:07.264184 systemd[1]: Started sshd@3-37.27.205.216:22-139.178.68.195:55212.service - OpenSSH per-connection server daemon (139.178.68.195:55212).
Mar 25 01:42:08.270444 sshd[1839]: Accepted publickey for core from 139.178.68.195 port 55212 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:42:08.273454 sshd-session[1839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:42:08.285091 systemd-logind[1499]: New session 4 of user core.
Mar 25 01:42:08.292931 systemd[1]: Started session-4.scope - Session 4 of User core.
Mar 25 01:42:08.954528 sshd[1841]: Connection closed by 139.178.68.195 port 55212
Mar 25 01:42:08.955722 sshd-session[1839]: pam_unix(sshd:session): session closed for user core
Mar 25 01:42:08.961218 systemd[1]: sshd@3-37.27.205.216:22-139.178.68.195:55212.service: Deactivated successfully.
Mar 25 01:42:08.964607 systemd[1]: session-4.scope: Deactivated successfully.
Mar 25 01:42:08.967327 systemd-logind[1499]: Session 4 logged out. Waiting for processes to exit.
Mar 25 01:42:08.969112 systemd-logind[1499]: Removed session 4.
Mar 25 01:42:09.132827 systemd[1]: Started sshd@4-37.27.205.216:22-139.178.68.195:55214.service - OpenSSH per-connection server daemon (139.178.68.195:55214).
Mar 25 01:42:10.140150 sshd[1847]: Accepted publickey for core from 139.178.68.195 port 55214 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:42:10.143222 sshd-session[1847]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:42:10.153991 systemd-logind[1499]: New session 5 of user core.
Mar 25 01:42:10.163656 systemd[1]: Started session-5.scope - Session 5 of User core.
Mar 25 01:42:10.677795 sudo[1850]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Mar 25 01:42:10.678328 sudo[1850]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:42:10.696924 sudo[1850]: pam_unix(sudo:session): session closed for user root
Mar 25 01:42:10.855904 sshd[1849]: Connection closed by 139.178.68.195 port 55214
Mar 25 01:42:10.857420 sshd-session[1847]: pam_unix(sshd:session): session closed for user core
Mar 25 01:42:10.863743 systemd[1]: sshd@4-37.27.205.216:22-139.178.68.195:55214.service: Deactivated successfully.
Mar 25 01:42:10.867704 systemd[1]: session-5.scope: Deactivated successfully.
Mar 25 01:42:10.870693 systemd-logind[1499]: Session 5 logged out. Waiting for processes to exit.
Mar 25 01:42:10.873200 systemd-logind[1499]: Removed session 5.
Mar 25 01:42:11.033693 systemd[1]: Started sshd@5-37.27.205.216:22-139.178.68.195:55228.service - OpenSSH per-connection server daemon (139.178.68.195:55228).
Mar 25 01:42:12.038580 sshd[1856]: Accepted publickey for core from 139.178.68.195 port 55228 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:42:12.040607 sshd-session[1856]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:42:12.049092 systemd-logind[1499]: New session 6 of user core.
Mar 25 01:42:12.059509 systemd[1]: Started session-6.scope - Session 6 of User core.
Mar 25 01:42:12.561141 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Mar 25 01:42:12.561637 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:42:12.567940 sudo[1860]: pam_unix(sudo:session): session closed for user root
Mar 25 01:42:12.580648 sudo[1859]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Mar 25 01:42:12.581171 sudo[1859]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:42:12.599870 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Mar 25 01:42:12.670709 augenrules[1882]: No rules
Mar 25 01:42:12.673561 systemd[1]: audit-rules.service: Deactivated successfully.
Mar 25 01:42:12.673949 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Mar 25 01:42:12.676356 sudo[1859]: pam_unix(sudo:session): session closed for user root
Mar 25 01:42:12.835525 sshd[1858]: Connection closed by 139.178.68.195 port 55228
Mar 25 01:42:12.837212 sshd-session[1856]: pam_unix(sshd:session): session closed for user core
Mar 25 01:42:12.845500 systemd-logind[1499]: Session 6 logged out. Waiting for processes to exit.
Mar 25 01:42:12.846344 systemd[1]: sshd@5-37.27.205.216:22-139.178.68.195:55228.service: Deactivated successfully.
Mar 25 01:42:12.850175 systemd[1]: session-6.scope: Deactivated successfully.
Mar 25 01:42:12.852949 systemd-logind[1499]: Removed session 6.
Mar 25 01:42:13.013673 systemd[1]: Started sshd@6-37.27.205.216:22-139.178.68.195:55236.service - OpenSSH per-connection server daemon (139.178.68.195:55236).
Mar 25 01:42:14.015658 sshd[1891]: Accepted publickey for core from 139.178.68.195 port 55236 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:42:14.018227 sshd-session[1891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:42:14.032433 systemd-logind[1499]: New session 7 of user core.
Mar 25 01:42:14.040586 systemd[1]: Started session-7.scope - Session 7 of User core.
Mar 25 01:42:14.539433 sudo[1894]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Mar 25 01:42:14.539923 sudo[1894]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Mar 25 01:42:15.071578 systemd[1]: Starting docker.service - Docker Application Container Engine...
Mar 25 01:42:15.089940 (dockerd)[1912]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Mar 25 01:42:15.184050 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11.
Mar 25 01:42:15.191559 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:42:15.380363 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:42:15.388499 (kubelet)[1924]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Mar 25 01:42:15.431197 kubelet[1924]: E0325 01:42:15.430934 1924 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Mar 25 01:42:15.433967 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Mar 25 01:42:15.434116 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Mar 25 01:42:15.434632 systemd[1]: kubelet.service: Consumed 186ms CPU time, 97.4M memory peak.
Mar 25 01:42:15.479771 dockerd[1912]: time="2025-03-25T01:42:15.479663027Z" level=info msg="Starting up"
Mar 25 01:42:15.481551 dockerd[1912]: time="2025-03-25T01:42:15.481514148Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Mar 25 01:42:15.525872 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport706094940-merged.mount: Deactivated successfully.
Mar 25 01:42:15.571988 dockerd[1912]: time="2025-03-25T01:42:15.571900934Z" level=info msg="Loading containers: start."
Mar 25 01:42:15.797343 kernel: Initializing XFRM netlink socket
Mar 25 01:42:15.901453 systemd-networkd[1415]: docker0: Link UP
Mar 25 01:42:15.952389 dockerd[1912]: time="2025-03-25T01:42:15.952296971Z" level=info msg="Loading containers: done."
Mar 25 01:42:15.975705 dockerd[1912]: time="2025-03-25T01:42:15.975574237Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Mar 25 01:42:15.975949 dockerd[1912]: time="2025-03-25T01:42:15.975833392Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
Mar 25 01:42:15.976044 dockerd[1912]: time="2025-03-25T01:42:15.975994243Z" level=info msg="Daemon has completed initialization"
Mar 25 01:42:16.025139 dockerd[1912]: time="2025-03-25T01:42:16.024255595Z" level=info msg="API listen on /run/docker.sock"
Mar 25 01:42:16.024481 systemd[1]: Started docker.service - Docker Application Container Engine.
Mar 25 01:42:16.519638 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck4029094659-merged.mount: Deactivated successfully.
Mar 25 01:42:17.412685 containerd[1510]: time="2025-03-25T01:42:17.412620101Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\""
Mar 25 01:42:18.044336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2962021363.mount: Deactivated successfully.
Mar 25 01:42:19.120876 containerd[1510]: time="2025-03-25T01:42:19.120811884Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:19.121852 containerd[1510]: time="2025-03-25T01:42:19.121803994Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.7: active requests=0, bytes read=27959362"
Mar 25 01:42:19.123074 containerd[1510]: time="2025-03-25T01:42:19.123034080Z" level=info msg="ImageCreate event name:\"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:19.125769 containerd[1510]: time="2025-03-25T01:42:19.125719515Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:19.126468 containerd[1510]: time="2025-03-25T01:42:19.126442951Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.7\" with image id \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:22c19cc70fe5806d0a2cb28a6b6b33fd34e6f9e50616bdf6d53649bcfafbc277\", size \"27956068\" in 1.713768237s"
Mar 25 01:42:19.126678 containerd[1510]: time="2025-03-25T01:42:19.126545263Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.7\" returns image reference \"sha256:f084bc047a8cf7c8484d47c51e70e646dde3977d916f282feb99207b7b9241af\""
Mar 25 01:42:19.128308 containerd[1510]: time="2025-03-25T01:42:19.128288181Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\""
Mar 25 01:42:20.358384 containerd[1510]: time="2025-03-25T01:42:20.358320776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:20.359495 containerd[1510]: time="2025-03-25T01:42:20.359450123Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.7: active requests=0, bytes read=24713798"
Mar 25 01:42:20.360346 containerd[1510]: time="2025-03-25T01:42:20.360310295Z" level=info msg="ImageCreate event name:\"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:20.362903 containerd[1510]: time="2025-03-25T01:42:20.362869233Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:20.364008 containerd[1510]: time="2025-03-25T01:42:20.363586818Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.7\" with image id \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:6abe7a0accecf29db6ebab18a10f844678ffed693d79e2e51a18a6f2b4530cbb\", size \"26201384\" in 1.235275635s"
Mar 25 01:42:20.364008 containerd[1510]: time="2025-03-25T01:42:20.363613408Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.7\" returns image reference \"sha256:652dcad615a9a0c252c253860d5b5b7bfebd3efe159dc033a8555bc15a6d1985\""
Mar 25 01:42:20.364254 containerd[1510]: time="2025-03-25T01:42:20.364232869Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\""
Mar 25 01:42:22.133656 containerd[1510]: time="2025-03-25T01:42:22.133595172Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:22.135053 containerd[1510]: time="2025-03-25T01:42:22.135000336Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.7: active requests=0, bytes read=18780390"
Mar 25 01:42:22.136483 containerd[1510]: time="2025-03-25T01:42:22.136444794Z" level=info msg="ImageCreate event name:\"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:22.139530 containerd[1510]: time="2025-03-25T01:42:22.139491988Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:22.140364 containerd[1510]: time="2025-03-25T01:42:22.140239409Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.7\" with image id \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:fb80249bcb77ee72b1c9fa5b70bc28a83ed107c9ca71957841ad91db379963bf\", size \"20267994\" in 1.775983677s"
Mar 25 01:42:22.140364 containerd[1510]: time="2025-03-25T01:42:22.140280615Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.7\" returns image reference \"sha256:7f1f6a63d8aa14cf61d0045e912ad312b4ade24637cecccc933b163582eae68c\""
Mar 25 01:42:22.140962 containerd[1510]: time="2025-03-25T01:42:22.140796674Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\""
Mar 25 01:42:23.175317 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount436644226.mount: Deactivated successfully.
Mar 25 01:42:23.511252 containerd[1510]: time="2025-03-25T01:42:23.511194912Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:23.512594 containerd[1510]: time="2025-03-25T01:42:23.512548650Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.7: active requests=0, bytes read=30354658"
Mar 25 01:42:23.514089 containerd[1510]: time="2025-03-25T01:42:23.514040868Z" level=info msg="ImageCreate event name:\"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:23.516045 containerd[1510]: time="2025-03-25T01:42:23.516008197Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:23.516409 containerd[1510]: time="2025-03-25T01:42:23.516381738Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.7\" with image id \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\", repo tag \"registry.k8s.io/kube-proxy:v1.31.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:e5839270c96c3ad1bea1dce4935126d3281297527f3655408d2970aa4b5cf178\", size \"30353649\" in 1.375563513s"
Mar 25 01:42:23.516448 containerd[1510]: time="2025-03-25T01:42:23.516412124Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.7\" returns image reference \"sha256:dcfc039c372ea285997a302d60e58a75b80905b4c4dba969993b9b22e8ac66d1\""
Mar 25 01:42:23.516888 containerd[1510]: time="2025-03-25T01:42:23.516867418Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Mar 25 01:42:24.059370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1833028893.mount: Deactivated successfully.
Mar 25 01:42:24.943758 containerd[1510]: time="2025-03-25T01:42:24.943546164Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:24.944772 containerd[1510]: time="2025-03-25T01:42:24.944504391Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185843"
Mar 25 01:42:24.945767 containerd[1510]: time="2025-03-25T01:42:24.945707867Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:24.948445 containerd[1510]: time="2025-03-25T01:42:24.948366251Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:24.949289 containerd[1510]: time="2025-03-25T01:42:24.949156533Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.432264098s"
Mar 25 01:42:24.949289 containerd[1510]: time="2025-03-25T01:42:24.949183795Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Mar 25 01:42:24.949868 containerd[1510]: time="2025-03-25T01:42:24.949831009Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Mar 25 01:42:25.435861 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2891539616.mount: Deactivated successfully.
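The containerd entries record both the image size and the wall time for each pull, so effective throughput can be derived directly. For the coredns pull above (size 18182961 bytes in 1.432264098 s) that works out to roughly 12 MiB/s; a quick sketch of the arithmetic:

```shell
# Effective pull throughput for registry.k8s.io/coredns/coredns:v1.11.1,
# using the size and duration reported by containerd in the log above.
awk 'BEGIN {
    bytes = 18182961       # reported image size in bytes
    secs  = 1.432264098    # reported pull duration in seconds
    printf "%.1f MiB/s\n", bytes / secs / (1024 * 1024)
}'
```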
Mar 25 01:42:25.445788 containerd[1510]: time="2025-03-25T01:42:25.445689629Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:42:25.447216 containerd[1510]: time="2025-03-25T01:42:25.447113268Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Mar 25 01:42:25.449051 containerd[1510]: time="2025-03-25T01:42:25.448963728Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:42:25.454321 containerd[1510]: time="2025-03-25T01:42:25.452554802Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Mar 25 01:42:25.454510 containerd[1510]: time="2025-03-25T01:42:25.454469682Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 504.604109ms" Mar 25 01:42:25.454658 containerd[1510]: time="2025-03-25T01:42:25.454622399Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Mar 25 01:42:25.455547 containerd[1510]: time="2025-03-25T01:42:25.455388305Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Mar 25 01:42:25.683864 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. 
Mar 25 01:42:25.686531 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:42:25.844625 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:42:25.851479 (kubelet)[2251]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Mar 25 01:42:25.892311 kubelet[2251]: E0325 01:42:25.891312 2251 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 25 01:42:25.895434 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 25 01:42:25.895570 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 25 01:42:25.896166 systemd[1]: kubelet.service: Consumed 168ms CPU time, 95.3M memory peak. Mar 25 01:42:26.055108 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1116020411.mount: Deactivated successfully. 
Mar 25 01:42:27.586454 containerd[1510]: time="2025-03-25T01:42:27.586393140Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:27.587779 containerd[1510]: time="2025-03-25T01:42:27.587714468Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780037"
Mar 25 01:42:27.589186 containerd[1510]: time="2025-03-25T01:42:27.589146743Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:27.592126 containerd[1510]: time="2025-03-25T01:42:27.592059403Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:27.593253 containerd[1510]: time="2025-03-25T01:42:27.592869115Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.137187852s"
Mar 25 01:42:27.593253 containerd[1510]: time="2025-03-25T01:42:27.592901757Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\""
Mar 25 01:42:31.111969 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:42:31.112428 systemd[1]: kubelet.service: Consumed 168ms CPU time, 95.3M memory peak.
Mar 25 01:42:31.117437 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:42:31.151432 systemd[1]: Reload requested from client PID 2339 ('systemctl') (unit session-7.scope)...
Mar 25 01:42:31.151446 systemd[1]: Reloading...
Mar 25 01:42:31.275298 zram_generator::config[2384]: No configuration found.
Mar 25 01:42:31.382486 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Mar 25 01:42:31.485619 systemd[1]: Reloading finished in 333 ms.
Mar 25 01:42:31.535746 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:42:31.546628 (kubelet)[2430]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 01:42:31.547753 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:42:31.550015 systemd[1]: Started sshd@7-37.27.205.216:22-182.75.65.22:37984.service - OpenSSH per-connection server daemon (182.75.65.22:37984).
Mar 25 01:42:31.552225 systemd[1]: kubelet.service: Deactivated successfully.
Mar 25 01:42:31.552615 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:42:31.552675 systemd[1]: kubelet.service: Consumed 120ms CPU time, 83.7M memory peak.
Mar 25 01:42:31.557385 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Mar 25 01:42:31.718089 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Mar 25 01:42:31.727923 (kubelet)[2444]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Mar 25 01:42:31.795201 kubelet[2444]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 01:42:31.795201 kubelet[2444]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 25 01:42:31.795201 kubelet[2444]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 25 01:42:31.795201 kubelet[2444]: I0325 01:42:31.794538 2444 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 25 01:42:32.202933 kubelet[2444]: I0325 01:42:32.202847 2444 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Mar 25 01:42:32.202933 kubelet[2444]: I0325 01:42:32.202919 2444 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 25 01:42:32.203428 kubelet[2444]: I0325 01:42:32.203393 2444 server.go:929] "Client rotation is on, will bootstrap in background"
Mar 25 01:42:32.239937 kubelet[2444]: E0325 01:42:32.239870 2444 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://37.27.205.216:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 37.27.205.216:6443: connect: connection refused" logger="UnhandledError"
Mar 25 01:42:32.241383 kubelet[2444]: I0325 01:42:32.241245 2444 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Mar 25 01:42:32.257283 kubelet[2444]: I0325 01:42:32.257226 2444 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 25 01:42:32.266882 kubelet[2444]: I0325 01:42:32.266713 2444 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Mar 25 01:42:32.271486 kubelet[2444]: I0325 01:42:32.271468 2444 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 25 01:42:32.272155 kubelet[2444]: I0325 01:42:32.271876 2444 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 25 01:42:32.272155 kubelet[2444]: I0325 01:42:32.271908 2444 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-2-22d395eace","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 25 01:42:32.273845 kubelet[2444]: I0325 01:42:32.273795 2444 topology_manager.go:138] "Creating topology manager with none policy"
Mar 25 01:42:32.273845 kubelet[2444]: I0325 01:42:32.273825 2444 container_manager_linux.go:300] "Creating device plugin manager"
Mar 25 01:42:32.274070 kubelet[2444]: I0325 01:42:32.273939 2444 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 01:42:32.276368 kubelet[2444]: I0325 01:42:32.276335 2444 kubelet.go:408] "Attempting to sync node with API server"
Mar 25 01:42:32.276748 kubelet[2444]: I0325 01:42:32.276499 2444 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 25 01:42:32.276748 kubelet[2444]: I0325 01:42:32.276547 2444 kubelet.go:314] "Adding apiserver pod source"
Mar 25 01:42:32.276748 kubelet[2444]: I0325 01:42:32.276567 2444 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 25 01:42:32.284794 kubelet[2444]: I0325 01:42:32.284664 2444 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
Mar 25 01:42:32.286775 kubelet[2444]: I0325 01:42:32.286757 2444 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 25 01:42:32.288907 kubelet[2444]: W0325 01:42:32.287786 2444 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Mar 25 01:42:32.288907 kubelet[2444]: I0325 01:42:32.288473 2444 server.go:1269] "Started kubelet"
Mar 25 01:42:32.288907 kubelet[2444]: W0325 01:42:32.288612 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://37.27.205.216:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-2-22d395eace&limit=500&resourceVersion=0": dial tcp 37.27.205.216:6443: connect: connection refused
Mar 25 01:42:32.288907 kubelet[2444]: E0325 01:42:32.288674 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://37.27.205.216:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-2-22d395eace&limit=500&resourceVersion=0\": dial tcp 37.27.205.216:6443: connect: connection refused" logger="UnhandledError"
Mar 25 01:42:32.293625 kubelet[2444]: I0325 01:42:32.293420 2444 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 25 01:42:32.294153 kubelet[2444]: W0325 01:42:32.294076 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://37.27.205.216:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 37.27.205.216:6443: connect: connection refused
Mar 25 01:42:32.294153 kubelet[2444]: E0325 01:42:32.294133 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://37.27.205.216:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 37.27.205.216:6443: connect: connection refused" logger="UnhandledError"
Mar 25 01:42:32.295686 kubelet[2444]: I0325 01:42:32.294291 2444 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Mar 25 01:42:32.295686 kubelet[2444]: I0325 01:42:32.294369 2444 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 25 01:42:32.295686 kubelet[2444]: I0325 01:42:32.295196 2444 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 25 01:42:32.295686 kubelet[2444]: I0325 01:42:32.295519 2444 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 25 01:42:32.305801 kubelet[2444]: I0325 01:42:32.305640 2444 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 25 01:42:32.306029 kubelet[2444]: E0325 01:42:32.305962 2444 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-2-22d395eace\" not found"
Mar 25 01:42:32.306161 kubelet[2444]: I0325 01:42:32.306064 2444 server.go:460] "Adding debug handlers to kubelet server"
Mar 25 01:42:32.306966 kubelet[2444]: I0325 01:42:32.306955 2444 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 25 01:42:32.307097 kubelet[2444]: I0325 01:42:32.307055 2444 reconciler.go:26] "Reconciler: start to sync state"
Mar 25 01:42:32.307467 kubelet[2444]: W0325 01:42:32.307407 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://37.27.205.216:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 37.27.205.216:6443: connect: connection refused
Mar 25 01:42:32.307872 kubelet[2444]: E0325 01:42:32.307530 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://37.27.205.216:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 37.27.205.216:6443: connect: connection refused" logger="UnhandledError"
Mar 25 01:42:32.307932 kubelet[2444]: E0325 01:42:32.307844 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://37.27.205.216:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-2-22d395eace?timeout=10s\": dial tcp 37.27.205.216:6443: connect: connection refused" interval="200ms"
Mar 25 01:42:32.308922 kubelet[2444]: I0325 01:42:32.308910 2444 factory.go:221] Registration of the systemd container factory successfully
Mar 25 01:42:32.311898 kubelet[2444]: I0325 01:42:32.311355 2444 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Mar 25 01:42:32.314647 kubelet[2444]: E0325 01:42:32.309000 2444 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://37.27.205.216:6443/api/v1/namespaces/default/events\": dial tcp 37.27.205.216:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-2-22d395eace.182fe8424deb6b56 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-2-22d395eace,UID:ci-4284-0-0-2-22d395eace,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-2-22d395eace,},FirstTimestamp:2025-03-25 01:42:32.288430934 +0000 UTC m=+0.554840742,LastTimestamp:2025-03-25 01:42:32.288430934 +0000 UTC m=+0.554840742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-2-22d395eace,}"
Mar 25 01:42:32.315578 kubelet[2444]: E0325 01:42:32.315566 2444 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Mar 25 01:42:32.316205 kubelet[2444]: I0325 01:42:32.316169 2444 factory.go:221] Registration of the containerd container factory successfully
Mar 25 01:42:32.336003 kubelet[2444]: I0325 01:42:32.335970 2444 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 25 01:42:32.338098 kubelet[2444]: I0325 01:42:32.338085 2444 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 25 01:42:32.338173 kubelet[2444]: I0325 01:42:32.338166 2444 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 25 01:42:32.338230 kubelet[2444]: I0325 01:42:32.338224 2444 kubelet.go:2321] "Starting kubelet main sync loop"
Mar 25 01:42:32.338320 kubelet[2444]: E0325 01:42:32.338306 2444 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 25 01:42:32.344510 kubelet[2444]: W0325 01:42:32.344426 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://37.27.205.216:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 37.27.205.216:6443: connect: connection refused
Mar 25 01:42:32.344510 kubelet[2444]: E0325 01:42:32.344509 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://37.27.205.216:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 37.27.205.216:6443: connect: connection refused" logger="UnhandledError"
Mar 25 01:42:32.346304 kubelet[2444]: I0325 01:42:32.345844 2444 cpu_manager.go:214] "Starting CPU manager" policy="none"
Mar 25 01:42:32.346304 kubelet[2444]: I0325 01:42:32.345871 2444 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Mar 25 01:42:32.346304 kubelet[2444]: I0325 01:42:32.345890 2444 state_mem.go:36] "Initialized new in-memory state store"
Mar 25 01:42:32.348249 kubelet[2444]: I0325 01:42:32.348160 2444 policy_none.go:49] "None policy: Start"
Mar 25 01:42:32.348760 kubelet[2444]: I0325 01:42:32.348732 2444 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 25 01:42:32.348760 kubelet[2444]: I0325 01:42:32.348756 2444 state_mem.go:35] "Initializing new in-memory state store"
Mar 25 01:42:32.355660 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Mar 25 01:42:32.368951 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Mar 25 01:42:32.372408 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Mar 25 01:42:32.380185 kubelet[2444]: I0325 01:42:32.380014 2444 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 25 01:42:32.380354 kubelet[2444]: I0325 01:42:32.380286 2444 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 25 01:42:32.380354 kubelet[2444]: I0325 01:42:32.380302 2444 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 25 01:42:32.381340 kubelet[2444]: I0325 01:42:32.380791 2444 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 25 01:42:32.383353 kubelet[2444]: E0325 01:42:32.383334 2444 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-2-22d395eace\" not found"
Mar 25 01:42:32.462491 systemd[1]: Created slice kubepods-burstable-pod197df92524af76987bea640189ae4987.slice - libcontainer container kubepods-burstable-pod197df92524af76987bea640189ae4987.slice.
Mar 25 01:42:32.481679 systemd[1]: Created slice kubepods-burstable-pod4b33e2589c302fbb24afd0c8086a9508.slice - libcontainer container kubepods-burstable-pod4b33e2589c302fbb24afd0c8086a9508.slice.
Mar 25 01:42:32.486543 kubelet[2444]: I0325 01:42:32.486504 2444 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.488255 kubelet[2444]: E0325 01:42:32.488220 2444 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://37.27.205.216:6443/api/v1/nodes\": dial tcp 37.27.205.216:6443: connect: connection refused" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.495397 systemd[1]: Created slice kubepods-burstable-pod0255575bf95f06b48b4043160949dd08.slice - libcontainer container kubepods-burstable-pod0255575bf95f06b48b4043160949dd08.slice.
Mar 25 01:42:32.509201 kubelet[2444]: E0325 01:42:32.509110 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://37.27.205.216:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-2-22d395eace?timeout=10s\": dial tcp 37.27.205.216:6443: connect: connection refused" interval="400ms"
Mar 25 01:42:32.551201 sshd[2433]: Invalid user test from 182.75.65.22 port 37984
Mar 25 01:42:32.608995 kubelet[2444]: I0325 01:42:32.608784 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/197df92524af76987bea640189ae4987-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-2-22d395eace\" (UID: \"197df92524af76987bea640189ae4987\") " pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.608995 kubelet[2444]: I0325 01:42:32.608870 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.608995 kubelet[2444]: I0325 01:42:32.608906 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.608995 kubelet[2444]: I0325 01:42:32.608936 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.608995 kubelet[2444]: I0325 01:42:32.608968 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.609560 kubelet[2444]: I0325 01:42:32.608999 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/197df92524af76987bea640189ae4987-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-2-22d395eace\" (UID: \"197df92524af76987bea640189ae4987\") " pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.609560 kubelet[2444]: I0325 01:42:32.609029 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.609560 kubelet[2444]: I0325 01:42:32.609063 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0255575bf95f06b48b4043160949dd08-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-2-22d395eace\" (UID: \"0255575bf95f06b48b4043160949dd08\") " pod="kube-system/kube-scheduler-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.609560 kubelet[2444]: I0325 01:42:32.609099 2444 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/197df92524af76987bea640189ae4987-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-2-22d395eace\" (UID: \"197df92524af76987bea640189ae4987\") " pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.691760 kubelet[2444]: I0325 01:42:32.691701 2444 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.692314 kubelet[2444]: E0325 01:42:32.692242 2444 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://37.27.205.216:6443/api/v1/nodes\": dial tcp 37.27.205.216:6443: connect: connection refused" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:32.740259 sshd[2433]: Received disconnect from 182.75.65.22 port 37984:11: Bye Bye [preauth]
Mar 25 01:42:32.740259 sshd[2433]: Disconnected from invalid user test 182.75.65.22 port 37984 [preauth]
Mar 25 01:42:32.742314 systemd[1]: sshd@7-37.27.205.216:22-182.75.65.22:37984.service: Deactivated successfully.
Mar 25 01:42:32.779398 containerd[1510]: time="2025-03-25T01:42:32.779233372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-2-22d395eace,Uid:197df92524af76987bea640189ae4987,Namespace:kube-system,Attempt:0,}"
Mar 25 01:42:32.793983 containerd[1510]: time="2025-03-25T01:42:32.793798823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-2-22d395eace,Uid:4b33e2589c302fbb24afd0c8086a9508,Namespace:kube-system,Attempt:0,}"
Mar 25 01:42:32.800216 containerd[1510]: time="2025-03-25T01:42:32.800131423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-2-22d395eace,Uid:0255575bf95f06b48b4043160949dd08,Namespace:kube-system,Attempt:0,}"
Mar 25 01:42:32.910699 kubelet[2444]: E0325 01:42:32.910649 2444 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://37.27.205.216:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-2-22d395eace?timeout=10s\": dial tcp 37.27.205.216:6443: connect: connection refused" interval="800ms"
Mar 25 01:42:32.956125 containerd[1510]: time="2025-03-25T01:42:32.955470558Z" level=info msg="connecting to shim b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d" address="unix:///run/containerd/s/060e8041bd9de54d938f976d1a3daf834743e4e77a4ae70ab11d060ae7351049" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:42:32.956125 containerd[1510]: time="2025-03-25T01:42:32.955630840Z" level=info msg="connecting to shim 11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e" address="unix:///run/containerd/s/b5c881daa7c290a37da4bc8c9ba91a462b44c04c6d2e1c328bc1f31529897a52" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:42:32.959446 containerd[1510]: time="2025-03-25T01:42:32.959194672Z" level=info msg="connecting to shim 9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590" address="unix:///run/containerd/s/97b96248d1e7f896cc9eba9b3517fc395f67a0e1dab544540d79c7d6b457a50e" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:42:33.065478 systemd[1]: Started cri-containerd-11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e.scope - libcontainer container 11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e.
Mar 25 01:42:33.067045 systemd[1]: Started cri-containerd-9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590.scope - libcontainer container 9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590.
Mar 25 01:42:33.068730 systemd[1]: Started cri-containerd-b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d.scope - libcontainer container b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d.
Mar 25 01:42:33.098290 kubelet[2444]: I0325 01:42:33.095601 2444 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:33.098290 kubelet[2444]: E0325 01:42:33.095879 2444 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://37.27.205.216:6443/api/v1/nodes\": dial tcp 37.27.205.216:6443: connect: connection refused" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:33.143583 containerd[1510]: time="2025-03-25T01:42:33.143440244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-2-22d395eace,Uid:197df92524af76987bea640189ae4987,Namespace:kube-system,Attempt:0,} returns sandbox id \"b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d\""
Mar 25 01:42:33.147150 containerd[1510]: time="2025-03-25T01:42:33.147075783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-2-22d395eace,Uid:4b33e2589c302fbb24afd0c8086a9508,Namespace:kube-system,Attempt:0,} returns sandbox id \"9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590\""
Mar 25 01:42:33.149954 containerd[1510]: time="2025-03-25T01:42:33.149882013Z" level=info msg="CreateContainer within sandbox \"b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Mar 25 01:42:33.150561 containerd[1510]: time="2025-03-25T01:42:33.150130143Z" level=info msg="CreateContainer within sandbox \"9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Mar 25 01:42:33.153582 containerd[1510]: time="2025-03-25T01:42:33.153548848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-2-22d395eace,Uid:0255575bf95f06b48b4043160949dd08,Namespace:kube-system,Attempt:0,} returns sandbox id \"11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e\""
Mar 25 01:42:33.155493 containerd[1510]: time="2025-03-25T01:42:33.155473105Z" level=info msg="CreateContainer within sandbox \"11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Mar 25 01:42:33.169216 containerd[1510]: time="2025-03-25T01:42:33.169171163Z" level=info msg="Container 5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:42:33.171404 containerd[1510]: time="2025-03-25T01:42:33.171365261Z" level=info msg="Container 4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:42:33.181070 containerd[1510]: time="2025-03-25T01:42:33.181034456Z" level=info msg="Container 605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:42:33.189954 containerd[1510]: time="2025-03-25T01:42:33.189909097Z" level=info msg="CreateContainer within sandbox \"b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d\""
Mar 25 01:42:33.192099 containerd[1510]: time="2025-03-25T01:42:33.192053275Z" level=info msg="StartContainer for \"5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d\""
Mar 25 01:42:33.194205 containerd[1510]: time="2025-03-25T01:42:33.194174940Z" level=info msg="connecting to shim 5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d" address="unix:///run/containerd/s/060e8041bd9de54d938f976d1a3daf834743e4e77a4ae70ab11d060ae7351049" protocol=ttrpc version=3
Mar 25 01:42:33.196981 containerd[1510]: time="2025-03-25T01:42:33.196941119Z" level=info msg="CreateContainer within sandbox \"11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd\""
Mar 25 01:42:33.199107 containerd[1510]: time="2025-03-25T01:42:33.197844962Z" level=info msg="CreateContainer within sandbox \"9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871\""
Mar 25 01:42:33.199107 containerd[1510]: time="2025-03-25T01:42:33.198262260Z" level=info msg="StartContainer for \"4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871\""
Mar 25 01:42:33.199107 containerd[1510]: time="2025-03-25T01:42:33.198311079Z" level=info msg="StartContainer for \"605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd\""
Mar 25 01:42:33.199107 containerd[1510]: time="2025-03-25T01:42:33.199091849Z" level=info msg="connecting to shim 605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd" address="unix:///run/containerd/s/b5c881daa7c290a37da4bc8c9ba91a462b44c04c6d2e1c328bc1f31529897a52" protocol=ttrpc version=3
Mar 25 01:42:33.200213 containerd[1510]: time="2025-03-25T01:42:33.200185345Z" level=info msg="connecting to shim 4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871" address="unix:///run/containerd/s/97b96248d1e7f896cc9eba9b3517fc395f67a0e1dab544540d79c7d6b457a50e" protocol=ttrpc version=3
Mar 25 01:42:33.214402 systemd[1]: Started cri-containerd-5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d.scope - libcontainer container 5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d.
Mar 25 01:42:33.218595 systemd[1]: Started cri-containerd-605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd.scope - libcontainer container 605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd.
Mar 25 01:42:33.231434 systemd[1]: Started cri-containerd-4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871.scope - libcontainer container 4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871.
Mar 25 01:42:33.233261 kubelet[2444]: W0325 01:42:33.233185 2444 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://37.27.205.216:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-2-22d395eace&limit=500&resourceVersion=0": dial tcp 37.27.205.216:6443: connect: connection refused
Mar 25 01:42:33.233539 kubelet[2444]: E0325 01:42:33.233505 2444 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://37.27.205.216:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-2-22d395eace&limit=500&resourceVersion=0\": dial tcp 37.27.205.216:6443: connect: connection refused" logger="UnhandledError"
Mar 25 01:42:33.296859 containerd[1510]: time="2025-03-25T01:42:33.296814982Z" level=info msg="StartContainer for \"605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd\" returns successfully"
Mar 25 01:42:33.297694 containerd[1510]: time="2025-03-25T01:42:33.297547925Z" level=info msg="StartContainer for \"5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d\" returns successfully"
Mar 25 01:42:33.318048 containerd[1510]: time="2025-03-25T01:42:33.317921077Z" level=info msg="StartContainer for \"4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871\" returns successfully"
Mar 25 01:42:33.898488 kubelet[2444]: I0325 01:42:33.898437 2444 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:34.908139 kubelet[2444]: E0325 01:42:34.908086 2444 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-2-22d395eace\" not found" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:35.014221 kubelet[2444]: I0325 01:42:35.013966 2444 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284-0-0-2-22d395eace"
Mar 25 01:42:35.014221 kubelet[2444]: E0325 01:42:35.014023 2444 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4284-0-0-2-22d395eace\": node \"ci-4284-0-0-2-22d395eace\" not found"
Mar 25 01:42:35.034244 kubelet[2444]: E0325 01:42:35.034185 2444 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-2-22d395eace\" not found"
Mar 25 01:42:35.134757 kubelet[2444]: E0325 01:42:35.134656 2444 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-2-22d395eace\" not found"
Mar 25 01:42:35.235536 kubelet[2444]: E0325 01:42:35.235464 2444 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-2-22d395eace\" not found"
Mar 25 01:42:35.336119 kubelet[2444]: E0325 01:42:35.336050 2444 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-2-22d395eace\" not found"
Mar 25 01:42:36.295017 kubelet[2444]: I0325 01:42:36.294925 2444 apiserver.go:52] "Watching apiserver"
Mar 25 01:42:36.307602 kubelet[2444]: I0325 01:42:36.307528 2444
desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 25 01:42:37.204455 systemd[1]: Reload requested from client PID 2712 ('systemctl') (unit session-7.scope)... Mar 25 01:42:37.204824 systemd[1]: Reloading... Mar 25 01:42:37.315301 zram_generator::config[2775]: No configuration found. Mar 25 01:42:37.405468 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 25 01:42:37.527662 systemd[1]: Reloading finished in 322 ms. Mar 25 01:42:37.552400 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:42:37.567848 systemd[1]: kubelet.service: Deactivated successfully. Mar 25 01:42:37.568105 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:42:37.568188 systemd[1]: kubelet.service: Consumed 1.011s CPU time, 114M memory peak. Mar 25 01:42:37.570950 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Mar 25 01:42:37.734102 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Mar 25 01:42:37.746207 (kubelet)[2808]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Mar 25 01:42:37.823603 kubelet[2808]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:42:37.823603 kubelet[2808]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Mar 25 01:42:37.823603 kubelet[2808]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 25 01:42:37.823603 kubelet[2808]: I0325 01:42:37.823366 2808 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 25 01:42:37.833510 kubelet[2808]: I0325 01:42:37.833463 2808 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" Mar 25 01:42:37.833510 kubelet[2808]: I0325 01:42:37.833493 2808 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 25 01:42:37.833818 kubelet[2808]: I0325 01:42:37.833794 2808 server.go:929] "Client rotation is on, will bootstrap in background" Mar 25 01:42:37.835516 kubelet[2808]: I0325 01:42:37.835490 2808 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 25 01:42:37.840927 kubelet[2808]: I0325 01:42:37.840691 2808 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 25 01:42:37.846275 kubelet[2808]: I0325 01:42:37.846186 2808 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 25 01:42:37.851660 kubelet[2808]: I0325 01:42:37.850993 2808 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 25 01:42:37.851660 kubelet[2808]: I0325 01:42:37.851126 2808 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 25 01:42:37.851660 kubelet[2808]: I0325 01:42:37.851213 2808 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 25 01:42:37.851660 kubelet[2808]: I0325 01:42:37.851235 2808 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-2-22d395eace","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Mar 25 01:42:37.851918 kubelet[2808]: I0325 01:42:37.851507 2808 topology_manager.go:138] "Creating topology manager with none policy" Mar 25 01:42:37.851918 kubelet[2808]: I0325 01:42:37.851517 2808 container_manager_linux.go:300] "Creating device plugin manager" Mar 25 01:42:37.851918 kubelet[2808]: I0325 01:42:37.851552 2808 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:42:37.855691 kubelet[2808]: I0325 01:42:37.855454 2808 kubelet.go:408] "Attempting to sync node with API server" Mar 25 01:42:37.855691 kubelet[2808]: I0325 01:42:37.855479 2808 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 25 01:42:37.855691 kubelet[2808]: I0325 01:42:37.855528 2808 kubelet.go:314] "Adding apiserver pod source" Mar 25 01:42:37.855691 kubelet[2808]: I0325 01:42:37.855543 2808 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 25 01:42:37.864293 kubelet[2808]: I0325 01:42:37.861838 2808 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" Mar 25 01:42:37.864872 kubelet[2808]: I0325 01:42:37.864842 2808 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 25 01:42:37.878597 kubelet[2808]: I0325 01:42:37.877780 2808 server.go:1269] "Started kubelet" Mar 25 01:42:37.879111 kubelet[2808]: I0325 01:42:37.879063 2808 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 25 01:42:37.880956 kubelet[2808]: I0325 01:42:37.880944 2808 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 25 01:42:37.881204 kubelet[2808]: I0325 01:42:37.881182 2808 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 25 01:42:37.883126 kubelet[2808]: I0325 01:42:37.882213 2808 server.go:460] "Adding debug handlers to kubelet server" Mar 25 
01:42:37.886257 kubelet[2808]: I0325 01:42:37.885693 2808 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 25 01:42:37.886257 kubelet[2808]: I0325 01:42:37.885991 2808 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Mar 25 01:42:37.891086 kubelet[2808]: I0325 01:42:37.890627 2808 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 25 01:42:37.891086 kubelet[2808]: I0325 01:42:37.890762 2808 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 25 01:42:37.891086 kubelet[2808]: I0325 01:42:37.890969 2808 reconciler.go:26] "Reconciler: start to sync state" Mar 25 01:42:37.893218 kubelet[2808]: I0325 01:42:37.893189 2808 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 25 01:42:37.894793 kubelet[2808]: E0325 01:42:37.894763 2808 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 25 01:42:37.897429 kubelet[2808]: I0325 01:42:37.897412 2808 factory.go:221] Registration of the containerd container factory successfully Mar 25 01:42:37.897523 kubelet[2808]: I0325 01:42:37.897518 2808 factory.go:221] Registration of the systemd container factory successfully Mar 25 01:42:37.907167 kubelet[2808]: I0325 01:42:37.906989 2808 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 25 01:42:37.910333 kubelet[2808]: I0325 01:42:37.910104 2808 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 25 01:42:37.910393 kubelet[2808]: I0325 01:42:37.910379 2808 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 25 01:42:37.910416 kubelet[2808]: I0325 01:42:37.910397 2808 kubelet.go:2321] "Starting kubelet main sync loop" Mar 25 01:42:37.910481 kubelet[2808]: E0325 01:42:37.910456 2808 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 25 01:42:37.940965 kubelet[2808]: I0325 01:42:37.940941 2808 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 25 01:42:37.941415 kubelet[2808]: I0325 01:42:37.941134 2808 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 25 01:42:37.941415 kubelet[2808]: I0325 01:42:37.941153 2808 state_mem.go:36] "Initialized new in-memory state store" Mar 25 01:42:37.941415 kubelet[2808]: I0325 01:42:37.941339 2808 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 25 01:42:37.941415 kubelet[2808]: I0325 01:42:37.941348 2808 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 25 01:42:37.941415 kubelet[2808]: I0325 01:42:37.941367 2808 policy_none.go:49] "None policy: Start" Mar 25 01:42:37.942004 kubelet[2808]: I0325 01:42:37.941994 2808 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 25 01:42:37.942085 kubelet[2808]: I0325 01:42:37.942073 2808 state_mem.go:35] "Initializing new in-memory state store" Mar 25 01:42:37.942358 kubelet[2808]: I0325 01:42:37.942347 2808 state_mem.go:75] "Updated machine memory state" Mar 25 01:42:37.946125 kubelet[2808]: I0325 01:42:37.946093 2808 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 25 01:42:37.946263 kubelet[2808]: I0325 01:42:37.946244 2808 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 25 01:42:37.946325 kubelet[2808]: I0325 01:42:37.946261 2808 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 25 01:42:37.946547 kubelet[2808]: I0325 01:42:37.946526 2808 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 25 01:42:38.055720 kubelet[2808]: I0325 01:42:38.055321 2808 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.067480 kubelet[2808]: I0325 01:42:38.067421 2808 kubelet_node_status.go:111] "Node was previously registered" node="ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.067605 kubelet[2808]: I0325 01:42:38.067534 2808 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.092999 kubelet[2808]: I0325 01:42:38.092726 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.092999 kubelet[2808]: I0325 01:42:38.092760 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.092999 kubelet[2808]: I0325 01:42:38.092777 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace" Mar 25 
01:42:38.092999 kubelet[2808]: I0325 01:42:38.092791 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0255575bf95f06b48b4043160949dd08-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-2-22d395eace\" (UID: \"0255575bf95f06b48b4043160949dd08\") " pod="kube-system/kube-scheduler-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.092999 kubelet[2808]: I0325 01:42:38.092804 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/197df92524af76987bea640189ae4987-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-2-22d395eace\" (UID: \"197df92524af76987bea640189ae4987\") " pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.093218 kubelet[2808]: I0325 01:42:38.092817 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/197df92524af76987bea640189ae4987-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-2-22d395eace\" (UID: \"197df92524af76987bea640189ae4987\") " pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.093218 kubelet[2808]: I0325 01:42:38.092831 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/197df92524af76987bea640189ae4987-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-2-22d395eace\" (UID: \"197df92524af76987bea640189ae4987\") " pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.093218 kubelet[2808]: I0325 01:42:38.092846 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: 
\"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.093218 kubelet[2808]: I0325 01:42:38.092861 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4b33e2589c302fbb24afd0c8086a9508-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-2-22d395eace\" (UID: \"4b33e2589c302fbb24afd0c8086a9508\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.856700 kubelet[2808]: I0325 01:42:38.856566 2808 apiserver.go:52] "Watching apiserver" Mar 25 01:42:38.892302 kubelet[2808]: I0325 01:42:38.891001 2808 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 25 01:42:38.948109 kubelet[2808]: E0325 01:42:38.948011 2808 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-2-22d395eace\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace" Mar 25 01:42:38.991670 kubelet[2808]: I0325 01:42:38.991585 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-2-22d395eace" podStartSLOduration=0.991535286 podStartE2EDuration="991.535286ms" podCreationTimestamp="2025-03-25 01:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:42:38.977418963 +0000 UTC m=+1.224460332" watchObservedRunningTime="2025-03-25 01:42:38.991535286 +0000 UTC m=+1.238576665" Mar 25 01:42:39.014992 kubelet[2808]: I0325 01:42:39.013977 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-2-22d395eace" podStartSLOduration=1.013946315 podStartE2EDuration="1.013946315s" podCreationTimestamp="2025-03-25 01:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:42:38.992707766 +0000 UTC m=+1.239749135" watchObservedRunningTime="2025-03-25 01:42:39.013946315 +0000 UTC m=+1.260987684" Mar 25 01:42:39.043118 kubelet[2808]: I0325 01:42:39.042362 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-2-22d395eace" podStartSLOduration=1.042335355 podStartE2EDuration="1.042335355s" podCreationTimestamp="2025-03-25 01:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:42:39.016261493 +0000 UTC m=+1.263302862" watchObservedRunningTime="2025-03-25 01:42:39.042335355 +0000 UTC m=+1.289376703" Mar 25 01:42:42.790053 kubelet[2808]: I0325 01:42:42.789957 2808 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 25 01:42:42.791128 containerd[1510]: time="2025-03-25T01:42:42.790989782Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 25 01:42:42.791598 kubelet[2808]: I0325 01:42:42.791330 2808 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 25 01:42:43.562163 systemd[1]: Created slice kubepods-besteffort-pod421c58f0_6a8b_4f6b_9dd5_1d0145e49ad0.slice - libcontainer container kubepods-besteffort-pod421c58f0_6a8b_4f6b_9dd5_1d0145e49ad0.slice. 
Mar 25 01:42:43.626736 kubelet[2808]: I0325 01:42:43.626682 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0-kube-proxy\") pod \"kube-proxy-t5gfn\" (UID: \"421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0\") " pod="kube-system/kube-proxy-t5gfn" Mar 25 01:42:43.626736 kubelet[2808]: I0325 01:42:43.626726 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0-xtables-lock\") pod \"kube-proxy-t5gfn\" (UID: \"421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0\") " pod="kube-system/kube-proxy-t5gfn" Mar 25 01:42:43.626736 kubelet[2808]: I0325 01:42:43.626743 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0-lib-modules\") pod \"kube-proxy-t5gfn\" (UID: \"421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0\") " pod="kube-system/kube-proxy-t5gfn" Mar 25 01:42:43.626966 kubelet[2808]: I0325 01:42:43.626758 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm97m\" (UniqueName: \"kubernetes.io/projected/421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0-kube-api-access-gm97m\") pod \"kube-proxy-t5gfn\" (UID: \"421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0\") " pod="kube-system/kube-proxy-t5gfn" Mar 25 01:42:43.647083 sudo[1894]: pam_unix(sudo:session): session closed for user root Mar 25 01:42:43.769915 systemd[1]: Created slice kubepods-besteffort-pod6d4318ca_0c1a_400c_a962_4070b7bcb2d1.slice - libcontainer container kubepods-besteffort-pod6d4318ca_0c1a_400c_a962_4070b7bcb2d1.slice. 
Mar 25 01:42:43.804621 sshd[1893]: Connection closed by 139.178.68.195 port 55236 Mar 25 01:42:43.805920 sshd-session[1891]: pam_unix(sshd:session): session closed for user core Mar 25 01:42:43.810214 systemd[1]: sshd@6-37.27.205.216:22-139.178.68.195:55236.service: Deactivated successfully. Mar 25 01:42:43.813710 systemd[1]: session-7.scope: Deactivated successfully. Mar 25 01:42:43.814689 systemd[1]: session-7.scope: Consumed 6.096s CPU time, 155.3M memory peak. Mar 25 01:42:43.816483 systemd-logind[1499]: Session 7 logged out. Waiting for processes to exit. Mar 25 01:42:43.818118 systemd-logind[1499]: Removed session 7. Mar 25 01:42:43.828818 kubelet[2808]: I0325 01:42:43.828726 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6d4318ca-0c1a-400c-a962-4070b7bcb2d1-var-lib-calico\") pod \"tigera-operator-64ff5465b7-4dc9c\" (UID: \"6d4318ca-0c1a-400c-a962-4070b7bcb2d1\") " pod="tigera-operator/tigera-operator-64ff5465b7-4dc9c" Mar 25 01:42:43.828818 kubelet[2808]: I0325 01:42:43.828835 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fv8l\" (UniqueName: \"kubernetes.io/projected/6d4318ca-0c1a-400c-a962-4070b7bcb2d1-kube-api-access-8fv8l\") pod \"tigera-operator-64ff5465b7-4dc9c\" (UID: \"6d4318ca-0c1a-400c-a962-4070b7bcb2d1\") " pod="tigera-operator/tigera-operator-64ff5465b7-4dc9c" Mar 25 01:42:43.873020 containerd[1510]: time="2025-03-25T01:42:43.872864854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t5gfn,Uid:421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0,Namespace:kube-system,Attempt:0,}" Mar 25 01:42:43.904752 containerd[1510]: time="2025-03-25T01:42:43.904248610Z" level=info msg="connecting to shim 160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb" address="unix:///run/containerd/s/9e0de2bb3ebf51e7c59f8db290143b18debd4c9cb2e3d99804fef8fb52313a8e" 
namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:42:43.947556 systemd[1]: Started cri-containerd-160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb.scope - libcontainer container 160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb. Mar 25 01:42:44.003461 containerd[1510]: time="2025-03-25T01:42:44.003409685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t5gfn,Uid:421c58f0-6a8b-4f6b-9dd5-1d0145e49ad0,Namespace:kube-system,Attempt:0,} returns sandbox id \"160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb\"" Mar 25 01:42:44.006708 containerd[1510]: time="2025-03-25T01:42:44.006658238Z" level=info msg="CreateContainer within sandbox \"160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 25 01:42:44.019315 containerd[1510]: time="2025-03-25T01:42:44.018607235Z" level=info msg="Container c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:42:44.029808 containerd[1510]: time="2025-03-25T01:42:44.029686678Z" level=info msg="CreateContainer within sandbox \"160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b\"" Mar 25 01:42:44.030633 containerd[1510]: time="2025-03-25T01:42:44.030475023Z" level=info msg="StartContainer for \"c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b\"" Mar 25 01:42:44.032209 containerd[1510]: time="2025-03-25T01:42:44.032134304Z" level=info msg="connecting to shim c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b" address="unix:///run/containerd/s/9e0de2bb3ebf51e7c59f8db290143b18debd4c9cb2e3d99804fef8fb52313a8e" protocol=ttrpc version=3 Mar 25 01:42:44.051388 systemd[1]: Started 
cri-containerd-c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b.scope - libcontainer container c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b. Mar 25 01:42:44.073372 containerd[1510]: time="2025-03-25T01:42:44.073236474Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-4dc9c,Uid:6d4318ca-0c1a-400c-a962-4070b7bcb2d1,Namespace:tigera-operator,Attempt:0,}" Mar 25 01:42:44.096158 containerd[1510]: time="2025-03-25T01:42:44.095976978Z" level=info msg="StartContainer for \"c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b\" returns successfully" Mar 25 01:42:44.103843 containerd[1510]: time="2025-03-25T01:42:44.103754059Z" level=info msg="connecting to shim 562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb" address="unix:///run/containerd/s/2fc1518a171544630245ecb0f5d571d71f69d21f9bb42d6e5843b484985370d5" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:42:44.126439 systemd[1]: Started cri-containerd-562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb.scope - libcontainer container 562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb. Mar 25 01:42:44.182991 containerd[1510]: time="2025-03-25T01:42:44.182848322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-64ff5465b7-4dc9c,Uid:6d4318ca-0c1a-400c-a962-4070b7bcb2d1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb\"" Mar 25 01:42:44.183988 systemd[1]: Started sshd@8-37.27.205.216:22-187.110.238.50:36620.service - OpenSSH per-connection server daemon (187.110.238.50:36620). Mar 25 01:42:44.185879 containerd[1510]: time="2025-03-25T01:42:44.185563939Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\"" Mar 25 01:42:44.765670 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3574098745.mount: Deactivated successfully. 
Mar 25 01:42:44.964034 kubelet[2808]: I0325 01:42:44.963511 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-t5gfn" podStartSLOduration=1.9634838060000002 podStartE2EDuration="1.963483806s" podCreationTimestamp="2025-03-25 01:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:42:44.963402717 +0000 UTC m=+7.210444096" watchObservedRunningTime="2025-03-25 01:42:44.963483806 +0000 UTC m=+7.210525196" Mar 25 01:42:45.202054 sshd[3010]: Invalid user marcelo from 187.110.238.50 port 36620 Mar 25 01:42:45.389835 sshd[3010]: Received disconnect from 187.110.238.50 port 36620:11: Bye Bye [preauth] Mar 25 01:42:45.389835 sshd[3010]: Disconnected from invalid user marcelo 187.110.238.50 port 36620 [preauth] Mar 25 01:42:45.392621 systemd[1]: sshd@8-37.27.205.216:22-187.110.238.50:36620.service: Deactivated successfully. Mar 25 01:42:46.569169 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount183976600.mount: Deactivated successfully. 
Mar 25 01:42:46.946437 containerd[1510]: time="2025-03-25T01:42:46.946388521Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:46.947602 containerd[1510]: time="2025-03-25T01:42:46.947563508Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.5: active requests=0, bytes read=21945008"
Mar 25 01:42:46.949078 containerd[1510]: time="2025-03-25T01:42:46.949008379Z" level=info msg="ImageCreate event name:\"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:46.951655 containerd[1510]: time="2025-03-25T01:42:46.951527601Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:42:46.952277 containerd[1510]: time="2025-03-25T01:42:46.952027447Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.5\" with image id \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\", repo tag \"quay.io/tigera/operator:v1.36.5\", repo digest \"quay.io/tigera/operator@sha256:3341fa9475c0325b86228c8726389f9bae9fd6c430c66fe5cd5dc39d7bb6ad4b\", size \"21941003\" in 2.766441248s"
Mar 25 01:42:46.952277 containerd[1510]: time="2025-03-25T01:42:46.952054978Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.5\" returns image reference \"sha256:dc4a8a56c133edb1bc4c3d6bc94bcd96f2bde82413370cb1783ac2d7f3a46d53\""
Mar 25 01:42:46.954190 containerd[1510]: time="2025-03-25T01:42:46.954124224Z" level=info msg="CreateContainer within sandbox \"562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Mar 25 01:42:46.963381 containerd[1510]: time="2025-03-25T01:42:46.961966062Z" level=info msg="Container 5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:42:46.969902 containerd[1510]: time="2025-03-25T01:42:46.969862791Z" level=info msg="CreateContainer within sandbox \"562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847\""
Mar 25 01:42:46.970782 containerd[1510]: time="2025-03-25T01:42:46.970749918Z" level=info msg="StartContainer for \"5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847\""
Mar 25 01:42:46.972179 containerd[1510]: time="2025-03-25T01:42:46.972147252Z" level=info msg="connecting to shim 5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847" address="unix:///run/containerd/s/2fc1518a171544630245ecb0f5d571d71f69d21f9bb42d6e5843b484985370d5" protocol=ttrpc version=3
Mar 25 01:42:46.992396 systemd[1]: Started cri-containerd-5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847.scope - libcontainer container 5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847.
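The containerd entries above identify the pulled image three ways: by repo tag, by repo digest, and by image id, the latter two being sha256 content digests. As a toy illustration of content addressing (hashlib only, not containerd code; the blob below is a stand-in, not the real image config):

```python
import hashlib

# Stand-in for an image config blob; a real image id is the sha256 of the
# actual config JSON stored by the registry/containerd.
blob = b'{"architecture":"amd64","os":"linux"}'
digest = "sha256:" + hashlib.sha256(blob).hexdigest()
print(digest)
```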
Mar 25 01:42:47.024984 containerd[1510]: time="2025-03-25T01:42:47.024729292Z" level=info msg="StartContainer for \"5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847\" returns successfully"
Mar 25 01:42:50.001451 kubelet[2808]: I0325 01:42:50.001352 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-64ff5465b7-4dc9c" podStartSLOduration=4.233225447 podStartE2EDuration="7.001326027s" podCreationTimestamp="2025-03-25 01:42:43 +0000 UTC" firstStartedPulling="2025-03-25 01:42:44.184783869 +0000 UTC m=+6.431825219" lastFinishedPulling="2025-03-25 01:42:46.95288445 +0000 UTC m=+9.199925799" observedRunningTime="2025-03-25 01:42:47.976936491 +0000 UTC m=+10.223977870" watchObservedRunningTime="2025-03-25 01:42:50.001326027 +0000 UTC m=+12.248367386"
Mar 25 01:42:50.014010 systemd[1]: Created slice kubepods-besteffort-podbeea1c13_1126_4856_8678_6d1288c13344.slice - libcontainer container kubepods-besteffort-podbeea1c13_1126_4856_8678_6d1288c13344.slice.
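In the tigera-operator latency entry above, podStartSLOduration excludes image-pull time, so the gap between the E2E and SLO durations should equal the pull window, lastFinishedPulling minus firstStartedPulling. A quick sanity check using the m=+… monotonic offsets copied verbatim from the entry:

```python
import math

# Values taken from the kubelet pod_startup_latency_tracker entry above.
slo_duration = 4.233225447           # podStartSLOduration (seconds)
e2e_duration = 7.001326027           # podStartE2EDuration (seconds)
first_started_pulling = 6.431825219  # firstStartedPulling m=+6.431825219
last_finished_pulling = 9.199925799  # lastFinishedPulling m=+9.199925799

# SLO duration excludes image pulling, so the two differences should agree.
pull_window = last_finished_pulling - first_started_pulling
assert math.isclose(e2e_duration - slo_duration, pull_window, abs_tol=1e-9)
print(round(pull_window, 9))
```

The ~2.768 s pull window is consistent with containerd's own "in 2.766441248s" pull report a few entries earlier, plus a small amount of bookkeeping overhead.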
Mar 25 01:42:50.068799 kubelet[2808]: I0325 01:42:50.068654 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/beea1c13-1126-4856-8678-6d1288c13344-tigera-ca-bundle\") pod \"calico-typha-797cd4f474-8dxgc\" (UID: \"beea1c13-1126-4856-8678-6d1288c13344\") " pod="calico-system/calico-typha-797cd4f474-8dxgc"
Mar 25 01:42:50.068799 kubelet[2808]: I0325 01:42:50.068703 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/beea1c13-1126-4856-8678-6d1288c13344-typha-certs\") pod \"calico-typha-797cd4f474-8dxgc\" (UID: \"beea1c13-1126-4856-8678-6d1288c13344\") " pod="calico-system/calico-typha-797cd4f474-8dxgc"
Mar 25 01:42:50.068799 kubelet[2808]: I0325 01:42:50.068720 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp75m\" (UniqueName: \"kubernetes.io/projected/beea1c13-1126-4856-8678-6d1288c13344-kube-api-access-vp75m\") pod \"calico-typha-797cd4f474-8dxgc\" (UID: \"beea1c13-1126-4856-8678-6d1288c13344\") " pod="calico-system/calico-typha-797cd4f474-8dxgc"
Mar 25 01:42:50.213887 systemd[1]: Created slice kubepods-besteffort-podaf38d92c_ae35_4a97_a61b_d456c8bb8793.slice - libcontainer container kubepods-besteffort-podaf38d92c_ae35_4a97_a61b_d456c8bb8793.slice.
Mar 25 01:42:50.270895 kubelet[2808]: I0325 01:42:50.270771 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-cni-bin-dir\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271069 kubelet[2808]: I0325 01:42:50.271057 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/af38d92c-ae35-4a97-a61b-d456c8bb8793-node-certs\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271151 kubelet[2808]: I0325 01:42:50.271137 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-flexvol-driver-host\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271406 kubelet[2808]: I0325 01:42:50.271217 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-xtables-lock\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271406 kubelet[2808]: I0325 01:42:50.271236 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-policysync\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271406 kubelet[2808]: I0325 01:42:50.271254 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af38d92c-ae35-4a97-a61b-d456c8bb8793-tigera-ca-bundle\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271839 kubelet[2808]: I0325 01:42:50.271612 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-var-run-calico\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271839 kubelet[2808]: I0325 01:42:50.271648 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-cni-log-dir\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271839 kubelet[2808]: I0325 01:42:50.271665 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7pvj\" (UniqueName: \"kubernetes.io/projected/af38d92c-ae35-4a97-a61b-d456c8bb8793-kube-api-access-c7pvj\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271839 kubelet[2808]: I0325 01:42:50.271681 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-lib-modules\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.271839 kubelet[2808]: I0325 01:42:50.271703 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-var-lib-calico\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.272001 kubelet[2808]: I0325 01:42:50.271724 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/af38d92c-ae35-4a97-a61b-d456c8bb8793-cni-net-dir\") pod \"calico-node-zhqrm\" (UID: \"af38d92c-ae35-4a97-a61b-d456c8bb8793\") " pod="calico-system/calico-node-zhqrm"
Mar 25 01:42:50.319076 containerd[1510]: time="2025-03-25T01:42:50.318750703Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797cd4f474-8dxgc,Uid:beea1c13-1126-4856-8678-6d1288c13344,Namespace:calico-system,Attempt:0,}"
Mar 25 01:42:50.358372 containerd[1510]: time="2025-03-25T01:42:50.357915837Z" level=info msg="connecting to shim ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f" address="unix:///run/containerd/s/f6dfb1040e2f93a4269bce032eeb025eae2ad717dadcfe7cfc5f8e941d93accf" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:42:50.383000 kubelet[2808]: E0325 01:42:50.382973 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:42:50.383388 kubelet[2808]: W0325 01:42:50.383332 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:42:50.383388 kubelet[2808]: E0325 01:42:50.383359 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 25 01:42:50.399937 kubelet[2808]: E0325 01:42:50.399158 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d"
Mar 25 01:42:50.415416 systemd[1]: Started cri-containerd-ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f.scope - libcontainer container ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f.
Mar 25 01:42:50.508647 containerd[1510]: time="2025-03-25T01:42:50.508602297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-797cd4f474-8dxgc,Uid:beea1c13-1126-4856-8678-6d1288c13344,Namespace:calico-system,Attempt:0,} returns sandbox id \"ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f\""
Mar 25 01:42:50.512436 containerd[1510]: time="2025-03-25T01:42:50.512403977Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\""
Mar 25 01:42:50.522619 containerd[1510]: time="2025-03-25T01:42:50.521729826Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zhqrm,Uid:af38d92c-ae35-4a97-a61b-d456c8bb8793,Namespace:calico-system,Attempt:0,}"
Mar 25 01:42:50.546713 containerd[1510]: time="2025-03-25T01:42:50.546640719Z" level=info msg="connecting to shim ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef" address="unix:///run/containerd/s/aceee51305f88e069eedbf14f9091e5a9652a84bb9c26020fe60f71df2b9280d" namespace=k8s.io protocol=ttrpc version=3
Mar 25 01:42:50.571885 systemd[1]: Started cri-containerd-ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef.scope - libcontainer container ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef.
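The recurring driver-call errors in this window come from kubelet probing /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ for FlexVolume drivers: the nodeagent~uds driver binary is absent, so each `init` call yields empty output, which kubelet then fails to unmarshal as the JSON status object a driver is expected to print. A minimal Python analogue of that failure mode (Go's encoding/json reports it as "unexpected end of JSON input"; Python's json phrases it differently):

```python
import json

# A FlexVolume driver is expected to answer `init` with a JSON status object,
# e.g. {"status": "Success", "capabilities": {"attach": false}}.
driver_stdout = ""  # the binary was not found, so the call produced no output

try:
    json.loads(driver_stdout)
    probe_error = None
except json.JSONDecodeError as exc:
    # Same class of failure kubelet logs; Python reports "Expecting value".
    probe_error = str(exc)

print(probe_error)
```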
Mar 25 01:42:50.575601 kubelet[2808]: I0325 01:42:50.575521 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/16e59d76-015e-4d72-b105-76b3e3dd930d-socket-dir\") pod \"csi-node-driver-j9x72\" (UID: \"16e59d76-015e-4d72-b105-76b3e3dd930d\") " pod="calico-system/csi-node-driver-j9x72"
Mar 25 01:42:50.576607 kubelet[2808]: I0325 01:42:50.576569 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16e59d76-015e-4d72-b105-76b3e3dd930d-kubelet-dir\") pod \"csi-node-driver-j9x72\" (UID: \"16e59d76-015e-4d72-b105-76b3e3dd930d\") " pod="calico-system/csi-node-driver-j9x72"
Mar 25 01:42:50.579466 kubelet[2808]: I0325 01:42:50.579399 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/16e59d76-015e-4d72-b105-76b3e3dd930d-registration-dir\") pod \"csi-node-driver-j9x72\" (UID: \"16e59d76-015e-4d72-b105-76b3e3dd930d\") " pod="calico-system/csi-node-driver-j9x72"
Mar 25 01:42:50.580409 kubelet[2808]: I0325 01:42:50.580387 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spmb\" (UniqueName: \"kubernetes.io/projected/16e59d76-015e-4d72-b105-76b3e3dd930d-kube-api-access-5spmb\") pod \"csi-node-driver-j9x72\" (UID: \"16e59d76-015e-4d72-b105-76b3e3dd930d\") " pod="calico-system/csi-node-driver-j9x72"
Mar 25 01:42:50.581433 kubelet[2808]: I0325 01:42:50.581382 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/16e59d76-015e-4d72-b105-76b3e3dd930d-varrun\") pod \"csi-node-driver-j9x72\" (UID: \"16e59d76-015e-4d72-b105-76b3e3dd930d\") " pod="calico-system/csi-node-driver-j9x72"
Mar 25 01:42:50.582888 kubelet[2808]: E0325 01:42:50.582864 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 25 01:42:50.582888 kubelet[2808]: W0325 01:42:50.582882 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 25 01:42:50.582888 kubelet[2808]: E0325 01:42:50.582890 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:50.583779 kubelet[2808]: E0325 01:42:50.583755 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:50.583779 kubelet[2808]: W0325 01:42:50.583774 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:50.583779 kubelet[2808]: E0325 01:42:50.583787 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:50.584419 kubelet[2808]: E0325 01:42:50.584403 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:50.584419 kubelet[2808]: W0325 01:42:50.584416 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:50.584495 kubelet[2808]: E0325 01:42:50.584425 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:50.585069 kubelet[2808]: E0325 01:42:50.585048 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:50.585069 kubelet[2808]: W0325 01:42:50.585061 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:50.585069 kubelet[2808]: E0325 01:42:50.585070 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:50.620155 containerd[1510]: time="2025-03-25T01:42:50.620062611Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-zhqrm,Uid:af38d92c-ae35-4a97-a61b-d456c8bb8793,Namespace:calico-system,Attempt:0,} returns sandbox id \"ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef\"" Mar 25 01:42:50.684905 kubelet[2808]: E0325 01:42:50.684646 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:50.684905 kubelet[2808]: W0325 01:42:50.684689 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:50.684905 kubelet[2808]: E0325 01:42:50.684710 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:51.396321 kubelet[2808]: E0325 01:42:51.396288 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:51.396434 kubelet[2808]: W0325 01:42:51.396340 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:51.396434 kubelet[2808]: E0325 01:42:51.396358 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:51.911799 kubelet[2808]: E0325 01:42:51.910956 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d" Mar 25 01:42:53.911509 kubelet[2808]: E0325 01:42:53.911428 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d" Mar 25 01:42:54.027115 containerd[1510]: time="2025-03-25T01:42:54.027048205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:42:54.028387 containerd[1510]: time="2025-03-25T01:42:54.028315469Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.2: active requests=0, bytes read=30414075" Mar 25 01:42:54.029926 containerd[1510]: time="2025-03-25T01:42:54.029856528Z" level=info msg="ImageCreate 
event name:\"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:42:54.032536 containerd[1510]: time="2025-03-25T01:42:54.032491703Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:42:54.033338 containerd[1510]: time="2025-03-25T01:42:54.033098391Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.2\" with image id \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:9839fd34b4c1bad50beed72aec59c64893487a46eea57dc2d7d66c3041d7bcce\", size \"31907171\" in 3.52066513s" Mar 25 01:42:54.033338 containerd[1510]: time="2025-03-25T01:42:54.033131241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.2\" returns image reference \"sha256:1d6f9d005866d74e6f0a8b0b8b743d0eaf4efcb7c7032fd2215da9c6ca131cb5\"" Mar 25 01:42:54.034698 containerd[1510]: time="2025-03-25T01:42:54.034679984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\"" Mar 25 01:42:54.048618 containerd[1510]: time="2025-03-25T01:42:54.048183540Z" level=info msg="CreateContainer within sandbox \"ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 25 01:42:54.058556 containerd[1510]: time="2025-03-25T01:42:54.058518540Z" level=info msg="Container a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:42:54.069085 containerd[1510]: time="2025-03-25T01:42:54.069036697Z" level=info msg="CreateContainer within sandbox \"ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f\" for 
&ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c\"" Mar 25 01:42:54.069710 containerd[1510]: time="2025-03-25T01:42:54.069680654Z" level=info msg="StartContainer for \"a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c\"" Mar 25 01:42:54.070699 containerd[1510]: time="2025-03-25T01:42:54.070668554Z" level=info msg="connecting to shim a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c" address="unix:///run/containerd/s/f6dfb1040e2f93a4269bce032eeb025eae2ad717dadcfe7cfc5f8e941d93accf" protocol=ttrpc version=3 Mar 25 01:42:54.092437 systemd[1]: Started cri-containerd-a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c.scope - libcontainer container a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c. Mar 25 01:42:54.142483 containerd[1510]: time="2025-03-25T01:42:54.142427240Z" level=info msg="StartContainer for \"a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c\" returns successfully" Mar 25 01:42:55.027216 kubelet[2808]: E0325 01:42:55.027070 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.027216 kubelet[2808]: W0325 01:42:55.027098 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.027216 kubelet[2808]: E0325 01:42:55.027121 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.028094 kubelet[2808]: E0325 01:42:55.027954 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.028094 kubelet[2808]: W0325 01:42:55.027975 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.028094 kubelet[2808]: E0325 01:42:55.027992 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.029960 kubelet[2808]: E0325 01:42:55.028411 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.029960 kubelet[2808]: W0325 01:42:55.028427 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.029960 kubelet[2808]: E0325 01:42:55.028442 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.029960 kubelet[2808]: E0325 01:42:55.028647 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.029960 kubelet[2808]: W0325 01:42:55.028662 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.029960 kubelet[2808]: E0325 01:42:55.028689 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.032531 kubelet[2808]: E0325 01:42:55.032346 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.032531 kubelet[2808]: W0325 01:42:55.032364 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.032531 kubelet[2808]: E0325 01:42:55.032402 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.032968 kubelet[2808]: E0325 01:42:55.032590 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.032968 kubelet[2808]: W0325 01:42:55.032600 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.032968 kubelet[2808]: E0325 01:42:55.032612 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.032968 kubelet[2808]: E0325 01:42:55.032753 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.032968 kubelet[2808]: W0325 01:42:55.032763 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.032968 kubelet[2808]: E0325 01:42:55.032774 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.034081 kubelet[2808]: E0325 01:42:55.033127 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034081 kubelet[2808]: W0325 01:42:55.033141 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034081 kubelet[2808]: E0325 01:42:55.033156 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.034081 kubelet[2808]: E0325 01:42:55.033404 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034081 kubelet[2808]: W0325 01:42:55.033413 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034081 kubelet[2808]: E0325 01:42:55.033422 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.034081 kubelet[2808]: E0325 01:42:55.033534 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034081 kubelet[2808]: W0325 01:42:55.033539 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034081 kubelet[2808]: E0325 01:42:55.033546 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.034081 kubelet[2808]: E0325 01:42:55.033644 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034455 kubelet[2808]: W0325 01:42:55.033652 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034455 kubelet[2808]: E0325 01:42:55.033658 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.034455 kubelet[2808]: E0325 01:42:55.033751 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034455 kubelet[2808]: W0325 01:42:55.033756 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034455 kubelet[2808]: E0325 01:42:55.033762 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.034455 kubelet[2808]: E0325 01:42:55.033864 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034455 kubelet[2808]: W0325 01:42:55.033869 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034455 kubelet[2808]: E0325 01:42:55.033877 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.034455 kubelet[2808]: E0325 01:42:55.034371 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034455 kubelet[2808]: W0325 01:42:55.034392 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034825 kubelet[2808]: E0325 01:42:55.034402 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.034825 kubelet[2808]: E0325 01:42:55.034508 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.034825 kubelet[2808]: W0325 01:42:55.034515 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.034825 kubelet[2808]: E0325 01:42:55.034521 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.038765 kubelet[2808]: E0325 01:42:55.035188 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.038765 kubelet[2808]: W0325 01:42:55.035195 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.038765 kubelet[2808]: E0325 01:42:55.035203 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.038765 kubelet[2808]: E0325 01:42:55.035615 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.038765 kubelet[2808]: W0325 01:42:55.035622 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.038765 kubelet[2808]: E0325 01:42:55.035630 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.038765 kubelet[2808]: E0325 01:42:55.035746 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.038765 kubelet[2808]: W0325 01:42:55.035751 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.038765 kubelet[2808]: E0325 01:42:55.035758 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.038765 kubelet[2808]: E0325 01:42:55.035876 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039114 kubelet[2808]: W0325 01:42:55.035881 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039114 kubelet[2808]: E0325 01:42:55.035887 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.039114 kubelet[2808]: E0325 01:42:55.035986 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039114 kubelet[2808]: W0325 01:42:55.035992 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039114 kubelet[2808]: E0325 01:42:55.035998 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.039114 kubelet[2808]: E0325 01:42:55.036087 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039114 kubelet[2808]: W0325 01:42:55.036093 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039114 kubelet[2808]: E0325 01:42:55.036100 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.039114 kubelet[2808]: E0325 01:42:55.036217 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039114 kubelet[2808]: W0325 01:42:55.036223 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039638 kubelet[2808]: E0325 01:42:55.036230 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.039638 kubelet[2808]: E0325 01:42:55.036527 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039638 kubelet[2808]: W0325 01:42:55.036533 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039638 kubelet[2808]: E0325 01:42:55.036540 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.039638 kubelet[2808]: E0325 01:42:55.036660 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039638 kubelet[2808]: W0325 01:42:55.036666 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039638 kubelet[2808]: E0325 01:42:55.036673 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.039638 kubelet[2808]: E0325 01:42:55.036770 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039638 kubelet[2808]: W0325 01:42:55.036776 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039638 kubelet[2808]: E0325 01:42:55.036782 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.039954 kubelet[2808]: E0325 01:42:55.036878 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039954 kubelet[2808]: W0325 01:42:55.036884 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039954 kubelet[2808]: E0325 01:42:55.036891 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.039954 kubelet[2808]: E0325 01:42:55.036994 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039954 kubelet[2808]: W0325 01:42:55.037000 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039954 kubelet[2808]: E0325 01:42:55.037006 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.039954 kubelet[2808]: E0325 01:42:55.037330 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.039954 kubelet[2808]: W0325 01:42:55.037338 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.039954 kubelet[2808]: E0325 01:42:55.037345 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.039954 kubelet[2808]: E0325 01:42:55.038038 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.040401 kubelet[2808]: W0325 01:42:55.038047 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.040401 kubelet[2808]: E0325 01:42:55.038055 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.040401 kubelet[2808]: E0325 01:42:55.038164 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.040401 kubelet[2808]: W0325 01:42:55.038170 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.040401 kubelet[2808]: E0325 01:42:55.038177 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.040401 kubelet[2808]: E0325 01:42:55.038313 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.040401 kubelet[2808]: W0325 01:42:55.038320 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.040401 kubelet[2808]: E0325 01:42:55.038327 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.040401 kubelet[2808]: E0325 01:42:55.038642 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.040401 kubelet[2808]: W0325 01:42:55.038649 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.042720 kubelet[2808]: E0325 01:42:55.038655 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:55.042720 kubelet[2808]: E0325 01:42:55.039158 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:55.042720 kubelet[2808]: W0325 01:42:55.039168 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:55.042720 kubelet[2808]: E0325 01:42:55.039177 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:55.912546 kubelet[2808]: E0325 01:42:55.911680 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d" Mar 25 01:42:55.984212 kubelet[2808]: I0325 01:42:55.984173 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:42:56.008169 containerd[1510]: time="2025-03-25T01:42:56.008112506Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:42:56.009469 containerd[1510]: time="2025-03-25T01:42:56.009396854Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2: active requests=0, bytes read=5364011" Mar 25 01:42:56.010691 containerd[1510]: time="2025-03-25T01:42:56.010613426Z" level=info msg="ImageCreate event name:\"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:42:56.012524 containerd[1510]: time="2025-03-25T01:42:56.012466162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:42:56.012977 containerd[1510]: time="2025-03-25T01:42:56.012837899Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" with image id \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:51d9341a4a37e278a906f40ecc73f5076e768612c21621f1b1d4f2b2f0735a1d\", size 
\"6857075\" in 1.978066366s" Mar 25 01:42:56.012977 containerd[1510]: time="2025-03-25T01:42:56.012863175Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.2\" returns image reference \"sha256:441bf8ace5b7fa3742b7fafaf6cd60fea340dd307169a18c75a1d78cba3a8365\"" Mar 25 01:42:56.015168 containerd[1510]: time="2025-03-25T01:42:56.015128853Z" level=info msg="CreateContainer within sandbox \"ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 25 01:42:56.037861 containerd[1510]: time="2025-03-25T01:42:56.036338490Z" level=info msg="Container 03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:42:56.043515 kubelet[2808]: E0325 01:42:56.043325 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.043515 kubelet[2808]: W0325 01:42:56.043385 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.043515 kubelet[2808]: E0325 01:42:56.043416 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.043783 kubelet[2808]: E0325 01:42:56.043707 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.043783 kubelet[2808]: W0325 01:42:56.043721 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.043783 kubelet[2808]: E0325 01:42:56.043737 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.043955 kubelet[2808]: E0325 01:42:56.043940 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.043987 kubelet[2808]: W0325 01:42:56.043957 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.043987 kubelet[2808]: E0325 01:42:56.043971 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.044216 kubelet[2808]: E0325 01:42:56.044189 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.044216 kubelet[2808]: W0325 01:42:56.044209 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.044300 kubelet[2808]: E0325 01:42:56.044222 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.044522 kubelet[2808]: E0325 01:42:56.044485 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.044522 kubelet[2808]: W0325 01:42:56.044498 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.044522 kubelet[2808]: E0325 01:42:56.044507 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.044857 kubelet[2808]: E0325 01:42:56.044708 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.044857 kubelet[2808]: W0325 01:42:56.044715 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.044857 kubelet[2808]: E0325 01:42:56.044722 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.044857 kubelet[2808]: E0325 01:42:56.044850 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.044857 kubelet[2808]: W0325 01:42:56.044857 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.045208 kubelet[2808]: E0325 01:42:56.044863 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.045208 kubelet[2808]: E0325 01:42:56.044989 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.045208 kubelet[2808]: W0325 01:42:56.044995 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.045208 kubelet[2808]: E0325 01:42:56.045002 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.045208 kubelet[2808]: E0325 01:42:56.045142 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.045208 kubelet[2808]: W0325 01:42:56.045148 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.045208 kubelet[2808]: E0325 01:42:56.045154 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.045555 kubelet[2808]: E0325 01:42:56.045277 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.045555 kubelet[2808]: W0325 01:42:56.045285 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.045555 kubelet[2808]: E0325 01:42:56.045308 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.045555 kubelet[2808]: E0325 01:42:56.045460 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.045555 kubelet[2808]: W0325 01:42:56.045466 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.045555 kubelet[2808]: E0325 01:42:56.045473 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.045810 kubelet[2808]: E0325 01:42:56.045591 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.045810 kubelet[2808]: W0325 01:42:56.045597 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.045810 kubelet[2808]: E0325 01:42:56.045603 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.045810 kubelet[2808]: E0325 01:42:56.045737 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.045810 kubelet[2808]: W0325 01:42:56.045743 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.045810 kubelet[2808]: E0325 01:42:56.045750 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.046050 kubelet[2808]: E0325 01:42:56.045878 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.046050 kubelet[2808]: W0325 01:42:56.045883 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.046050 kubelet[2808]: E0325 01:42:56.045890 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.046050 kubelet[2808]: E0325 01:42:56.046017 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.046050 kubelet[2808]: W0325 01:42:56.046023 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.046050 kubelet[2808]: E0325 01:42:56.046030 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.052838 containerd[1510]: time="2025-03-25T01:42:56.052805092Z" level=info msg="CreateContainer within sandbox \"ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c\"" Mar 25 01:42:56.058648 containerd[1510]: time="2025-03-25T01:42:56.058190811Z" level=info msg="StartContainer for \"03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c\"" Mar 25 01:42:56.060640 containerd[1510]: time="2025-03-25T01:42:56.060577772Z" level=info msg="connecting to shim 03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c" address="unix:///run/containerd/s/aceee51305f88e069eedbf14f9091e5a9652a84bb9c26020fe60f71df2b9280d" protocol=ttrpc version=3 Mar 25 01:42:56.084533 systemd[1]: Started cri-containerd-03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c.scope - libcontainer container 03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c. Mar 25 01:42:56.137962 containerd[1510]: time="2025-03-25T01:42:56.137897517Z" level=info msg="StartContainer for \"03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c\" returns successfully" Mar 25 01:42:56.144215 kubelet[2808]: E0325 01:42:56.144080 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.144215 kubelet[2808]: W0325 01:42:56.144100 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.144215 kubelet[2808]: E0325 01:42:56.144118 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.144752 kubelet[2808]: E0325 01:42:56.144539 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.144752 kubelet[2808]: W0325 01:42:56.144547 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.144752 kubelet[2808]: E0325 01:42:56.144565 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.144752 kubelet[2808]: E0325 01:42:56.144737 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.144752 kubelet[2808]: W0325 01:42:56.144743 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.145018 kubelet[2808]: E0325 01:42:56.144891 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.145255 kubelet[2808]: E0325 01:42:56.145195 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.145255 kubelet[2808]: W0325 01:42:56.145204 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.145255 kubelet[2808]: E0325 01:42:56.145220 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.145641 kubelet[2808]: E0325 01:42:56.145486 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.145641 kubelet[2808]: W0325 01:42:56.145494 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.145641 kubelet[2808]: E0325 01:42:56.145516 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.145893 kubelet[2808]: E0325 01:42:56.145808 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.145893 kubelet[2808]: W0325 01:42:56.145816 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.146042 kubelet[2808]: E0325 01:42:56.145959 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.146042 kubelet[2808]: E0325 01:42:56.146028 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.146042 kubelet[2808]: W0325 01:42:56.146033 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.146319 kubelet[2808]: E0325 01:42:56.146248 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 25 01:42:56.146473 kubelet[2808]: E0325 01:42:56.146403 2808 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 25 01:42:56.146473 kubelet[2808]: W0325 01:42:56.146411 2808 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 25 01:42:56.146567 kubelet[2808]: E0325 01:42:56.146545 2808 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 25 01:42:56.148456 systemd[1]: cri-containerd-03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c.scope: Deactivated successfully. Mar 25 01:42:56.186089 containerd[1510]: time="2025-03-25T01:42:56.186030106Z" level=info msg="received exit event container_id:\"03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c\" id:\"03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c\" pid:3484 exited_at:{seconds:1742866976 nanos:152089475}" Mar 25 01:42:56.231756 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c-rootfs.mount: Deactivated successfully. 
Mar 25 01:42:56.243580 containerd[1510]: time="2025-03-25T01:42:56.243530387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c\" id:\"03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c\" pid:3484 exited_at:{seconds:1742866976 nanos:152089475}" Mar 25 01:42:57.020519 containerd[1510]: time="2025-03-25T01:42:57.018666982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\"" Mar 25 01:42:57.065411 kubelet[2808]: I0325 01:42:57.063986 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-797cd4f474-8dxgc" podStartSLOduration=4.541789049 podStartE2EDuration="8.063957345s" podCreationTimestamp="2025-03-25 01:42:49 +0000 UTC" firstStartedPulling="2025-03-25 01:42:50.511980138 +0000 UTC m=+12.759021487" lastFinishedPulling="2025-03-25 01:42:54.034148435 +0000 UTC m=+16.281189783" observedRunningTime="2025-03-25 01:42:55.057093216 +0000 UTC m=+17.304134566" watchObservedRunningTime="2025-03-25 01:42:57.063957345 +0000 UTC m=+19.310998764" Mar 25 01:42:57.913677 kubelet[2808]: E0325 01:42:57.913595 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d" Mar 25 01:42:59.911520 kubelet[2808]: E0325 01:42:59.911397 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d" Mar 25 01:43:01.916597 kubelet[2808]: E0325 01:43:01.916548 2808 pod_workers.go:1301] "Error syncing pod, skipping" 
err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d" Mar 25 01:43:02.701259 containerd[1510]: time="2025-03-25T01:43:02.700743833Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:02.702310 containerd[1510]: time="2025-03-25T01:43:02.702219249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.2: active requests=0, bytes read=97781477" Mar 25 01:43:02.704025 containerd[1510]: time="2025-03-25T01:43:02.703920523Z" level=info msg="ImageCreate event name:\"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:02.706945 containerd[1510]: time="2025-03-25T01:43:02.706860737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:02.708517 containerd[1510]: time="2025-03-25T01:43:02.707714445Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.2\" with image id \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:890e1db6ae363695cfc23ffae4d612cc85cdd99d759bd539af6683969d0c3c25\", size \"99274581\" in 5.688949932s" Mar 25 01:43:02.708517 containerd[1510]: time="2025-03-25T01:43:02.707779756Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.2\" returns image reference \"sha256:cda13293c895a8a3b06c1e190b70fb6fe61036db2e59764036fc6e65ec374693\"" Mar 25 01:43:02.711630 containerd[1510]: time="2025-03-25T01:43:02.711578145Z" level=info 
msg="CreateContainer within sandbox \"ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 25 01:43:02.794980 containerd[1510]: time="2025-03-25T01:43:02.791726703Z" level=info msg="Container 582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:02.796940 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3230546191.mount: Deactivated successfully. Mar 25 01:43:02.812630 containerd[1510]: time="2025-03-25T01:43:02.812560417Z" level=info msg="CreateContainer within sandbox \"ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0\"" Mar 25 01:43:02.814619 containerd[1510]: time="2025-03-25T01:43:02.814537330Z" level=info msg="StartContainer for \"582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0\"" Mar 25 01:43:02.821575 containerd[1510]: time="2025-03-25T01:43:02.821501159Z" level=info msg="connecting to shim 582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0" address="unix:///run/containerd/s/aceee51305f88e069eedbf14f9091e5a9652a84bb9c26020fe60f71df2b9280d" protocol=ttrpc version=3 Mar 25 01:43:02.896410 systemd[1]: Started cri-containerd-582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0.scope - libcontainer container 582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0. Mar 25 01:43:02.963417 containerd[1510]: time="2025-03-25T01:43:02.962563581Z" level=info msg="StartContainer for \"582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0\" returns successfully" Mar 25 01:43:03.460067 systemd[1]: cri-containerd-582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0.scope: Deactivated successfully. 
Mar 25 01:43:03.460647 systemd[1]: cri-containerd-582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0.scope: Consumed 502ms CPU time, 150.9M memory peak, 4.7M read from disk, 154M written to disk. Mar 25 01:43:03.466790 containerd[1510]: time="2025-03-25T01:43:03.466487093Z" level=info msg="TaskExit event in podsandbox handler container_id:\"582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0\" id:\"582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0\" pid:3557 exited_at:{seconds:1742866983 nanos:464420974}" Mar 25 01:43:03.466790 containerd[1510]: time="2025-03-25T01:43:03.466626541Z" level=info msg="received exit event container_id:\"582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0\" id:\"582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0\" pid:3557 exited_at:{seconds:1742866983 nanos:464420974}" Mar 25 01:43:03.506534 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0-rootfs.mount: Deactivated successfully. Mar 25 01:43:03.593972 kubelet[2808]: I0325 01:43:03.593905 2808 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Mar 25 01:43:03.644411 systemd[1]: Created slice kubepods-burstable-pod58f0f3d1_dfb1_49eb_96d2_b92341d12cba.slice - libcontainer container kubepods-burstable-pod58f0f3d1_dfb1_49eb_96d2_b92341d12cba.slice. 
Mar 25 01:43:03.655444 kubelet[2808]: W0325 01:43:03.654733 2808 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4284-0-0-2-22d395eace" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4284-0-0-2-22d395eace' and this object Mar 25 01:43:03.655444 kubelet[2808]: E0325 01:43:03.654788 2808 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4284-0-0-2-22d395eace\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4284-0-0-2-22d395eace' and this object" logger="UnhandledError" Mar 25 01:43:03.655444 kubelet[2808]: W0325 01:43:03.654821 2808 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-2-22d395eace" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4284-0-0-2-22d395eace' and this object Mar 25 01:43:03.655444 kubelet[2808]: E0325 01:43:03.654838 2808 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4284-0-0-2-22d395eace\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4284-0-0-2-22d395eace' and this object" logger="UnhandledError" Mar 25 01:43:03.655448 systemd[1]: Created slice kubepods-besteffort-pod0bc14d37_3e76_4377_aa16_ba03020be65d.slice - libcontainer container 
kubepods-besteffort-pod0bc14d37_3e76_4377_aa16_ba03020be65d.slice. Mar 25 01:43:03.665033 systemd[1]: Created slice kubepods-burstable-pod1dbd8038_f679_40f3_bcdd_33eeca1fe31f.slice - libcontainer container kubepods-burstable-pod1dbd8038_f679_40f3_bcdd_33eeca1fe31f.slice. Mar 25 01:43:03.685237 systemd[1]: Created slice kubepods-besteffort-podf3ec862d_50d5_495c_8b60_3b6369bda216.slice - libcontainer container kubepods-besteffort-podf3ec862d_50d5_495c_8b60_3b6369bda216.slice. Mar 25 01:43:03.697404 systemd[1]: Created slice kubepods-besteffort-pod44a52fe5_6229_44f0_a7aa_4765242ee6fa.slice - libcontainer container kubepods-besteffort-pod44a52fe5_6229_44f0_a7aa_4765242ee6fa.slice. Mar 25 01:43:03.811374 kubelet[2808]: I0325 01:43:03.810902 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f3ec862d-50d5-495c-8b60-3b6369bda216-calico-apiserver-certs\") pod \"calico-apiserver-f6c47cd8c-ztc7d\" (UID: \"f3ec862d-50d5-495c-8b60-3b6369bda216\") " pod="calico-apiserver/calico-apiserver-f6c47cd8c-ztc7d" Mar 25 01:43:03.811374 kubelet[2808]: I0325 01:43:03.810966 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/44a52fe5-6229-44f0-a7aa-4765242ee6fa-calico-apiserver-certs\") pod \"calico-apiserver-f6c47cd8c-596j7\" (UID: \"44a52fe5-6229-44f0-a7aa-4765242ee6fa\") " pod="calico-apiserver/calico-apiserver-f6c47cd8c-596j7" Mar 25 01:43:03.811374 kubelet[2808]: I0325 01:43:03.810996 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br9p2\" (UniqueName: \"kubernetes.io/projected/58f0f3d1-dfb1-49eb-96d2-b92341d12cba-kube-api-access-br9p2\") pod \"coredns-6f6b679f8f-gdgtb\" (UID: \"58f0f3d1-dfb1-49eb-96d2-b92341d12cba\") " pod="kube-system/coredns-6f6b679f8f-gdgtb" Mar 25 01:43:03.811374 
kubelet[2808]: I0325 01:43:03.811033 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1dbd8038-f679-40f3-bcdd-33eeca1fe31f-config-volume\") pod \"coredns-6f6b679f8f-2zp2f\" (UID: \"1dbd8038-f679-40f3-bcdd-33eeca1fe31f\") " pod="kube-system/coredns-6f6b679f8f-2zp2f" Mar 25 01:43:03.811374 kubelet[2808]: I0325 01:43:03.811056 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94fhc\" (UniqueName: \"kubernetes.io/projected/44a52fe5-6229-44f0-a7aa-4765242ee6fa-kube-api-access-94fhc\") pod \"calico-apiserver-f6c47cd8c-596j7\" (UID: \"44a52fe5-6229-44f0-a7aa-4765242ee6fa\") " pod="calico-apiserver/calico-apiserver-f6c47cd8c-596j7" Mar 25 01:43:03.811929 kubelet[2808]: I0325 01:43:03.811083 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssvfl\" (UniqueName: \"kubernetes.io/projected/f3ec862d-50d5-495c-8b60-3b6369bda216-kube-api-access-ssvfl\") pod \"calico-apiserver-f6c47cd8c-ztc7d\" (UID: \"f3ec862d-50d5-495c-8b60-3b6369bda216\") " pod="calico-apiserver/calico-apiserver-f6c47cd8c-ztc7d" Mar 25 01:43:03.811929 kubelet[2808]: I0325 01:43:03.811124 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bc14d37-3e76-4377-aa16-ba03020be65d-tigera-ca-bundle\") pod \"calico-kube-controllers-6c89bd67d9-shfkm\" (UID: \"0bc14d37-3e76-4377-aa16-ba03020be65d\") " pod="calico-system/calico-kube-controllers-6c89bd67d9-shfkm" Mar 25 01:43:03.811929 kubelet[2808]: I0325 01:43:03.811146 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7mqx\" (UniqueName: \"kubernetes.io/projected/0bc14d37-3e76-4377-aa16-ba03020be65d-kube-api-access-p7mqx\") pod 
\"calico-kube-controllers-6c89bd67d9-shfkm\" (UID: \"0bc14d37-3e76-4377-aa16-ba03020be65d\") " pod="calico-system/calico-kube-controllers-6c89bd67d9-shfkm" Mar 25 01:43:03.811929 kubelet[2808]: I0325 01:43:03.811177 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58f0f3d1-dfb1-49eb-96d2-b92341d12cba-config-volume\") pod \"coredns-6f6b679f8f-gdgtb\" (UID: \"58f0f3d1-dfb1-49eb-96d2-b92341d12cba\") " pod="kube-system/coredns-6f6b679f8f-gdgtb" Mar 25 01:43:03.811929 kubelet[2808]: I0325 01:43:03.811198 2808 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpffm\" (UniqueName: \"kubernetes.io/projected/1dbd8038-f679-40f3-bcdd-33eeca1fe31f-kube-api-access-bpffm\") pod \"coredns-6f6b679f8f-2zp2f\" (UID: \"1dbd8038-f679-40f3-bcdd-33eeca1fe31f\") " pod="kube-system/coredns-6f6b679f8f-2zp2f" Mar 25 01:43:03.866506 systemd[1]: Started sshd@9-37.27.205.216:22-147.182.248.67:40988.service - OpenSSH per-connection server daemon (147.182.248.67:40988). Mar 25 01:43:03.944015 systemd[1]: Created slice kubepods-besteffort-pod16e59d76_015e_4d72_b105_76b3e3dd930d.slice - libcontainer container kubepods-besteffort-pod16e59d76_015e_4d72_b105_76b3e3dd930d.slice. 
Mar 25 01:43:03.969835 containerd[1510]: time="2025-03-25T01:43:03.969493315Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c89bd67d9-shfkm,Uid:0bc14d37-3e76-4377-aa16-ba03020be65d,Namespace:calico-system,Attempt:0,}" Mar 25 01:43:03.976282 containerd[1510]: time="2025-03-25T01:43:03.975308245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2zp2f,Uid:1dbd8038-f679-40f3-bcdd-33eeca1fe31f,Namespace:kube-system,Attempt:0,}" Mar 25 01:43:03.978798 containerd[1510]: time="2025-03-25T01:43:03.978762351Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9x72,Uid:16e59d76-015e-4d72-b105-76b3e3dd930d,Namespace:calico-system,Attempt:0,}" Mar 25 01:43:04.041832 containerd[1510]: time="2025-03-25T01:43:04.041800413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\"" Mar 25 01:43:04.164750 containerd[1510]: time="2025-03-25T01:43:04.164616476Z" level=error msg="Failed to destroy network for sandbox \"1ea613638b16c375eba1e1e635cda7781ab81dc294bea8d30de76494d0bbb618\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.167131 containerd[1510]: time="2025-03-25T01:43:04.167080022Z" level=error msg="Failed to destroy network for sandbox \"b66ff7f2e82f8db7e89f2111f93417cefb3fd78bbbafd7a651fe890a3374e245\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.184932 containerd[1510]: time="2025-03-25T01:43:04.170290420Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9x72,Uid:16e59d76-015e-4d72-b105-76b3e3dd930d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b66ff7f2e82f8db7e89f2111f93417cefb3fd78bbbafd7a651fe890a3374e245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.185213 containerd[1510]: time="2025-03-25T01:43:04.174317667Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2zp2f,Uid:1dbd8038-f679-40f3-bcdd-33eeca1fe31f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ea613638b16c375eba1e1e635cda7781ab81dc294bea8d30de76494d0bbb618\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.193134 containerd[1510]: time="2025-03-25T01:43:04.182668841Z" level=error msg="Failed to destroy network for sandbox \"a6c1e8f830dcae31d66beb9b08f9f969dea33cd09e80887ef3d8ff9642beef16\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.194654 kubelet[2808]: E0325 01:43:04.194206 2808 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ea613638b16c375eba1e1e635cda7781ab81dc294bea8d30de76494d0bbb618\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.194654 kubelet[2808]: E0325 01:43:04.194296 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ea613638b16c375eba1e1e635cda7781ab81dc294bea8d30de76494d0bbb618\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-2zp2f"
Mar 25 01:43:04.194654 kubelet[2808]: E0325 01:43:04.194320 2808 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ea613638b16c375eba1e1e635cda7781ab81dc294bea8d30de76494d0bbb618\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-2zp2f"
Mar 25 01:43:04.194795 kubelet[2808]: E0325 01:43:04.194357 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-2zp2f_kube-system(1dbd8038-f679-40f3-bcdd-33eeca1fe31f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-2zp2f_kube-system(1dbd8038-f679-40f3-bcdd-33eeca1fe31f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1ea613638b16c375eba1e1e635cda7781ab81dc294bea8d30de76494d0bbb618\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-2zp2f" podUID="1dbd8038-f679-40f3-bcdd-33eeca1fe31f"
Mar 25 01:43:04.194795 kubelet[2808]: E0325 01:43:04.194421 2808 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b66ff7f2e82f8db7e89f2111f93417cefb3fd78bbbafd7a651fe890a3374e245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 01:43:04.194795 kubelet[2808]: E0325 01:43:04.194511 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b66ff7f2e82f8db7e89f2111f93417cefb3fd78bbbafd7a651fe890a3374e245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-j9x72"
Mar 25 01:43:04.194905 containerd[1510]: time="2025-03-25T01:43:04.194677398Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c89bd67d9-shfkm,Uid:0bc14d37-3e76-4377-aa16-ba03020be65d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6c1e8f830dcae31d66beb9b08f9f969dea33cd09e80887ef3d8ff9642beef16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 01:43:04.194950 kubelet[2808]: E0325 01:43:04.194529 2808 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b66ff7f2e82f8db7e89f2111f93417cefb3fd78bbbafd7a651fe890a3374e245\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-j9x72"
Mar 25 01:43:04.194950 kubelet[2808]: E0325 01:43:04.194571 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-j9x72_calico-system(16e59d76-015e-4d72-b105-76b3e3dd930d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-j9x72_calico-system(16e59d76-015e-4d72-b105-76b3e3dd930d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b66ff7f2e82f8db7e89f2111f93417cefb3fd78bbbafd7a651fe890a3374e245\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-j9x72" podUID="16e59d76-015e-4d72-b105-76b3e3dd930d"
Mar 25 01:43:04.194950 kubelet[2808]: E0325 01:43:04.194843 2808 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6c1e8f830dcae31d66beb9b08f9f969dea33cd09e80887ef3d8ff9642beef16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 01:43:04.195032 kubelet[2808]: E0325 01:43:04.194865 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6c1e8f830dcae31d66beb9b08f9f969dea33cd09e80887ef3d8ff9642beef16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c89bd67d9-shfkm"
Mar 25 01:43:04.195032 kubelet[2808]: E0325 01:43:04.194879 2808 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a6c1e8f830dcae31d66beb9b08f9f969dea33cd09e80887ef3d8ff9642beef16\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6c89bd67d9-shfkm"
Mar 25 01:43:04.195032 kubelet[2808]: E0325 01:43:04.194900 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6c89bd67d9-shfkm_calico-system(0bc14d37-3e76-4377-aa16-ba03020be65d)\" with
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6c89bd67d9-shfkm_calico-system(0bc14d37-3e76-4377-aa16-ba03020be65d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a6c1e8f830dcae31d66beb9b08f9f969dea33cd09e80887ef3d8ff9642beef16\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6c89bd67d9-shfkm" podUID="0bc14d37-3e76-4377-aa16-ba03020be65d" Mar 25 01:43:04.250510 containerd[1510]: time="2025-03-25T01:43:04.250458283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gdgtb,Uid:58f0f3d1-dfb1-49eb-96d2-b92341d12cba,Namespace:kube-system,Attempt:0,}" Mar 25 01:43:04.317527 containerd[1510]: time="2025-03-25T01:43:04.317466260Z" level=error msg="Failed to destroy network for sandbox \"412071f869f671256c841f7a0d5f25b0c90ea10f8e0c376ea0ab7530753c0e72\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.319375 containerd[1510]: time="2025-03-25T01:43:04.319332342Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gdgtb,Uid:58f0f3d1-dfb1-49eb-96d2-b92341d12cba,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"412071f869f671256c841f7a0d5f25b0c90ea10f8e0c376ea0ab7530753c0e72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.319553 kubelet[2808]: E0325 01:43:04.319528 2808 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"412071f869f671256c841f7a0d5f25b0c90ea10f8e0c376ea0ab7530753c0e72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:04.319620 kubelet[2808]: E0325 01:43:04.319578 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"412071f869f671256c841f7a0d5f25b0c90ea10f8e0c376ea0ab7530753c0e72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-gdgtb" Mar 25 01:43:04.319620 kubelet[2808]: E0325 01:43:04.319595 2808 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"412071f869f671256c841f7a0d5f25b0c90ea10f8e0c376ea0ab7530753c0e72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-gdgtb" Mar 25 01:43:04.319694 kubelet[2808]: E0325 01:43:04.319636 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-gdgtb_kube-system(58f0f3d1-dfb1-49eb-96d2-b92341d12cba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-gdgtb_kube-system(58f0f3d1-dfb1-49eb-96d2-b92341d12cba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"412071f869f671256c841f7a0d5f25b0c90ea10f8e0c376ea0ab7530753c0e72\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-gdgtb" 
podUID="58f0f3d1-dfb1-49eb-96d2-b92341d12cba" Mar 25 01:43:04.354314 sshd[3586]: Connection closed by 147.182.248.67 port 40988 [preauth] Mar 25 01:43:04.356159 systemd[1]: sshd@9-37.27.205.216:22-147.182.248.67:40988.service: Deactivated successfully. Mar 25 01:43:04.912815 kubelet[2808]: E0325 01:43:04.912768 2808 secret.go:188] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Mar 25 01:43:04.913928 kubelet[2808]: E0325 01:43:04.913908 2808 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44a52fe5-6229-44f0-a7aa-4765242ee6fa-calico-apiserver-certs podName:44a52fe5-6229-44f0-a7aa-4765242ee6fa nodeName:}" failed. No retries permitted until 2025-03-25 01:43:05.412852167 +0000 UTC m=+27.659893516 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/44a52fe5-6229-44f0-a7aa-4765242ee6fa-calico-apiserver-certs") pod "calico-apiserver-f6c47cd8c-596j7" (UID: "44a52fe5-6229-44f0-a7aa-4765242ee6fa") : failed to sync secret cache: timed out waiting for the condition Mar 25 01:43:04.920211 kubelet[2808]: E0325 01:43:04.920188 2808 secret.go:188] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Mar 25 01:43:04.921092 kubelet[2808]: E0325 01:43:04.920242 2808 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3ec862d-50d5-495c-8b60-3b6369bda216-calico-apiserver-certs podName:f3ec862d-50d5-495c-8b60-3b6369bda216 nodeName:}" failed. No retries permitted until 2025-03-25 01:43:05.420226143 +0000 UTC m=+27.667267492 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/f3ec862d-50d5-495c-8b60-3b6369bda216-calico-apiserver-certs") pod "calico-apiserver-f6c47cd8c-ztc7d" (UID: "f3ec862d-50d5-495c-8b60-3b6369bda216") : failed to sync secret cache: timed out waiting for the condition
Mar 25 01:43:04.949646 systemd[1]: run-netns-cni\x2d65935b3d\x2d2628\x2da992\x2dfe18\x2dd9bc05a50d4d.mount: Deactivated successfully.
Mar 25 01:43:04.949762 systemd[1]: run-netns-cni\x2d3a608b28\x2d2655\x2d3251\x2d40f8\x2dff6aee339f61.mount: Deactivated successfully.
Mar 25 01:43:04.949828 systemd[1]: run-netns-cni\x2d41a49b34\x2d3adc\x2d9e6e\x2dd948\x2d3dca0e83680c.mount: Deactivated successfully.
Mar 25 01:43:04.956581 kubelet[2808]: E0325 01:43:04.956536 2808 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 25 01:43:04.956581 kubelet[2808]: E0325 01:43:04.956578 2808 projected.go:194] Error preparing data for projected volume kube-api-access-ssvfl for pod calico-apiserver/calico-apiserver-f6c47cd8c-ztc7d: failed to sync configmap cache: timed out waiting for the condition
Mar 25 01:43:04.956822 kubelet[2808]: E0325 01:43:04.956654 2808 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f3ec862d-50d5-495c-8b60-3b6369bda216-kube-api-access-ssvfl podName:f3ec862d-50d5-495c-8b60-3b6369bda216 nodeName:}" failed. No retries permitted until 2025-03-25 01:43:05.456632349 +0000 UTC m=+27.703673698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ssvfl" (UniqueName: "kubernetes.io/projected/f3ec862d-50d5-495c-8b60-3b6369bda216-kube-api-access-ssvfl") pod "calico-apiserver-f6c47cd8c-ztc7d" (UID: "f3ec862d-50d5-495c-8b60-3b6369bda216") : failed to sync configmap cache: timed out waiting for the condition
Mar 25 01:43:04.964033 kubelet[2808]: E0325 01:43:04.963946 2808 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 25 01:43:04.964033 kubelet[2808]: E0325 01:43:04.963989 2808 projected.go:194] Error preparing data for projected volume kube-api-access-94fhc for pod calico-apiserver/calico-apiserver-f6c47cd8c-596j7: failed to sync configmap cache: timed out waiting for the condition
Mar 25 01:43:04.964291 kubelet[2808]: E0325 01:43:04.964054 2808 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/44a52fe5-6229-44f0-a7aa-4765242ee6fa-kube-api-access-94fhc podName:44a52fe5-6229-44f0-a7aa-4765242ee6fa nodeName:}" failed. No retries permitted until 2025-03-25 01:43:05.464031462 +0000 UTC m=+27.711072811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-94fhc" (UniqueName: "kubernetes.io/projected/44a52fe5-6229-44f0-a7aa-4765242ee6fa-kube-api-access-94fhc") pod "calico-apiserver-f6c47cd8c-596j7" (UID: "44a52fe5-6229-44f0-a7aa-4765242ee6fa") : failed to sync configmap cache: timed out waiting for the condition
Mar 25 01:43:05.797622 containerd[1510]: time="2025-03-25T01:43:05.797417641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-ztc7d,Uid:f3ec862d-50d5-495c-8b60-3b6369bda216,Namespace:calico-apiserver,Attempt:0,}"
Mar 25 01:43:05.801389 containerd[1510]: time="2025-03-25T01:43:05.800684324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-596j7,Uid:44a52fe5-6229-44f0-a7aa-4765242ee6fa,Namespace:calico-apiserver,Attempt:0,}"
Mar 25 01:43:05.907136 containerd[1510]: time="2025-03-25T01:43:05.906686536Z" level=error msg="Failed to destroy network for sandbox \"f4509751357faa694e57256bcfc911ca29c9e24becbcbda49ffde211af6357e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 01:43:05.908948 containerd[1510]: time="2025-03-25T01:43:05.908825113Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-ztc7d,Uid:f3ec862d-50d5-495c-8b60-3b6369bda216,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4509751357faa694e57256bcfc911ca29c9e24becbcbda49ffde211af6357e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Mar 25 01:43:05.909299 kubelet[2808]: E0325 01:43:05.909110 2808 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox
\"f4509751357faa694e57256bcfc911ca29c9e24becbcbda49ffde211af6357e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:05.909299 kubelet[2808]: E0325 01:43:05.909172 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4509751357faa694e57256bcfc911ca29c9e24becbcbda49ffde211af6357e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f6c47cd8c-ztc7d" Mar 25 01:43:05.909299 kubelet[2808]: E0325 01:43:05.909194 2808 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4509751357faa694e57256bcfc911ca29c9e24becbcbda49ffde211af6357e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f6c47cd8c-ztc7d" Mar 25 01:43:05.909729 kubelet[2808]: E0325 01:43:05.909242 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f6c47cd8c-ztc7d_calico-apiserver(f3ec862d-50d5-495c-8b60-3b6369bda216)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f6c47cd8c-ztc7d_calico-apiserver(f3ec862d-50d5-495c-8b60-3b6369bda216)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f4509751357faa694e57256bcfc911ca29c9e24becbcbda49ffde211af6357e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-f6c47cd8c-ztc7d" podUID="f3ec862d-50d5-495c-8b60-3b6369bda216" Mar 25 01:43:05.915365 containerd[1510]: time="2025-03-25T01:43:05.915235341Z" level=error msg="Failed to destroy network for sandbox \"69dd2e51dc1106a879c29accb6d409e42ca4947e110041fb5a4e3a71f64209f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:05.916506 containerd[1510]: time="2025-03-25T01:43:05.916476447Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-596j7,Uid:44a52fe5-6229-44f0-a7aa-4765242ee6fa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"69dd2e51dc1106a879c29accb6d409e42ca4947e110041fb5a4e3a71f64209f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:05.916608 kubelet[2808]: E0325 01:43:05.916582 2808 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69dd2e51dc1106a879c29accb6d409e42ca4947e110041fb5a4e3a71f64209f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 25 01:43:05.916838 kubelet[2808]: E0325 01:43:05.916622 2808 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69dd2e51dc1106a879c29accb6d409e42ca4947e110041fb5a4e3a71f64209f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-f6c47cd8c-596j7" Mar 25 01:43:05.916838 kubelet[2808]: E0325 01:43:05.916637 2808 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"69dd2e51dc1106a879c29accb6d409e42ca4947e110041fb5a4e3a71f64209f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f6c47cd8c-596j7" Mar 25 01:43:05.916838 kubelet[2808]: E0325 01:43:05.916669 2808 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f6c47cd8c-596j7_calico-apiserver(44a52fe5-6229-44f0-a7aa-4765242ee6fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f6c47cd8c-596j7_calico-apiserver(44a52fe5-6229-44f0-a7aa-4765242ee6fa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"69dd2e51dc1106a879c29accb6d409e42ca4947e110041fb5a4e3a71f64209f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f6c47cd8c-596j7" podUID="44a52fe5-6229-44f0-a7aa-4765242ee6fa" Mar 25 01:43:11.450006 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2326112501.mount: Deactivated successfully. 
Mar 25 01:43:11.751537 containerd[1510]: time="2025-03-25T01:43:11.724920406Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.2: active requests=0, bytes read=142241445"
Mar 25 01:43:11.753977 containerd[1510]: time="2025-03-25T01:43:11.703805650Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:43:11.876303 containerd[1510]: time="2025-03-25T01:43:11.876227348Z" level=info msg="ImageCreate event name:\"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:43:11.879769 containerd[1510]: time="2025-03-25T01:43:11.879718898Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Mar 25 01:43:11.882077 containerd[1510]: time="2025-03-25T01:43:11.882040660Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.2\" with image id \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:d9a21be37fe591ee5ab5a2e3dc26408ea165a44a55705102ffaa002de9908b32\", size \"142241307\" in 7.838499252s"
Mar 25 01:43:11.882189 containerd[1510]: time="2025-03-25T01:43:11.882171733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.2\" returns image reference \"sha256:048bf7af1f8c697d151dbecc478a18e89d89ed8627da98e17a56c11b3d45d351\""
Mar 25 01:43:11.977445 containerd[1510]: time="2025-03-25T01:43:11.977375903Z" level=info msg="CreateContainer within sandbox \"ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
Mar 25 01:43:12.051563 containerd[1510]: time="2025-03-25T01:43:12.051132612Z" level=info msg="Container 60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48: CDI devices from CRI Config.CDIDevices: []"
Mar 25 01:43:12.092307 containerd[1510]: time="2025-03-25T01:43:12.092205162Z" level=info msg="CreateContainer within sandbox \"ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\""
Mar 25 01:43:12.098423 containerd[1510]: time="2025-03-25T01:43:12.098359078Z" level=info msg="StartContainer for \"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\""
Mar 25 01:43:12.103380 containerd[1510]: time="2025-03-25T01:43:12.103315605Z" level=info msg="connecting to shim 60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48" address="unix:///run/containerd/s/aceee51305f88e069eedbf14f9091e5a9652a84bb9c26020fe60f71df2b9280d" protocol=ttrpc version=3
Mar 25 01:43:12.291465 systemd[1]: Started cri-containerd-60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48.scope - libcontainer container 60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48.
Mar 25 01:43:12.422110 containerd[1510]: time="2025-03-25T01:43:12.421872973Z" level=info msg="StartContainer for \"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" returns successfully"
Mar 25 01:43:12.487248 kubelet[2808]: I0325 01:43:12.487202 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:43:12.507331 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information.
Mar 25 01:43:12.509313 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved.
Mar 25 01:43:14.100834 kubelet[2808]: I0325 01:43:14.100785 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:43:14.358512 kernel: bpftool[3967]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Mar 25 01:43:14.649963 systemd-networkd[1415]: vxlan.calico: Link UP
Mar 25 01:43:14.649997 systemd-networkd[1415]: vxlan.calico: Gained carrier
Mar 25 01:43:15.913649 containerd[1510]: time="2025-03-25T01:43:15.912679460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2zp2f,Uid:1dbd8038-f679-40f3-bcdd-33eeca1fe31f,Namespace:kube-system,Attempt:0,}"
Mar 25 01:43:16.014484 systemd-networkd[1415]: vxlan.calico: Gained IPv6LL
Mar 25 01:43:16.306991 systemd-networkd[1415]: calid2c0e247624: Link UP
Mar 25 01:43:16.309634 systemd-networkd[1415]: calid2c0e247624: Gained carrier
Mar 25 01:43:16.337314 containerd[1510]: 2025-03-25 01:43:16.028 [INFO][4042] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0 coredns-6f6b679f8f- kube-system 1dbd8038-f679-40f3-bcdd-33eeca1fe31f 667 0 2025-03-25 01:42:43 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-2-22d395eace coredns-6f6b679f8f-2zp2f eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid2c0e247624 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-"
Mar 25 01:43:16.337314 containerd[1510]: 2025-03-25 01:43:16.029 [INFO][4042] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0"
Mar 25 01:43:16.337314 containerd[1510]: 2025-03-25 01:43:16.228 [INFO][4056] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" HandleID="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Workload="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0"
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.255 [INFO][4056] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" HandleID="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Workload="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319130), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-2-22d395eace", "pod":"coredns-6f6b679f8f-2zp2f", "timestamp":"2025-03-25 01:43:16.228413649 +0000 UTC"}, Hostname:"ci-4284-0-0-2-22d395eace", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.255 [INFO][4056] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.255 [INFO][4056] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.255 [INFO][4056] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-2-22d395eace'
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.259 [INFO][4056] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.268 [INFO][4056] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.275 [INFO][4056] ipam/ipam.go 489: Trying affinity for 192.168.24.192/26 host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.278 [INFO][4056] ipam/ipam.go 155: Attempting to load block cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337563 containerd[1510]: 2025-03-25 01:43:16.281 [INFO][4056] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337900 containerd[1510]: 2025-03-25 01:43:16.281 [INFO][4056] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.24.192/26 handle="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337900 containerd[1510]: 2025-03-25 01:43:16.284 [INFO][4056] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77
Mar 25 01:43:16.337900 containerd[1510]: 2025-03-25 01:43:16.290 [INFO][4056] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.24.192/26 handle="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337900 containerd[1510]: 2025-03-25 01:43:16.297 [INFO][4056] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.24.193/26] block=192.168.24.192/26 handle="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337900 containerd[1510]: 2025-03-25 01:43:16.297 [INFO][4056] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.24.193/26] handle="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" host="ci-4284-0-0-2-22d395eace"
Mar 25 01:43:16.337900 containerd[1510]: 2025-03-25 01:43:16.297 [INFO][4056] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Mar 25 01:43:16.337900 containerd[1510]: 2025-03-25 01:43:16.297 [INFO][4056] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.24.193/26] IPv6=[] ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" HandleID="k8s-pod-network.d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Workload="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0"
Mar 25 01:43:16.339593 containerd[1510]: 2025-03-25 01:43:16.301 [INFO][4042] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"1dbd8038-f679-40f3-bcdd-33eeca1fe31f", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"", Pod:"coredns-6f6b679f8f-2zp2f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.24.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid2c0e247624", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}}
Mar 25 01:43:16.339593 containerd[1510]: 2025-03-25 01:43:16.301 [INFO][4042] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.24.193/32] ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0"
Mar 25 01:43:16.339593 containerd[1510]: 2025-03-25 01:43:16.301 [INFO][4042] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid2c0e247624 ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0"
Mar 25 01:43:16.339593 containerd[1510]: 2025-03-25 01:43:16.307 [INFO][4042] cni-plugin/dataplane_linux.go 508: Disabling
IPv4 forwarding ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0" Mar 25 01:43:16.339593 containerd[1510]: 2025-03-25 01:43:16.308 [INFO][4042] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"1dbd8038-f679-40f3-bcdd-33eeca1fe31f", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77", Pod:"coredns-6f6b679f8f-2zp2f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.24.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid2c0e247624", MAC:"76:8e:89:34:bb:e5", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:16.339593 containerd[1510]: 2025-03-25 01:43:16.329 [INFO][4042] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" Namespace="kube-system" Pod="coredns-6f6b679f8f-2zp2f" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--2zp2f-eth0" Mar 25 01:43:16.355222 kubelet[2808]: I0325 01:43:16.351066 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-zhqrm" podStartSLOduration=5.041629854 podStartE2EDuration="26.326365398s" podCreationTimestamp="2025-03-25 01:42:50 +0000 UTC" firstStartedPulling="2025-03-25 01:42:50.622371597 +0000 UTC m=+12.869412946" lastFinishedPulling="2025-03-25 01:43:11.907107141 +0000 UTC m=+34.154148490" observedRunningTime="2025-03-25 01:43:13.100098422 +0000 UTC m=+35.347139812" watchObservedRunningTime="2025-03-25 01:43:16.326365398 +0000 UTC m=+38.573406768" Mar 25 01:43:16.471601 containerd[1510]: time="2025-03-25T01:43:16.470933853Z" level=info msg="connecting to shim d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77" address="unix:///run/containerd/s/cc976c3c60806d38043d1386754756b5e27a348dd82e0e84b67376ff75687cc4" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:43:16.510007 systemd[1]: Started cri-containerd-d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77.scope - libcontainer container 
d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77. Mar 25 01:43:16.587735 containerd[1510]: time="2025-03-25T01:43:16.587554337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-2zp2f,Uid:1dbd8038-f679-40f3-bcdd-33eeca1fe31f,Namespace:kube-system,Attempt:0,} returns sandbox id \"d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77\"" Mar 25 01:43:16.601296 containerd[1510]: time="2025-03-25T01:43:16.599868775Z" level=info msg="CreateContainer within sandbox \"d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:43:16.621589 containerd[1510]: time="2025-03-25T01:43:16.621540842Z" level=info msg="Container 65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:16.630529 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1257664507.mount: Deactivated successfully. Mar 25 01:43:16.633557 containerd[1510]: time="2025-03-25T01:43:16.633500751Z" level=info msg="CreateContainer within sandbox \"d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f\"" Mar 25 01:43:16.634213 containerd[1510]: time="2025-03-25T01:43:16.634092920Z" level=info msg="StartContainer for \"65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f\"" Mar 25 01:43:16.637099 containerd[1510]: time="2025-03-25T01:43:16.636647739Z" level=info msg="connecting to shim 65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f" address="unix:///run/containerd/s/cc976c3c60806d38043d1386754756b5e27a348dd82e0e84b67376ff75687cc4" protocol=ttrpc version=3 Mar 25 01:43:16.667533 systemd[1]: Started cri-containerd-65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f.scope - libcontainer container 
65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f. Mar 25 01:43:16.722633 containerd[1510]: time="2025-03-25T01:43:16.722583094Z" level=info msg="StartContainer for \"65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f\" returns successfully" Mar 25 01:43:16.913121 containerd[1510]: time="2025-03-25T01:43:16.912627481Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9x72,Uid:16e59d76-015e-4d72-b105-76b3e3dd930d,Namespace:calico-system,Attempt:0,}" Mar 25 01:43:17.089081 systemd-networkd[1415]: cali132a044fa0d: Link UP Mar 25 01:43:17.089202 systemd-networkd[1415]: cali132a044fa0d: Gained carrier Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:16.984 [INFO][4151] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0 csi-node-driver- calico-system 16e59d76-015e-4d72-b105-76b3e3dd930d 581 0 2025-03-25 01:42:50 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:568c96974f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-2-22d395eace csi-node-driver-j9x72 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali132a044fa0d [] []}} ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:16.984 [INFO][4151] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" 
WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.027 [INFO][4165] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" HandleID="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Workload="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.043 [INFO][4165] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" HandleID="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Workload="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ba5b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-2-22d395eace", "pod":"csi-node-driver-j9x72", "timestamp":"2025-03-25 01:43:17.02725078 +0000 UTC"}, Hostname:"ci-4284-0-0-2-22d395eace", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.043 [INFO][4165] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.043 [INFO][4165] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.043 [INFO][4165] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-2-22d395eace' Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.047 [INFO][4165] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.052 [INFO][4165] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.057 [INFO][4165] ipam/ipam.go 489: Trying affinity for 192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.060 [INFO][4165] ipam/ipam.go 155: Attempting to load block cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.063 [INFO][4165] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.063 [INFO][4165] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.24.192/26 handle="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.066 [INFO][4165] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542 Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.072 [INFO][4165] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.24.192/26 handle="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.079 [INFO][4165] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.24.194/26] block=192.168.24.192/26 handle="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.079 [INFO][4165] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.24.194/26] handle="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.079 [INFO][4165] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:43:17.125872 containerd[1510]: 2025-03-25 01:43:17.079 [INFO][4165] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.24.194/26] IPv6=[] ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" HandleID="k8s-pod-network.0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Workload="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" Mar 25 01:43:17.134813 containerd[1510]: 2025-03-25 01:43:17.085 [INFO][4151] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"16e59d76-015e-4d72-b105-76b3e3dd930d", ResourceVersion:"581", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"", Pod:"csi-node-driver-j9x72", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.24.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali132a044fa0d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:17.134813 containerd[1510]: 2025-03-25 01:43:17.085 [INFO][4151] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.24.194/32] ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" Mar 25 01:43:17.134813 containerd[1510]: 2025-03-25 01:43:17.085 [INFO][4151] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali132a044fa0d ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" Mar 25 01:43:17.134813 containerd[1510]: 2025-03-25 01:43:17.087 [INFO][4151] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" Mar 25 01:43:17.134813 containerd[1510]: 2025-03-25 01:43:17.088 
[INFO][4151] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"16e59d76-015e-4d72-b105-76b3e3dd930d", ResourceVersion:"581", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"568c96974f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542", Pod:"csi-node-driver-j9x72", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.24.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali132a044fa0d", MAC:"86:31:72:b8:1e:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:17.134813 containerd[1510]: 2025-03-25 01:43:17.121 [INFO][4151] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" Namespace="calico-system" Pod="csi-node-driver-j9x72" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-csi--node--driver--j9x72-eth0" Mar 25 01:43:17.164437 kubelet[2808]: I0325 01:43:17.163023 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-2zp2f" podStartSLOduration=34.163000238 podStartE2EDuration="34.163000238s" podCreationTimestamp="2025-03-25 01:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:43:17.151074378 +0000 UTC m=+39.398115727" watchObservedRunningTime="2025-03-25 01:43:17.163000238 +0000 UTC m=+39.410041597" Mar 25 01:43:17.188900 containerd[1510]: time="2025-03-25T01:43:17.188480101Z" level=info msg="connecting to shim 0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542" address="unix:///run/containerd/s/740550c7aceeec1c17f4dfa7c586d4356f547d4913c4e50a1c41e146c2d77a65" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:43:17.260440 systemd[1]: Started cri-containerd-0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542.scope - libcontainer container 0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542. 
Mar 25 01:43:17.292770 containerd[1510]: time="2025-03-25T01:43:17.292731269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-j9x72,Uid:16e59d76-015e-4d72-b105-76b3e3dd930d,Namespace:calico-system,Attempt:0,} returns sandbox id \"0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542\"" Mar 25 01:43:17.294601 containerd[1510]: time="2025-03-25T01:43:17.294577755Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\"" Mar 25 01:43:17.806784 systemd-networkd[1415]: calid2c0e247624: Gained IPv6LL Mar 25 01:43:17.915861 containerd[1510]: time="2025-03-25T01:43:17.915797655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gdgtb,Uid:58f0f3d1-dfb1-49eb-96d2-b92341d12cba,Namespace:kube-system,Attempt:0,}" Mar 25 01:43:18.072862 systemd-networkd[1415]: cali288a0000ae9: Link UP Mar 25 01:43:18.074874 systemd-networkd[1415]: cali288a0000ae9: Gained carrier Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:17.991 [INFO][4230] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0 coredns-6f6b679f8f- kube-system 58f0f3d1-dfb1-49eb-96d2-b92341d12cba 663 0 2025-03-25 01:42:43 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-2-22d395eace coredns-6f6b679f8f-gdgtb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali288a0000ae9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:17.992 [INFO][4230] cni-plugin/k8s.go 77: Extracted 
identifiers for CmdAddK8s ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.028 [INFO][4241] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" HandleID="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Workload="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.037 [INFO][4241] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" HandleID="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Workload="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000336c50), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-2-22d395eace", "pod":"coredns-6f6b679f8f-gdgtb", "timestamp":"2025-03-25 01:43:18.028543326 +0000 UTC"}, Hostname:"ci-4284-0-0-2-22d395eace", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.037 [INFO][4241] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.037 [INFO][4241] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.037 [INFO][4241] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-2-22d395eace' Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.039 [INFO][4241] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.043 [INFO][4241] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.047 [INFO][4241] ipam/ipam.go 489: Trying affinity for 192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.049 [INFO][4241] ipam/ipam.go 155: Attempting to load block cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.051 [INFO][4241] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.051 [INFO][4241] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.24.192/26 handle="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.053 [INFO][4241] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.058 [INFO][4241] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.24.192/26 handle="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.065 [INFO][4241] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.24.195/26] block=192.168.24.192/26 handle="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.067 [INFO][4241] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.24.195/26] handle="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.067 [INFO][4241] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:43:18.092333 containerd[1510]: 2025-03-25 01:43:18.067 [INFO][4241] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.24.195/26] IPv6=[] ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" HandleID="k8s-pod-network.e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Workload="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" Mar 25 01:43:18.093052 containerd[1510]: 2025-03-25 01:43:18.069 [INFO][4230] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"58f0f3d1-dfb1-49eb-96d2-b92341d12cba", ResourceVersion:"663", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"", Pod:"coredns-6f6b679f8f-gdgtb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.24.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali288a0000ae9", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:18.093052 containerd[1510]: 2025-03-25 01:43:18.070 [INFO][4230] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.24.195/32] ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" Mar 25 01:43:18.093052 containerd[1510]: 2025-03-25 01:43:18.070 [INFO][4230] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali288a0000ae9 ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" Mar 25 01:43:18.093052 containerd[1510]: 2025-03-25 01:43:18.075 [INFO][4230] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" Mar 25 01:43:18.093052 containerd[1510]: 2025-03-25 01:43:18.075 [INFO][4230] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"58f0f3d1-dfb1-49eb-96d2-b92341d12cba", ResourceVersion:"663", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a", Pod:"coredns-6f6b679f8f-gdgtb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.24.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali288a0000ae9", MAC:"ae:ac:bd:23:28:6d", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:18.093052 containerd[1510]: 2025-03-25 01:43:18.090 [INFO][4230] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" Namespace="kube-system" Pod="coredns-6f6b679f8f-gdgtb" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-coredns--6f6b679f8f--gdgtb-eth0" Mar 25 01:43:18.142292 containerd[1510]: time="2025-03-25T01:43:18.141848890Z" level=info msg="connecting to shim e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a" address="unix:///run/containerd/s/8a98535d5897d144daa818328bdf707e92cf0376f7910a32bea4987ffd36bd2e" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:43:18.173949 systemd[1]: Started cri-containerd-e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a.scope - libcontainer container e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a. 
Mar 25 01:43:18.235188 containerd[1510]: time="2025-03-25T01:43:18.235142110Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-gdgtb,Uid:58f0f3d1-dfb1-49eb-96d2-b92341d12cba,Namespace:kube-system,Attempt:0,} returns sandbox id \"e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a\"" Mar 25 01:43:18.238783 containerd[1510]: time="2025-03-25T01:43:18.238432477Z" level=info msg="CreateContainer within sandbox \"e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 25 01:43:18.263002 containerd[1510]: time="2025-03-25T01:43:18.262352955Z" level=info msg="Container b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:18.264347 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3483776728.mount: Deactivated successfully. Mar 25 01:43:18.274458 containerd[1510]: time="2025-03-25T01:43:18.273342874Z" level=info msg="CreateContainer within sandbox \"e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e\"" Mar 25 01:43:18.274458 containerd[1510]: time="2025-03-25T01:43:18.274285463Z" level=info msg="StartContainer for \"b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e\"" Mar 25 01:43:18.275911 containerd[1510]: time="2025-03-25T01:43:18.275886956Z" level=info msg="connecting to shim b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e" address="unix:///run/containerd/s/8a98535d5897d144daa818328bdf707e92cf0376f7910a32bea4987ffd36bd2e" protocol=ttrpc version=3 Mar 25 01:43:18.294425 systemd[1]: Started cri-containerd-b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e.scope - libcontainer container b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e. 
Mar 25 01:43:18.327712 containerd[1510]: time="2025-03-25T01:43:18.327559855Z" level=info msg="StartContainer for \"b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e\" returns successfully" Mar 25 01:43:18.382502 systemd-networkd[1415]: cali132a044fa0d: Gained IPv6LL Mar 25 01:43:18.912486 containerd[1510]: time="2025-03-25T01:43:18.912401796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-ztc7d,Uid:f3ec862d-50d5-495c-8b60-3b6369bda216,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:43:18.913457 containerd[1510]: time="2025-03-25T01:43:18.912780089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c89bd67d9-shfkm,Uid:0bc14d37-3e76-4377-aa16-ba03020be65d,Namespace:calico-system,Attempt:0,}" Mar 25 01:43:19.080847 systemd-networkd[1415]: calie69d2d59c8a: Link UP Mar 25 01:43:19.083022 systemd-networkd[1415]: calie69d2d59c8a: Gained carrier Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:18.993 [INFO][4345] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0 calico-apiserver-f6c47cd8c- calico-apiserver f3ec862d-50d5-495c-8b60-3b6369bda216 670 0 2025-03-25 01:42:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f6c47cd8c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-2-22d395eace calico-apiserver-f6c47cd8c-ztc7d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie69d2d59c8a [] []}} ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-" Mar 25 
01:43:19.107012 containerd[1510]: 2025-03-25 01:43:18.993 [INFO][4345] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.031 [INFO][4376] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" HandleID="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.041 [INFO][4376] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" HandleID="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000319750), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-2-22d395eace", "pod":"calico-apiserver-f6c47cd8c-ztc7d", "timestamp":"2025-03-25 01:43:19.030981361 +0000 UTC"}, Hostname:"ci-4284-0-0-2-22d395eace", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.041 [INFO][4376] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.041 [INFO][4376] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.041 [INFO][4376] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-2-22d395eace' Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.043 [INFO][4376] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.048 [INFO][4376] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.055 [INFO][4376] ipam/ipam.go 489: Trying affinity for 192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.057 [INFO][4376] ipam/ipam.go 155: Attempting to load block cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.059 [INFO][4376] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.059 [INFO][4376] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.24.192/26 handle="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.061 [INFO][4376] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.066 [INFO][4376] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.24.192/26 handle="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.072 [INFO][4376] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.24.196/26] block=192.168.24.192/26 handle="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.072 [INFO][4376] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.24.196/26] handle="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.072 [INFO][4376] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:43:19.107012 containerd[1510]: 2025-03-25 01:43:19.072 [INFO][4376] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.24.196/26] IPv6=[] ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" HandleID="k8s-pod-network.f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" Mar 25 01:43:19.108601 containerd[1510]: 2025-03-25 01:43:19.075 [INFO][4345] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0", GenerateName:"calico-apiserver-f6c47cd8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"f3ec862d-50d5-495c-8b60-3b6369bda216", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c47cd8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"", Pod:"calico-apiserver-f6c47cd8c-ztc7d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.24.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie69d2d59c8a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:19.108601 containerd[1510]: 2025-03-25 01:43:19.075 [INFO][4345] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.24.196/32] ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" Mar 25 01:43:19.108601 containerd[1510]: 2025-03-25 01:43:19.075 [INFO][4345] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie69d2d59c8a ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" Mar 25 01:43:19.108601 containerd[1510]: 2025-03-25 01:43:19.087 [INFO][4345] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" 
WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" Mar 25 01:43:19.108601 containerd[1510]: 2025-03-25 01:43:19.088 [INFO][4345] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0", GenerateName:"calico-apiserver-f6c47cd8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"f3ec862d-50d5-495c-8b60-3b6369bda216", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c47cd8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b", Pod:"calico-apiserver-f6c47cd8c-ztc7d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.24.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie69d2d59c8a", MAC:"8a:f7:9a:2c:be:68", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:19.108601 containerd[1510]: 2025-03-25 01:43:19.104 [INFO][4345] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-ztc7d" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--ztc7d-eth0" Mar 25 01:43:19.142182 containerd[1510]: time="2025-03-25T01:43:19.142003088Z" level=info msg="connecting to shim f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b" address="unix:///run/containerd/s/84a1e7028ff495eac058819f247ecb8aefbb0ce1a3601799f4f975513a6ae606" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:43:19.183809 systemd[1]: Started cri-containerd-f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b.scope - libcontainer container f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b. Mar 25 01:43:19.189484 kubelet[2808]: I0325 01:43:19.189331 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-gdgtb" podStartSLOduration=36.189309661 podStartE2EDuration="36.189309661s" podCreationTimestamp="2025-03-25 01:42:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-25 01:43:19.162303265 +0000 UTC m=+41.409344624" watchObservedRunningTime="2025-03-25 01:43:19.189309661 +0000 UTC m=+41.436351010" Mar 25 01:43:19.233371 systemd-networkd[1415]: cali1607072ffc3: Link UP Mar 25 01:43:19.233638 systemd-networkd[1415]: cali1607072ffc3: Gained carrier Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:18.982 [INFO][4349] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0 
calico-kube-controllers-6c89bd67d9- calico-system 0bc14d37-3e76-4377-aa16-ba03020be65d 669 0 2025-03-25 01:42:50 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6c89bd67d9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-2-22d395eace calico-kube-controllers-6c89bd67d9-shfkm eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali1607072ffc3 [] []}} ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:18.982 [INFO][4349] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.048 [INFO][4374] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" HandleID="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.056 [INFO][4374] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" HandleID="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" 
Workload="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031bc70), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-2-22d395eace", "pod":"calico-kube-controllers-6c89bd67d9-shfkm", "timestamp":"2025-03-25 01:43:19.048435735 +0000 UTC"}, Hostname:"ci-4284-0-0-2-22d395eace", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.056 [INFO][4374] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.072 [INFO][4374] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.072 [INFO][4374] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-2-22d395eace' Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.145 [INFO][4374] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.165 [INFO][4374] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.191 [INFO][4374] ipam/ipam.go 489: Trying affinity for 192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.196 [INFO][4374] ipam/ipam.go 155: Attempting to load block cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.200 [INFO][4374] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.24.192/26 
host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.200 [INFO][4374] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.24.192/26 handle="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.206 [INFO][4374] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.215 [INFO][4374] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.24.192/26 handle="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.225 [INFO][4374] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.24.197/26] block=192.168.24.192/26 handle="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.225 [INFO][4374] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.24.197/26] handle="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.225 [INFO][4374] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 25 01:43:19.254370 containerd[1510]: 2025-03-25 01:43:19.225 [INFO][4374] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.24.197/26] IPv6=[] ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" HandleID="k8s-pod-network.a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" Mar 25 01:43:19.256458 containerd[1510]: 2025-03-25 01:43:19.228 [INFO][4349] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0", GenerateName:"calico-kube-controllers-6c89bd67d9-", Namespace:"calico-system", SelfLink:"", UID:"0bc14d37-3e76-4377-aa16-ba03020be65d", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c89bd67d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"", Pod:"calico-kube-controllers-6c89bd67d9-shfkm", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.24.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1607072ffc3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:19.256458 containerd[1510]: 2025-03-25 01:43:19.228 [INFO][4349] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.24.197/32] ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" Mar 25 01:43:19.256458 containerd[1510]: 2025-03-25 01:43:19.228 [INFO][4349] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1607072ffc3 ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" Mar 25 01:43:19.256458 containerd[1510]: 2025-03-25 01:43:19.231 [INFO][4349] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" Mar 25 01:43:19.256458 containerd[1510]: 2025-03-25 01:43:19.231 [INFO][4349] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0", GenerateName:"calico-kube-controllers-6c89bd67d9-", Namespace:"calico-system", SelfLink:"", UID:"0bc14d37-3e76-4377-aa16-ba03020be65d", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6c89bd67d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f", Pod:"calico-kube-controllers-6c89bd67d9-shfkm", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.24.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali1607072ffc3", MAC:"72:e8:2c:3d:a4:9a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:19.256458 containerd[1510]: 2025-03-25 01:43:19.251 [INFO][4349] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" Namespace="calico-system" Pod="calico-kube-controllers-6c89bd67d9-shfkm" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--kube--controllers--6c89bd67d9--shfkm-eth0" Mar 25 
01:43:19.293912 containerd[1510]: time="2025-03-25T01:43:19.293868470Z" level=info msg="connecting to shim a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f" address="unix:///run/containerd/s/02140fe98eafca1a0cbb3b8af6ab8a94c3ca22185ff75a12e6b92de8c8e79988" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:43:19.320243 containerd[1510]: time="2025-03-25T01:43:19.319032606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-ztc7d,Uid:f3ec862d-50d5-495c-8b60-3b6369bda216,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b\"" Mar 25 01:43:19.323827 systemd[1]: Started cri-containerd-a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f.scope - libcontainer container a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f. Mar 25 01:43:19.376250 containerd[1510]: time="2025-03-25T01:43:19.376182382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6c89bd67d9-shfkm,Uid:0bc14d37-3e76-4377-aa16-ba03020be65d,Namespace:calico-system,Attempt:0,} returns sandbox id \"a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f\"" Mar 25 01:43:19.642763 containerd[1510]: time="2025-03-25T01:43:19.642594926Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:19.643905 containerd[1510]: time="2025-03-25T01:43:19.643860204Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.2: active requests=0, bytes read=7909887" Mar 25 01:43:19.645184 containerd[1510]: time="2025-03-25T01:43:19.645121396Z" level=info msg="ImageCreate event name:\"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:19.647177 containerd[1510]: time="2025-03-25T01:43:19.647117592Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:19.647885 containerd[1510]: time="2025-03-25T01:43:19.647522453Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.2\" with image id \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:214b4eef7008808bda55ad3cc1d4a3cd8df9e0e8094dff213fa3241104eb892c\", size \"9402991\" in 2.352919501s" Mar 25 01:43:19.647885 containerd[1510]: time="2025-03-25T01:43:19.647547690Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.2\" returns image reference \"sha256:0fae09f861e350c042fe0db9ce9f8cc5ac4df975a5c4e4a9ddc3c6fac1552a9a\"" Mar 25 01:43:19.649258 containerd[1510]: time="2025-03-25T01:43:19.649229652Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:43:19.650100 containerd[1510]: time="2025-03-25T01:43:19.650067668Z" level=info msg="CreateContainer within sandbox \"0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 25 01:43:19.665570 containerd[1510]: time="2025-03-25T01:43:19.665525035Z" level=info msg="Container dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:19.687199 containerd[1510]: time="2025-03-25T01:43:19.687134861Z" level=info msg="CreateContainer within sandbox \"0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c\"" Mar 25 01:43:19.689247 containerd[1510]: time="2025-03-25T01:43:19.687704340Z" level=info msg="StartContainer for \"dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c\"" Mar 
25 01:43:19.693516 containerd[1510]: time="2025-03-25T01:43:19.692423300Z" level=info msg="connecting to shim dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c" address="unix:///run/containerd/s/740550c7aceeec1c17f4dfa7c586d4356f547d4913c4e50a1c41e146c2d77a65" protocol=ttrpc version=3 Mar 25 01:43:19.716472 systemd[1]: Started cri-containerd-dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c.scope - libcontainer container dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c. Mar 25 01:43:19.774231 containerd[1510]: time="2025-03-25T01:43:19.773448419Z" level=info msg="StartContainer for \"dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c\" returns successfully" Mar 25 01:43:19.915019 containerd[1510]: time="2025-03-25T01:43:19.914806755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-596j7,Uid:44a52fe5-6229-44f0-a7aa-4765242ee6fa,Namespace:calico-apiserver,Attempt:0,}" Mar 25 01:43:20.111187 systemd-networkd[1415]: cali288a0000ae9: Gained IPv6LL Mar 25 01:43:20.166292 systemd-networkd[1415]: calib0d64f1fee9: Link UP Mar 25 01:43:20.167543 systemd-networkd[1415]: calib0d64f1fee9: Gained carrier Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:19.993 [INFO][4546] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0 calico-apiserver-f6c47cd8c- calico-apiserver 44a52fe5-6229-44f0-a7aa-4765242ee6fa 671 0 2025-03-25 01:42:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f6c47cd8c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-2-22d395eace calico-apiserver-f6c47cd8c-596j7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] 
calib0d64f1fee9 [] []}} ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:19.993 [INFO][4546] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.051 [INFO][4561] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" HandleID="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.066 [INFO][4561] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" HandleID="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290f30), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-2-22d395eace", "pod":"calico-apiserver-f6c47cd8c-596j7", "timestamp":"2025-03-25 01:43:20.051644327 +0000 UTC"}, Hostname:"ci-4284-0-0-2-22d395eace", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.066 
[INFO][4561] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.066 [INFO][4561] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.066 [INFO][4561] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-2-22d395eace' Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.070 [INFO][4561] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.076 [INFO][4561] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.087 [INFO][4561] ipam/ipam.go 489: Trying affinity for 192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.092 [INFO][4561] ipam/ipam.go 155: Attempting to load block cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.100 [INFO][4561] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.24.192/26 host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.100 [INFO][4561] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.24.192/26 handle="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.109 [INFO][4561] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552 Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.126 [INFO][4561] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.24.192/26 
handle="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.154 [INFO][4561] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.24.198/26] block=192.168.24.192/26 handle="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.155 [INFO][4561] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.24.198/26] handle="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" host="ci-4284-0-0-2-22d395eace" Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.155 [INFO][4561] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 25 01:43:20.184430 containerd[1510]: 2025-03-25 01:43:20.155 [INFO][4561] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.24.198/26] IPv6=[] ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" HandleID="k8s-pod-network.e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Workload="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" Mar 25 01:43:20.186510 containerd[1510]: 2025-03-25 01:43:20.159 [INFO][4546] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0", GenerateName:"calico-apiserver-f6c47cd8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"44a52fe5-6229-44f0-a7aa-4765242ee6fa", ResourceVersion:"671", Generation:0, 
CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c47cd8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"", Pod:"calico-apiserver-f6c47cd8c-596j7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.24.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib0d64f1fee9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:20.186510 containerd[1510]: 2025-03-25 01:43:20.159 [INFO][4546] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.24.198/32] ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" Mar 25 01:43:20.186510 containerd[1510]: 2025-03-25 01:43:20.160 [INFO][4546] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib0d64f1fee9 ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" Mar 25 01:43:20.186510 containerd[1510]: 2025-03-25 01:43:20.167 [INFO][4546] cni-plugin/dataplane_linux.go 508: 
Disabling IPv4 forwarding ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" Mar 25 01:43:20.186510 containerd[1510]: 2025-03-25 01:43:20.167 [INFO][4546] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0", GenerateName:"calico-apiserver-f6c47cd8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"44a52fe5-6229-44f0-a7aa-4765242ee6fa", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.March, 25, 1, 42, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f6c47cd8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-2-22d395eace", ContainerID:"e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552", Pod:"calico-apiserver-f6c47cd8c-596j7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.24.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib0d64f1fee9", MAC:"e6:0f:57:0f:3d:56", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 25 01:43:20.186510 containerd[1510]: 2025-03-25 01:43:20.180 [INFO][4546] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" Namespace="calico-apiserver" Pod="calico-apiserver-f6c47cd8c-596j7" WorkloadEndpoint="ci--4284--0--0--2--22d395eace-k8s-calico--apiserver--f6c47cd8c--596j7-eth0" Mar 25 01:43:20.226582 containerd[1510]: time="2025-03-25T01:43:20.226482010Z" level=info msg="connecting to shim e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552" address="unix:///run/containerd/s/6933782791b2453d631d577fa4da8fff0e7228ba77c530062900aa031c640f47" namespace=k8s.io protocol=ttrpc version=3 Mar 25 01:43:20.255453 systemd[1]: Started cri-containerd-e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552.scope - libcontainer container e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552. 
Mar 25 01:43:20.302047 containerd[1510]: time="2025-03-25T01:43:20.301987158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f6c47cd8c-596j7,Uid:44a52fe5-6229-44f0-a7aa-4765242ee6fa,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552\"" Mar 25 01:43:20.303478 systemd-networkd[1415]: calie69d2d59c8a: Gained IPv6LL Mar 25 01:43:20.350879 kubelet[2808]: I0325 01:43:20.349597 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:43:20.583546 containerd[1510]: time="2025-03-25T01:43:20.583492404Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"9e752ea979f1b634038ba94e7d9f5fff91c4b6a773e1fe51248fc3dbc5b4d9d7\" pid:4636 exited_at:{seconds:1742867000 nanos:583094866}" Mar 25 01:43:20.679900 containerd[1510]: time="2025-03-25T01:43:20.679844102Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"a3b808e3ef0a854557c2009c5bc4cddcc665061aadf972d08bc700f60d533f46\" pid:4660 exited_at:{seconds:1742867000 nanos:679465689}" Mar 25 01:43:20.815304 systemd-networkd[1415]: cali1607072ffc3: Gained IPv6LL Mar 25 01:43:22.098116 systemd-networkd[1415]: calib0d64f1fee9: Gained IPv6LL Mar 25 01:43:23.139627 systemd[1]: Started sshd@10-37.27.205.216:22-47.237.21.27:59120.service - OpenSSH per-connection server daemon (47.237.21.27:59120). 
Mar 25 01:43:23.244203 containerd[1510]: time="2025-03-25T01:43:23.244137880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:23.245819 containerd[1510]: time="2025-03-25T01:43:23.245765344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=42993204" Mar 25 01:43:23.248492 containerd[1510]: time="2025-03-25T01:43:23.248460030Z" level=info msg="ImageCreate event name:\"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:23.257155 containerd[1510]: time="2025-03-25T01:43:23.255708627Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:23.257155 containerd[1510]: time="2025-03-25T01:43:23.256581810Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 3.607202739s" Mar 25 01:43:23.257155 containerd[1510]: time="2025-03-25T01:43:23.256622014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 25 01:43:23.258688 containerd[1510]: time="2025-03-25T01:43:23.258647737Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\"" Mar 25 01:43:23.261207 containerd[1510]: time="2025-03-25T01:43:23.261149665Z" level=info msg="CreateContainer within sandbox 
\"f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:43:23.275511 containerd[1510]: time="2025-03-25T01:43:23.274820294Z" level=info msg="Container 4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:23.286383 containerd[1510]: time="2025-03-25T01:43:23.286348090Z" level=info msg="CreateContainer within sandbox \"f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4\"" Mar 25 01:43:23.291286 containerd[1510]: time="2025-03-25T01:43:23.286892262Z" level=info msg="StartContainer for \"4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4\"" Mar 25 01:43:23.291286 containerd[1510]: time="2025-03-25T01:43:23.287643016Z" level=info msg="connecting to shim 4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4" address="unix:///run/containerd/s/84a1e7028ff495eac058819f247ecb8aefbb0ce1a3601799f4f975513a6ae606" protocol=ttrpc version=3 Mar 25 01:43:23.314453 systemd[1]: Started cri-containerd-4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4.scope - libcontainer container 4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4. 
Mar 25 01:43:23.375732 containerd[1510]: time="2025-03-25T01:43:23.375588407Z" level=info msg="StartContainer for \"4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4\" returns successfully" Mar 25 01:43:24.197585 kubelet[2808]: I0325 01:43:24.197332 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f6c47cd8c-ztc7d" podStartSLOduration=30.262121428 podStartE2EDuration="34.197311073s" podCreationTimestamp="2025-03-25 01:42:50 +0000 UTC" firstStartedPulling="2025-03-25 01:43:19.32331709 +0000 UTC m=+41.570358439" lastFinishedPulling="2025-03-25 01:43:23.258506735 +0000 UTC m=+45.505548084" observedRunningTime="2025-03-25 01:43:24.196405159 +0000 UTC m=+46.443446549" watchObservedRunningTime="2025-03-25 01:43:24.197311073 +0000 UTC m=+46.444352442" Mar 25 01:43:25.187880 kubelet[2808]: I0325 01:43:25.187802 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 25 01:43:26.709972 containerd[1510]: time="2025-03-25T01:43:26.709398811Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:26.710961 containerd[1510]: time="2025-03-25T01:43:26.710433284Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.2: active requests=0, bytes read=34792912" Mar 25 01:43:26.713066 containerd[1510]: time="2025-03-25T01:43:26.712080336Z" level=info msg="ImageCreate event name:\"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:26.714653 containerd[1510]: time="2025-03-25T01:43:26.714544085Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:26.715109 containerd[1510]: 
time="2025-03-25T01:43:26.715069372Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" with image id \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:6d1f392b747f912366ec5c60ee1130952c2c07e8ce24c53480187daa0e3364aa\", size \"36285984\" in 3.45628468s" Mar 25 01:43:26.715155 containerd[1510]: time="2025-03-25T01:43:26.715113103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.2\" returns image reference \"sha256:f6a228558381bc7de7c5296ac6c4e903cfda929899c85806367a726ef6d7ff5f\"" Mar 25 01:43:26.716469 containerd[1510]: time="2025-03-25T01:43:26.716347668Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\"" Mar 25 01:43:26.733333 containerd[1510]: time="2025-03-25T01:43:26.733286033Z" level=info msg="CreateContainer within sandbox \"a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 25 01:43:26.744327 containerd[1510]: time="2025-03-25T01:43:26.743614732Z" level=info msg="Container db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:26.765560 containerd[1510]: time="2025-03-25T01:43:26.765476758Z" level=info msg="CreateContainer within sandbox \"a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\"" Mar 25 01:43:26.767405 containerd[1510]: time="2025-03-25T01:43:26.766393222Z" level=info msg="StartContainer for \"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\"" Mar 25 01:43:26.767863 containerd[1510]: time="2025-03-25T01:43:26.767845852Z" level=info msg="connecting to shim 
db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9" address="unix:///run/containerd/s/02140fe98eafca1a0cbb3b8af6ab8a94c3ca22185ff75a12e6b92de8c8e79988" protocol=ttrpc version=3 Mar 25 01:43:26.798186 systemd[1]: Started cri-containerd-db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9.scope - libcontainer container db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9. Mar 25 01:43:26.856929 containerd[1510]: time="2025-03-25T01:43:26.856860461Z" level=info msg="StartContainer for \"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" returns successfully" Mar 25 01:43:27.244650 kubelet[2808]: I0325 01:43:27.243599 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6c89bd67d9-shfkm" podStartSLOduration=29.905019981 podStartE2EDuration="37.243550271s" podCreationTimestamp="2025-03-25 01:42:50 +0000 UTC" firstStartedPulling="2025-03-25 01:43:19.377629719 +0000 UTC m=+41.624671068" lastFinishedPulling="2025-03-25 01:43:26.716160008 +0000 UTC m=+48.963201358" observedRunningTime="2025-03-25 01:43:27.242709037 +0000 UTC m=+49.489750457" watchObservedRunningTime="2025-03-25 01:43:27.243550271 +0000 UTC m=+49.490591650" Mar 25 01:43:27.291696 containerd[1510]: time="2025-03-25T01:43:27.291428942Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"a333b32d2ef33f443a013bffb3e28ef0480fc13e167d158033739828c4bfdbb3\" pid:4776 exited_at:{seconds:1742867007 nanos:291134154}" Mar 25 01:43:28.013963 sshd[4677]: Invalid user iannotta from 47.237.21.27 port 59120 Mar 25 01:43:29.480767 sshd[4677]: Received disconnect from 47.237.21.27 port 59120:11: Bye Bye [preauth] Mar 25 01:43:29.480767 sshd[4677]: Disconnected from invalid user iannotta 47.237.21.27 port 59120 [preauth] Mar 25 01:43:29.485614 systemd[1]: sshd@10-37.27.205.216:22-47.237.21.27:59120.service: Deactivated 
successfully. Mar 25 01:43:30.016054 containerd[1510]: time="2025-03-25T01:43:30.015995912Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:30.017641 containerd[1510]: time="2025-03-25T01:43:30.017461959Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2: active requests=0, bytes read=13986843" Mar 25 01:43:30.019003 containerd[1510]: time="2025-03-25T01:43:30.018934959Z" level=info msg="ImageCreate event name:\"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:30.021237 containerd[1510]: time="2025-03-25T01:43:30.021215801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:30.021979 containerd[1510]: time="2025-03-25T01:43:30.021657993Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" with image id \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:54ef0afa50feb3f691782e8d6df9a7f27d127a3af9bbcbd0bcdadac98e8be8e3\", size \"15479899\" in 3.305231619s" Mar 25 01:43:30.021979 containerd[1510]: time="2025-03-25T01:43:30.021688511Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.2\" returns image reference \"sha256:09a5a6ea58a48ac826468e05538c78d1378e103737124f1744efea8699fc29a8\"" Mar 25 01:43:30.023510 containerd[1510]: time="2025-03-25T01:43:30.023492946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\"" Mar 25 01:43:30.024368 containerd[1510]: time="2025-03-25T01:43:30.024162893Z" 
level=info msg="CreateContainer within sandbox \"0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 25 01:43:30.035487 containerd[1510]: time="2025-03-25T01:43:30.034549490Z" level=info msg="Container 35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:30.042683 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3260473160.mount: Deactivated successfully. Mar 25 01:43:30.049328 containerd[1510]: time="2025-03-25T01:43:30.049287111Z" level=info msg="CreateContainer within sandbox \"0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7\"" Mar 25 01:43:30.050077 containerd[1510]: time="2025-03-25T01:43:30.050044580Z" level=info msg="StartContainer for \"35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7\"" Mar 25 01:43:30.051243 containerd[1510]: time="2025-03-25T01:43:30.051209286Z" level=info msg="connecting to shim 35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7" address="unix:///run/containerd/s/740550c7aceeec1c17f4dfa7c586d4356f547d4913c4e50a1c41e146c2d77a65" protocol=ttrpc version=3 Mar 25 01:43:30.074494 systemd[1]: Started cri-containerd-35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7.scope - libcontainer container 35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7. 
Mar 25 01:43:30.124400 containerd[1510]: time="2025-03-25T01:43:30.124142696Z" level=info msg="StartContainer for \"35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7\" returns successfully" Mar 25 01:43:30.262358 kubelet[2808]: I0325 01:43:30.262254 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-j9x72" podStartSLOduration=27.533821683 podStartE2EDuration="40.262234366s" podCreationTimestamp="2025-03-25 01:42:50 +0000 UTC" firstStartedPulling="2025-03-25 01:43:17.293885631 +0000 UTC m=+39.540926980" lastFinishedPulling="2025-03-25 01:43:30.022298314 +0000 UTC m=+52.269339663" observedRunningTime="2025-03-25 01:43:30.260494659 +0000 UTC m=+52.507536009" watchObservedRunningTime="2025-03-25 01:43:30.262234366 +0000 UTC m=+52.509275725" Mar 25 01:43:30.571694 containerd[1510]: time="2025-03-25T01:43:30.571592325Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Mar 25 01:43:30.573341 containerd[1510]: time="2025-03-25T01:43:30.573290383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.2: active requests=0, bytes read=77" Mar 25 01:43:30.577111 containerd[1510]: time="2025-03-25T01:43:30.577059403Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" with image id \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:3623f5b60fad0da3387a8649371b53171a4b1226f4d989d2acad9145dc0ef56f\", size \"44486324\" in 553.299781ms" Mar 25 01:43:30.577111 containerd[1510]: time="2025-03-25T01:43:30.577096102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.2\" returns image reference \"sha256:d27fc480d1ad33921c40abef2ab6828fadf6524674fdcc622f571a5abc34ad55\"" Mar 25 01:43:30.580778 containerd[1510]: 
time="2025-03-25T01:43:30.580367115Z" level=info msg="CreateContainer within sandbox \"e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 25 01:43:30.593475 containerd[1510]: time="2025-03-25T01:43:30.593431073Z" level=info msg="Container 94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:43:30.604214 containerd[1510]: time="2025-03-25T01:43:30.604174976Z" level=info msg="CreateContainer within sandbox \"e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385\"" Mar 25 01:43:30.607076 containerd[1510]: time="2025-03-25T01:43:30.604787686Z" level=info msg="StartContainer for \"94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385\"" Mar 25 01:43:30.617540 containerd[1510]: time="2025-03-25T01:43:30.617438105Z" level=info msg="connecting to shim 94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385" address="unix:///run/containerd/s/6933782791b2453d631d577fa4da8fff0e7228ba77c530062900aa031c640f47" protocol=ttrpc version=3 Mar 25 01:43:30.643530 systemd[1]: Started cri-containerd-94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385.scope - libcontainer container 94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385. 
Mar 25 01:43:30.721048 containerd[1510]: time="2025-03-25T01:43:30.720920163Z" level=info msg="StartContainer for \"94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385\" returns successfully"
Mar 25 01:43:31.269317 kubelet[2808]: I0325 01:43:31.268788 2808 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f6c47cd8c-596j7" podStartSLOduration=30.993639831 podStartE2EDuration="41.268768524s" podCreationTimestamp="2025-03-25 01:42:50 +0000 UTC" firstStartedPulling="2025-03-25 01:43:20.303117207 +0000 UTC m=+42.550158556" lastFinishedPulling="2025-03-25 01:43:30.5782459 +0000 UTC m=+52.825287249" observedRunningTime="2025-03-25 01:43:31.267687975 +0000 UTC m=+53.514729323" watchObservedRunningTime="2025-03-25 01:43:31.268768524 +0000 UTC m=+53.515809873"
Mar 25 01:43:31.272099 kubelet[2808]: I0325 01:43:31.270875 2808 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Mar 25 01:43:31.285624 kubelet[2808]: I0325 01:43:31.285580 2808 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Mar 25 01:43:32.249464 kubelet[2808]: I0325 01:43:32.248302 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:43:38.211250 containerd[1510]: time="2025-03-25T01:43:38.211105788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"a87144dfebbd590209afbbf3e0962df147dece7d11767d34279fc36c36d97ef5\" pid:4888 exited_at:{seconds:1742867018 nanos:210794729}"
Mar 25 01:43:48.805549 kubelet[2808]: I0325 01:43:48.805314 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:43:50.462865 containerd[1510]: time="2025-03-25T01:43:50.462784533Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"7882784d830ad0685ed52c5cac4339584e9945151bd886749df9c20738bc8be8\" pid:4919 exited_at:{seconds:1742867030 nanos:461489479}"
Mar 25 01:44:00.798032 kubelet[2808]: I0325 01:44:00.797499 2808 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 25 01:44:03.545537 containerd[1510]: time="2025-03-25T01:44:03.545257980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"f492fb8baacabc57e30c5668eea5e198ebdf25c4af85e40a581e7fd44185a1f7\" pid:4953 exited_at:{seconds:1742867043 nanos:544677888}"
Mar 25 01:44:08.214495 containerd[1510]: time="2025-03-25T01:44:08.214428177Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"45df414d2d460a7b45aeb9fbcc1ea2c1aeb41d0466b948e0747b77a00a16a402\" pid:4976 exited_at:{seconds:1742867048 nanos:214149036}"
Mar 25 01:44:13.125420 update_engine[1500]: I20250325 01:44:13.125237 1500 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Mar 25 01:44:13.125420 update_engine[1500]: I20250325 01:44:13.125348 1500 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Mar 25 01:44:13.129803 update_engine[1500]: I20250325 01:44:13.129132 1500 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Mar 25 01:44:13.131643 update_engine[1500]: I20250325 01:44:13.130580 1500 omaha_request_params.cc:62] Current group set to alpha
Mar 25 01:44:13.132017 update_engine[1500]: I20250325 01:44:13.131796 1500 update_attempter.cc:499] Already updated boot flags. Skipping.
Mar 25 01:44:13.132017 update_engine[1500]: I20250325 01:44:13.131821 1500 update_attempter.cc:643] Scheduling an action processor start.
Mar 25 01:44:13.132017 update_engine[1500]: I20250325 01:44:13.131852 1500 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 25 01:44:13.132017 update_engine[1500]: I20250325 01:44:13.131912 1500 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Mar 25 01:44:13.132237 update_engine[1500]: I20250325 01:44:13.132190 1500 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 25 01:44:13.132237 update_engine[1500]: I20250325 01:44:13.132220 1500 omaha_request_action.cc:272] Request:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]:
Mar 25 01:44:13.132237 update_engine[1500]: I20250325 01:44:13.132230 1500 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:44:13.162636 locksmithd[1529]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Mar 25 01:44:13.170615 update_engine[1500]: I20250325 01:44:13.170536 1500 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:44:13.171349 update_engine[1500]: I20250325 01:44:13.171206 1500 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:44:13.172383 update_engine[1500]: E20250325 01:44:13.172178 1500 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:44:13.172510 update_engine[1500]: I20250325 01:44:13.172477 1500 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Mar 25 01:44:20.475135 containerd[1510]: time="2025-03-25T01:44:20.475080880Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"f4aed7c4f296828d50258b42ab41f14b2ac2d44f07e439a51749161183dca3eb\" pid:5001 exited_at:{seconds:1742867060 nanos:474719285}"
Mar 25 01:44:23.034370 update_engine[1500]: I20250325 01:44:23.034214 1500 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:44:23.035877 update_engine[1500]: I20250325 01:44:23.034641 1500 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:44:23.035877 update_engine[1500]: I20250325 01:44:23.035125 1500 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:44:23.035877 update_engine[1500]: E20250325 01:44:23.035588 1500 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:44:23.035877 update_engine[1500]: I20250325 01:44:23.035650 1500 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Mar 25 01:44:24.234430 systemd[1]: Started sshd@11-37.27.205.216:22-36.67.70.198:42574.service - OpenSSH per-connection server daemon (36.67.70.198:42574).
Mar 25 01:44:25.367226 sshd[5013]: Invalid user energeticos from 36.67.70.198 port 42574
Mar 25 01:44:25.579985 sshd[5013]: Received disconnect from 36.67.70.198 port 42574:11: Bye Bye [preauth]
Mar 25 01:44:25.579985 sshd[5013]: Disconnected from invalid user energeticos 36.67.70.198 port 42574 [preauth]
Mar 25 01:44:25.582759 systemd[1]: sshd@11-37.27.205.216:22-36.67.70.198:42574.service: Deactivated successfully.
Mar 25 01:44:33.032991 update_engine[1500]: I20250325 01:44:33.032863 1500 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:44:33.033655 update_engine[1500]: I20250325 01:44:33.033360 1500 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:44:33.033964 update_engine[1500]: I20250325 01:44:33.033890 1500 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:44:33.034668 update_engine[1500]: E20250325 01:44:33.034323 1500 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:44:33.034668 update_engine[1500]: I20250325 01:44:33.034437 1500 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Mar 25 01:44:38.218690 containerd[1510]: time="2025-03-25T01:44:38.218585517Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"0642c8f72570438b1e10462ca709a3759e69bf95a51d6c85a9089042d7cc31b5\" pid:5037 exited_at:{seconds:1742867078 nanos:217638028}"
Mar 25 01:44:43.038453 update_engine[1500]: I20250325 01:44:43.038336 1500 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:44:43.039082 update_engine[1500]: I20250325 01:44:43.038715 1500 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:44:43.039179 update_engine[1500]: I20250325 01:44:43.039105 1500 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:44:43.039584 update_engine[1500]: E20250325 01:44:43.039535 1500 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:44:43.039663 update_engine[1500]: I20250325 01:44:43.039630 1500 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 25 01:44:43.039663 update_engine[1500]: I20250325 01:44:43.039644 1500 omaha_request_action.cc:617] Omaha request response:
Mar 25 01:44:43.039915 update_engine[1500]: E20250325 01:44:43.039752 1500 omaha_request_action.cc:636] Omaha request network transfer failed.
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044531 1500 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044570 1500 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044581 1500 update_attempter.cc:306] Processing Done.
Mar 25 01:44:43.046162 update_engine[1500]: E20250325 01:44:43.044605 1500 update_attempter.cc:619] Update failed.
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044617 1500 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044626 1500 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044636 1500 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044750 1500 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044806 1500 omaha_request_action.cc:271] Posting an Omaha request to disabled
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044817 1500 omaha_request_action.cc:272] Request:
Mar 25 01:44:43.046162 update_engine[1500]:
Mar 25 01:44:43.046162 update_engine[1500]:
Mar 25 01:44:43.046162 update_engine[1500]:
Mar 25 01:44:43.046162 update_engine[1500]:
Mar 25 01:44:43.046162 update_engine[1500]:
Mar 25 01:44:43.046162 update_engine[1500]:
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.044827 1500 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.045068 1500 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Mar 25 01:44:43.046162 update_engine[1500]: I20250325 01:44:43.045408 1500 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Mar 25 01:44:43.049300 update_engine[1500]: E20250325 01:44:43.046365 1500 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Mar 25 01:44:43.049300 update_engine[1500]: I20250325 01:44:43.046425 1500 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Mar 25 01:44:43.049300 update_engine[1500]: I20250325 01:44:43.046437 1500 omaha_request_action.cc:617] Omaha request response:
Mar 25 01:44:43.049300 update_engine[1500]: I20250325 01:44:43.046447 1500 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 01:44:43.049300 update_engine[1500]: I20250325 01:44:43.046456 1500 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Mar 25 01:44:43.049300 update_engine[1500]: I20250325 01:44:43.046466 1500 update_attempter.cc:306] Processing Done.
Mar 25 01:44:43.049300 update_engine[1500]: I20250325 01:44:43.046475 1500 update_attempter.cc:310] Error event sent.
Mar 25 01:44:43.049300 update_engine[1500]: I20250325 01:44:43.046489 1500 update_check_scheduler.cc:74] Next update check in 47m58s
Mar 25 01:44:43.050346 locksmithd[1529]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Mar 25 01:44:43.050346 locksmithd[1529]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Mar 25 01:44:50.458101 containerd[1510]: time="2025-03-25T01:44:50.457978584Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"b5b0e6a5d29554d26feac2b98f549bab66a21d718f65354bb93c53e48ae1afb0\" pid:5075 exited_at:{seconds:1742867090 nanos:457562550}"
Mar 25 01:44:51.018719 systemd[1]: Started sshd@12-37.27.205.216:22-92.255.85.188:17196.service - OpenSSH per-connection server daemon (92.255.85.188:17196).
Mar 25 01:44:51.623337 sshd[5093]: Invalid user oracle from 92.255.85.188 port 17196
Mar 25 01:44:51.683695 sshd[5093]: Connection closed by invalid user oracle 92.255.85.188 port 17196 [preauth]
Mar 25 01:44:51.687009 systemd[1]: sshd@12-37.27.205.216:22-92.255.85.188:17196.service: Deactivated successfully.
Mar 25 01:45:03.535260 containerd[1510]: time="2025-03-25T01:45:03.535164471Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"834818eb59c386e2fc9ba7590c04dc5a21de02826753ff6e6ec772111693b6f0\" pid:5109 exited_at:{seconds:1742867103 nanos:534743049}"
Mar 25 01:45:08.224692 containerd[1510]: time="2025-03-25T01:45:08.224628518Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"ce2d2362d10b1b393fc9f3517ad0e218983c79b87ae1ccd038a0ddbae9ec8f2c\" pid:5130 exited_at:{seconds:1742867108 nanos:224059689}"
Mar 25 01:45:09.086765 systemd[1]: Started sshd@13-37.27.205.216:22-182.75.65.22:47676.service - OpenSSH per-connection server daemon (182.75.65.22:47676).
Mar 25 01:45:10.077430 sshd[5140]: Invalid user ftp from 182.75.65.22 port 47676
Mar 25 01:45:10.260462 sshd[5140]: Received disconnect from 182.75.65.22 port 47676:11: Bye Bye [preauth]
Mar 25 01:45:10.260462 sshd[5140]: Disconnected from invalid user ftp 182.75.65.22 port 47676 [preauth]
Mar 25 01:45:10.262929 systemd[1]: sshd@13-37.27.205.216:22-182.75.65.22:47676.service: Deactivated successfully.
Mar 25 01:45:10.652313 systemd[1]: Started sshd@14-37.27.205.216:22-221.221.160.224:50844.service - OpenSSH per-connection server daemon (221.221.160.224:50844).
Mar 25 01:45:20.287870 systemd[1]: Started sshd@15-37.27.205.216:22-187.110.238.50:44534.service - OpenSSH per-connection server daemon (187.110.238.50:44534).
Mar 25 01:45:20.462605 containerd[1510]: time="2025-03-25T01:45:20.462500312Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"d1266e2b72ca318a1995bf3570b91664a193a3585aac40a0006bd2ad2fe529a8\" pid:5162 exited_at:{seconds:1742867120 nanos:462026712}"
Mar 25 01:45:21.402432 sshd[5150]: Invalid user traccar from 187.110.238.50 port 44534
Mar 25 01:45:21.599645 sshd[5150]: Received disconnect from 187.110.238.50 port 44534:11: Bye Bye [preauth]
Mar 25 01:45:21.599645 sshd[5150]: Disconnected from invalid user traccar 187.110.238.50 port 44534 [preauth]
Mar 25 01:45:21.603707 systemd[1]: sshd@15-37.27.205.216:22-187.110.238.50:44534.service: Deactivated successfully.
Mar 25 01:45:38.218403 containerd[1510]: time="2025-03-25T01:45:38.217887870Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"ea4593096652db5b2bce45db4ae90328725de2b20f1ec511a4dee8a6428ec9cb\" pid:5194 exited_at:{seconds:1742867138 nanos:217424091}"
Mar 25 01:45:50.454805 containerd[1510]: time="2025-03-25T01:45:50.454719911Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"c251f0ce9f5dba8c19db303f433f8c360a7807fac6811ca64fc4a99eb9a1c7da\" pid:5220 exited_at:{seconds:1742867150 nanos:454328567}"
Mar 25 01:46:03.540649 containerd[1510]: time="2025-03-25T01:46:03.540597515Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"8ec5e9c500ed2e8d127cc0a1e64f9dfd6746fe98aa9ee9a5947ea828fdeac18c\" pid:5252 exited_at:{seconds:1742867163 nanos:539943590}"
Mar 25 01:46:08.186875 containerd[1510]: time="2025-03-25T01:46:08.186836172Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"22b55bbd2dc63716f683f39c6974bc0f0112b4ac9935236446f458fa97001db1\" pid:5273 exited_at:{seconds:1742867168 nanos:186656585}"
Mar 25 01:46:20.464886 containerd[1510]: time="2025-03-25T01:46:20.464689844Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"dbdc9d403d405425611283f48470eb9741f3e0ac82499e6d7fa3e49823653ba1\" pid:5296 exited_at:{seconds:1742867180 nanos:463803834}"
Mar 25 01:46:25.538746 systemd[1]: Started sshd@16-37.27.205.216:22-36.67.70.198:50848.service - OpenSSH per-connection server daemon (36.67.70.198:50848).
Mar 25 01:46:26.703981 sshd[5321]: Invalid user mark from 36.67.70.198 port 50848
Mar 25 01:46:26.911570 sshd[5321]: Received disconnect from 36.67.70.198 port 50848:11: Bye Bye [preauth]
Mar 25 01:46:26.911570 sshd[5321]: Disconnected from invalid user mark 36.67.70.198 port 50848 [preauth]
Mar 25 01:46:26.915171 systemd[1]: sshd@16-37.27.205.216:22-36.67.70.198:50848.service: Deactivated successfully.
Mar 25 01:46:27.349570 systemd[1]: Started sshd@17-37.27.205.216:22-182.75.65.22:45908.service - OpenSSH per-connection server daemon (182.75.65.22:45908).
Mar 25 01:46:28.353201 sshd[5332]: Invalid user admir from 182.75.65.22 port 45908
Mar 25 01:46:28.541191 sshd[5332]: Received disconnect from 182.75.65.22 port 45908:11: Bye Bye [preauth]
Mar 25 01:46:28.541191 sshd[5332]: Disconnected from invalid user admir 182.75.65.22 port 45908 [preauth]
Mar 25 01:46:28.543629 systemd[1]: sshd@17-37.27.205.216:22-182.75.65.22:45908.service: Deactivated successfully.
Mar 25 01:46:38.216115 containerd[1510]: time="2025-03-25T01:46:38.215947109Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"04a95d21179ffb1c810d61b7d0f7071a3f3872ddd7a4c294362954f1f1fa3fd0\" pid:5352 exited_at:{seconds:1742867198 nanos:215539557}"
Mar 25 01:46:45.039501 systemd[1]: Started sshd@18-37.27.205.216:22-187.110.238.50:41798.service - OpenSSH per-connection server daemon (187.110.238.50:41798).
Mar 25 01:46:46.052544 sshd[5364]: Invalid user liangdi from 187.110.238.50 port 41798
Mar 25 01:46:46.238481 sshd[5364]: Received disconnect from 187.110.238.50 port 41798:11: Bye Bye [preauth]
Mar 25 01:46:46.238481 sshd[5364]: Disconnected from invalid user liangdi 187.110.238.50 port 41798 [preauth]
Mar 25 01:46:46.240655 systemd[1]: sshd@18-37.27.205.216:22-187.110.238.50:41798.service: Deactivated successfully.
Mar 25 01:46:50.444897 containerd[1510]: time="2025-03-25T01:46:50.444836677Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"e664fc692a0c483d5ab61bec6bed602b1758427c6fa115f1690b7c5cd2229e9f\" pid:5381 exited_at:{seconds:1742867210 nanos:443808151}"
Mar 25 01:47:03.532228 containerd[1510]: time="2025-03-25T01:47:03.532174237Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"7ce255a21059a289463fb36153bd1b4a976f932b068716e3ad81e941be1ae99b\" pid:5406 exited_at:{seconds:1742867223 nanos:530814122}"
Mar 25 01:47:08.230829 containerd[1510]: time="2025-03-25T01:47:08.230754986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"e0fdc99792d379c6c859a9f54a741b3602923ba5371d88a2afdccdc8f99690ca\" pid:5427 exited_at:{seconds:1742867228 nanos:230138242}"
Mar 25 01:47:12.232347 systemd[1]: sshd@14-37.27.205.216:22-221.221.160.224:50844.service: Deactivated successfully.
Mar 25 01:47:18.904791 systemd[1]: Started sshd@19-37.27.205.216:22-139.178.68.195:47122.service - OpenSSH per-connection server daemon (139.178.68.195:47122).
Mar 25 01:47:19.921336 sshd[5443]: Accepted publickey for core from 139.178.68.195 port 47122 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:47:19.924249 sshd-session[5443]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:47:19.937029 systemd-logind[1499]: New session 8 of user core.
Mar 25 01:47:19.945586 systemd[1]: Started session-8.scope - Session 8 of User core.
Mar 25 01:47:20.427939 containerd[1510]: time="2025-03-25T01:47:20.427882037Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"8c8a5ee8112023c8401a8ef4c75f2aeb477871fba40ec8ecc63d2dc5cb487b56\" pid:5460 exited_at:{seconds:1742867240 nanos:427346504}"
Mar 25 01:47:21.273452 sshd[5445]: Connection closed by 139.178.68.195 port 47122
Mar 25 01:47:21.276502 sshd-session[5443]: pam_unix(sshd:session): session closed for user core
Mar 25 01:47:21.292263 systemd[1]: sshd@19-37.27.205.216:22-139.178.68.195:47122.service: Deactivated successfully.
Mar 25 01:47:21.296492 systemd[1]: session-8.scope: Deactivated successfully.
Mar 25 01:47:21.300643 systemd-logind[1499]: Session 8 logged out. Waiting for processes to exit.
Mar 25 01:47:21.312783 systemd-logind[1499]: Removed session 8.
Mar 25 01:47:26.440110 systemd[1]: Started sshd@20-37.27.205.216:22-139.178.68.195:48372.service - OpenSSH per-connection server daemon (139.178.68.195:48372).
Mar 25 01:47:27.455748 sshd[5484]: Accepted publickey for core from 139.178.68.195 port 48372 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:47:27.458495 sshd-session[5484]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:47:27.465607 systemd-logind[1499]: New session 9 of user core.
Mar 25 01:47:27.470406 systemd[1]: Started session-9.scope - Session 9 of User core.
Mar 25 01:47:28.273434 sshd[5487]: Connection closed by 139.178.68.195 port 48372
Mar 25 01:47:28.276494 sshd-session[5484]: pam_unix(sshd:session): session closed for user core
Mar 25 01:47:28.282179 systemd[1]: sshd@20-37.27.205.216:22-139.178.68.195:48372.service: Deactivated successfully.
Mar 25 01:47:28.284624 systemd[1]: session-9.scope: Deactivated successfully.
Mar 25 01:47:28.286006 systemd-logind[1499]: Session 9 logged out. Waiting for processes to exit.
Mar 25 01:47:28.287545 systemd-logind[1499]: Removed session 9.
Mar 25 01:47:33.179983 containerd[1510]: time="2025-03-25T01:47:33.138728037Z" level=warning msg="container event discarded" container=b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d type=CONTAINER_CREATED_EVENT
Mar 25 01:47:33.229443 containerd[1510]: time="2025-03-25T01:47:33.229346911Z" level=warning msg="container event discarded" container=b75900d672c962881b1a7d8286d5b9d12bb66ee696b940a410090750f8d3081d type=CONTAINER_STARTED_EVENT
Mar 25 01:47:33.229443 containerd[1510]: time="2025-03-25T01:47:33.229428574Z" level=warning msg="container event discarded" container=9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590 type=CONTAINER_CREATED_EVENT
Mar 25 01:47:33.229443 containerd[1510]: time="2025-03-25T01:47:33.229454272Z" level=warning msg="container event discarded" container=9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590 type=CONTAINER_STARTED_EVENT
Mar 25 01:47:33.229777 containerd[1510]: time="2025-03-25T01:47:33.229472246Z" level=warning msg="container event discarded" container=11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e type=CONTAINER_CREATED_EVENT
Mar 25 01:47:33.229777 containerd[1510]: time="2025-03-25T01:47:33.229490179Z" level=warning msg="container event discarded" container=11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e type=CONTAINER_STARTED_EVENT
Mar 25 01:47:33.229777 containerd[1510]: time="2025-03-25T01:47:33.229504416Z" level=warning msg="container event discarded" container=5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d type=CONTAINER_CREATED_EVENT
Mar 25 01:47:33.229777 containerd[1510]: time="2025-03-25T01:47:33.229517321Z" level=warning msg="container event discarded" container=605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd type=CONTAINER_CREATED_EVENT
Mar 25 01:47:33.229777 containerd[1510]: time="2025-03-25T01:47:33.229529904Z" level=warning msg="container event discarded" container=4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871 type=CONTAINER_CREATED_EVENT
Mar 25 01:47:33.302336 containerd[1510]: time="2025-03-25T01:47:33.302217572Z" level=warning msg="container event discarded" container=605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd type=CONTAINER_STARTED_EVENT
Mar 25 01:47:33.302336 containerd[1510]: time="2025-03-25T01:47:33.302316758Z" level=warning msg="container event discarded" container=5ffe9a5b2412695da7395c1cb820afbdf8e0c2fb63f7506797c96724e722284d type=CONTAINER_STARTED_EVENT
Mar 25 01:47:33.327763 containerd[1510]: time="2025-03-25T01:47:33.327659393Z" level=warning msg="container event discarded" container=4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871 type=CONTAINER_STARTED_EVENT
Mar 25 01:47:33.451911 systemd[1]: Started sshd@21-37.27.205.216:22-139.178.68.195:48380.service - OpenSSH per-connection server daemon (139.178.68.195:48380).
Mar 25 01:47:34.510639 sshd[5501]: Accepted publickey for core from 139.178.68.195 port 48380 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:47:34.515056 sshd-session[5501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:47:34.524539 systemd-logind[1499]: New session 10 of user core.
Mar 25 01:47:34.529580 systemd[1]: Started session-10.scope - Session 10 of User core.
Mar 25 01:47:35.300814 sshd[5503]: Connection closed by 139.178.68.195 port 48380
Mar 25 01:47:35.301822 sshd-session[5501]: pam_unix(sshd:session): session closed for user core
Mar 25 01:47:35.306827 systemd[1]: sshd@21-37.27.205.216:22-139.178.68.195:48380.service: Deactivated successfully.
Mar 25 01:47:35.310709 systemd[1]: session-10.scope: Deactivated successfully.
Mar 25 01:47:35.313673 systemd-logind[1499]: Session 10 logged out. Waiting for processes to exit.
Mar 25 01:47:35.316712 systemd-logind[1499]: Removed session 10.
Mar 25 01:47:35.504039 systemd[1]: Started sshd@22-37.27.205.216:22-139.178.68.195:52432.service - OpenSSH per-connection server daemon (139.178.68.195:52432).
Mar 25 01:47:36.594013 sshd[5516]: Accepted publickey for core from 139.178.68.195 port 52432 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:47:36.595946 sshd-session[5516]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:47:36.603009 systemd-logind[1499]: New session 11 of user core.
Mar 25 01:47:36.605440 systemd[1]: Started session-11.scope - Session 11 of User core.
Mar 25 01:47:37.492699 sshd[5518]: Connection closed by 139.178.68.195 port 52432
Mar 25 01:47:37.494184 sshd-session[5516]: pam_unix(sshd:session): session closed for user core
Mar 25 01:47:37.498233 systemd[1]: sshd@22-37.27.205.216:22-139.178.68.195:52432.service: Deactivated successfully.
Mar 25 01:47:37.501814 systemd[1]: session-11.scope: Deactivated successfully.
Mar 25 01:47:37.504221 systemd-logind[1499]: Session 11 logged out. Waiting for processes to exit.
Mar 25 01:47:37.506204 systemd-logind[1499]: Removed session 11.
Mar 25 01:47:37.652420 systemd[1]: Started sshd@23-37.27.205.216:22-139.178.68.195:52440.service - OpenSSH per-connection server daemon (139.178.68.195:52440).
Mar 25 01:47:38.195939 containerd[1510]: time="2025-03-25T01:47:38.195883537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"837647504b0ee6c8f6994691b818579c153c2a090d47d2a3da83859eb97b8b8a\" pid:5544 exited_at:{seconds:1742867258 nanos:195106954}"
Mar 25 01:47:38.698286 sshd[5528]: Accepted publickey for core from 139.178.68.195 port 52440 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:47:38.699798 sshd-session[5528]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:47:38.704536 systemd-logind[1499]: New session 12 of user core.
Mar 25 01:47:38.708416 systemd[1]: Started session-12.scope - Session 12 of User core.
Mar 25 01:47:39.504876 sshd[5553]: Connection closed by 139.178.68.195 port 52440
Mar 25 01:47:39.504747 sshd-session[5528]: pam_unix(sshd:session): session closed for user core
Mar 25 01:47:39.510581 systemd-logind[1499]: Session 12 logged out. Waiting for processes to exit.
Mar 25 01:47:39.511792 systemd[1]: sshd@23-37.27.205.216:22-139.178.68.195:52440.service: Deactivated successfully.
Mar 25 01:47:39.515823 systemd[1]: session-12.scope: Deactivated successfully.
Mar 25 01:47:39.517685 systemd-logind[1499]: Removed session 12.
Mar 25 01:47:44.014674 containerd[1510]: time="2025-03-25T01:47:44.014433823Z" level=warning msg="container event discarded" container=160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb type=CONTAINER_CREATED_EVENT
Mar 25 01:47:44.014674 containerd[1510]: time="2025-03-25T01:47:44.014629889Z" level=warning msg="container event discarded" container=160080cfd9231dc5066cf2b254059a21d06e7240e1ade036209bf5d7e390b3cb type=CONTAINER_STARTED_EVENT
Mar 25 01:47:44.039391 containerd[1510]: time="2025-03-25T01:47:44.039242855Z" level=warning msg="container event discarded" container=c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b type=CONTAINER_CREATED_EVENT
Mar 25 01:47:44.106049 containerd[1510]: time="2025-03-25T01:47:44.105941164Z" level=warning msg="container event discarded" container=c7e50e048cdb37ba3393de5467a0d9622d5e24bc9a99e0c5f9bb7cdc307e040b type=CONTAINER_STARTED_EVENT
Mar 25 01:47:44.193636 containerd[1510]: time="2025-03-25T01:47:44.193481933Z" level=warning msg="container event discarded" container=562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb type=CONTAINER_CREATED_EVENT
Mar 25 01:47:44.193636 containerd[1510]: time="2025-03-25T01:47:44.193594945Z" level=warning msg="container event discarded" container=562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb type=CONTAINER_STARTED_EVENT
Mar 25 01:47:44.677550 systemd[1]: Started sshd@24-37.27.205.216:22-139.178.68.195:52454.service - OpenSSH per-connection server daemon (139.178.68.195:52454).
Mar 25 01:47:45.155666 systemd[1]: Started sshd@25-37.27.205.216:22-182.75.65.22:44140.service - OpenSSH per-connection server daemon (182.75.65.22:44140).
Mar 25 01:47:45.678066 sshd[5571]: Accepted publickey for core from 139.178.68.195 port 52454 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0
Mar 25 01:47:45.680683 sshd-session[5571]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Mar 25 01:47:45.690241 systemd-logind[1499]: New session 13 of user core.
Mar 25 01:47:45.699544 systemd[1]: Started session-13.scope - Session 13 of User core.
Mar 25 01:47:46.217562 sshd[5574]: Invalid user traccar from 182.75.65.22 port 44140
Mar 25 01:47:46.413386 sshd[5574]: Received disconnect from 182.75.65.22 port 44140:11: Bye Bye [preauth]
Mar 25 01:47:46.414386 sshd[5574]: Disconnected from invalid user traccar 182.75.65.22 port 44140 [preauth]
Mar 25 01:47:46.416433 systemd[1]: Started sshd@26-37.27.205.216:22-36.67.70.198:42318.service - OpenSSH per-connection server daemon (36.67.70.198:42318).
Mar 25 01:47:46.418842 systemd[1]: sshd@25-37.27.205.216:22-182.75.65.22:44140.service: Deactivated successfully.
Mar 25 01:47:46.493424 sshd[5576]: Connection closed by 139.178.68.195 port 52454
Mar 25 01:47:46.494363 sshd-session[5571]: pam_unix(sshd:session): session closed for user core
Mar 25 01:47:46.500816 systemd[1]: sshd@24-37.27.205.216:22-139.178.68.195:52454.service: Deactivated successfully.
Mar 25 01:47:46.505342 systemd[1]: session-13.scope: Deactivated successfully.
Mar 25 01:47:46.509679 systemd-logind[1499]: Session 13 logged out. Waiting for processes to exit.
Mar 25 01:47:46.511368 systemd-logind[1499]: Removed session 13.
Mar 25 01:47:46.979648 containerd[1510]: time="2025-03-25T01:47:46.979539840Z" level=warning msg="container event discarded" container=5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847 type=CONTAINER_CREATED_EVENT Mar 25 01:47:47.034278 containerd[1510]: time="2025-03-25T01:47:47.034212073Z" level=warning msg="container event discarded" container=5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847 type=CONTAINER_STARTED_EVENT Mar 25 01:47:47.536561 sshd[5586]: Invalid user user1 from 36.67.70.198 port 42318 Mar 25 01:47:47.739085 sshd[5586]: Received disconnect from 36.67.70.198 port 42318:11: Bye Bye [preauth] Mar 25 01:47:47.739085 sshd[5586]: Disconnected from invalid user user1 36.67.70.198 port 42318 [preauth] Mar 25 01:47:47.740736 systemd[1]: sshd@26-37.27.205.216:22-36.67.70.198:42318.service: Deactivated successfully. Mar 25 01:47:50.443156 containerd[1510]: time="2025-03-25T01:47:50.443104921Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"cea418baebc21dfb573629b4d5cfedde15bcfddd0d202d1b9de46b63c62cfb83\" pid:5611 exited_at:{seconds:1742867270 nanos:442713299}" Mar 25 01:47:50.518968 containerd[1510]: time="2025-03-25T01:47:50.518859520Z" level=warning msg="container event discarded" container=ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f type=CONTAINER_CREATED_EVENT Mar 25 01:47:50.518968 containerd[1510]: time="2025-03-25T01:47:50.518938036Z" level=warning msg="container event discarded" container=ff91ea43e43a3424f25cd2362cc68a772a781392062b7ee226bf4ec33ca2866f type=CONTAINER_STARTED_EVENT Mar 25 01:47:50.630788 containerd[1510]: time="2025-03-25T01:47:50.630656390Z" level=warning msg="container event discarded" container=ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef type=CONTAINER_CREATED_EVENT Mar 25 01:47:50.630788 containerd[1510]: time="2025-03-25T01:47:50.630737642Z" level=warning 
msg="container event discarded" container=ac5ce008bbd5beec4e6952cb387d4d544b24191494433280834f716b1577a4ef type=CONTAINER_STARTED_EVENT Mar 25 01:47:51.662049 systemd[1]: Started sshd@27-37.27.205.216:22-139.178.68.195:42948.service - OpenSSH per-connection server daemon (139.178.68.195:42948). Mar 25 01:47:52.656671 sshd[5623]: Accepted publickey for core from 139.178.68.195 port 42948 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0 Mar 25 01:47:52.658525 sshd-session[5623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:47:52.667960 systemd-logind[1499]: New session 14 of user core. Mar 25 01:47:52.671618 systemd[1]: Started session-14.scope - Session 14 of User core. Mar 25 01:47:53.400188 sshd[5625]: Connection closed by 139.178.68.195 port 42948 Mar 25 01:47:53.401013 sshd-session[5623]: pam_unix(sshd:session): session closed for user core Mar 25 01:47:53.404628 systemd[1]: sshd@27-37.27.205.216:22-139.178.68.195:42948.service: Deactivated successfully. Mar 25 01:47:53.406860 systemd[1]: session-14.scope: Deactivated successfully. Mar 25 01:47:53.407817 systemd-logind[1499]: Session 14 logged out. Waiting for processes to exit. Mar 25 01:47:53.409194 systemd-logind[1499]: Removed session 14. Mar 25 01:47:53.568856 systemd[1]: Started sshd@28-37.27.205.216:22-139.178.68.195:42962.service - OpenSSH per-connection server daemon (139.178.68.195:42962). 
Mar 25 01:47:54.078939 containerd[1510]: time="2025-03-25T01:47:54.078837188Z" level=warning msg="container event discarded" container=a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c type=CONTAINER_CREATED_EVENT Mar 25 01:47:54.152588 containerd[1510]: time="2025-03-25T01:47:54.152475553Z" level=warning msg="container event discarded" container=a67aebf32d056b1e84f2ecd705e0c282ee3ac4afffd7967c7364ddc23165b85c type=CONTAINER_STARTED_EVENT Mar 25 01:47:54.573392 sshd[5637]: Accepted publickey for core from 139.178.68.195 port 42962 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0 Mar 25 01:47:54.574760 sshd-session[5637]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:47:54.581496 systemd-logind[1499]: New session 15 of user core. Mar 25 01:47:54.590500 systemd[1]: Started session-15.scope - Session 15 of User core. Mar 25 01:47:55.620406 sshd[5639]: Connection closed by 139.178.68.195 port 42962 Mar 25 01:47:55.623286 sshd-session[5637]: pam_unix(sshd:session): session closed for user core Mar 25 01:47:55.631970 systemd[1]: sshd@28-37.27.205.216:22-139.178.68.195:42962.service: Deactivated successfully. Mar 25 01:47:55.633555 systemd-logind[1499]: Session 15 logged out. Waiting for processes to exit. Mar 25 01:47:55.635510 systemd[1]: session-15.scope: Deactivated successfully. Mar 25 01:47:55.639336 systemd-logind[1499]: Removed session 15. Mar 25 01:47:55.790740 systemd[1]: Started sshd@29-37.27.205.216:22-139.178.68.195:50102.service - OpenSSH per-connection server daemon (139.178.68.195:50102). 
Mar 25 01:47:56.062586 containerd[1510]: time="2025-03-25T01:47:56.062498509Z" level=warning msg="container event discarded" container=03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c type=CONTAINER_CREATED_EVENT Mar 25 01:47:56.146022 containerd[1510]: time="2025-03-25T01:47:56.145927981Z" level=warning msg="container event discarded" container=03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c type=CONTAINER_STARTED_EVENT Mar 25 01:47:56.288146 containerd[1510]: time="2025-03-25T01:47:56.288066927Z" level=warning msg="container event discarded" container=03eda04f4b338e358b1c55990c2d0bc5962eea1602761b0b128a0670332d852c type=CONTAINER_STOPPED_EVENT Mar 25 01:47:56.832750 sshd[5649]: Accepted publickey for core from 139.178.68.195 port 50102 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0 Mar 25 01:47:56.835898 sshd-session[5649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:47:56.843835 systemd-logind[1499]: New session 16 of user core. Mar 25 01:47:56.847422 systemd[1]: Started session-16.scope - Session 16 of User core. Mar 25 01:47:59.641569 sshd[5651]: Connection closed by 139.178.68.195 port 50102 Mar 25 01:47:59.643159 sshd-session[5649]: pam_unix(sshd:session): session closed for user core Mar 25 01:47:59.646947 systemd-logind[1499]: Session 16 logged out. Waiting for processes to exit. Mar 25 01:47:59.647107 systemd[1]: sshd@29-37.27.205.216:22-139.178.68.195:50102.service: Deactivated successfully. Mar 25 01:47:59.649194 systemd[1]: session-16.scope: Deactivated successfully. Mar 25 01:47:59.649699 systemd[1]: session-16.scope: Consumed 684ms CPU time, 68.8M memory peak. Mar 25 01:47:59.651012 systemd-logind[1499]: Removed session 16. Mar 25 01:47:59.814976 systemd[1]: Started sshd@30-37.27.205.216:22-139.178.68.195:50108.service - OpenSSH per-connection server daemon (139.178.68.195:50108). 
Mar 25 01:48:00.843262 sshd[5685]: Accepted publickey for core from 139.178.68.195 port 50108 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0 Mar 25 01:48:00.845845 sshd-session[5685]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:48:00.854380 systemd-logind[1499]: New session 17 of user core. Mar 25 01:48:00.863615 systemd[1]: Started session-17.scope - Session 17 of User core. Mar 25 01:48:02.260372 sshd[5687]: Connection closed by 139.178.68.195 port 50108 Mar 25 01:48:02.261465 sshd-session[5685]: pam_unix(sshd:session): session closed for user core Mar 25 01:48:02.264581 systemd[1]: sshd@30-37.27.205.216:22-139.178.68.195:50108.service: Deactivated successfully. Mar 25 01:48:02.266861 systemd[1]: session-17.scope: Deactivated successfully. Mar 25 01:48:02.268367 systemd-logind[1499]: Session 17 logged out. Waiting for processes to exit. Mar 25 01:48:02.270723 systemd-logind[1499]: Removed session 17. Mar 25 01:48:02.432950 systemd[1]: Started sshd@31-37.27.205.216:22-139.178.68.195:50118.service - OpenSSH per-connection server daemon (139.178.68.195:50118). Mar 25 01:48:02.689373 systemd[1]: Started sshd@32-37.27.205.216:22-187.110.238.50:39062.service - OpenSSH per-connection server daemon (187.110.238.50:39062). 
Mar 25 01:48:02.820749 containerd[1510]: time="2025-03-25T01:48:02.820555593Z" level=warning msg="container event discarded" container=582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0 type=CONTAINER_CREATED_EVENT Mar 25 01:48:02.971222 containerd[1510]: time="2025-03-25T01:48:02.970973396Z" level=warning msg="container event discarded" container=582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0 type=CONTAINER_STARTED_EVENT Mar 25 01:48:03.435106 sshd[5698]: Accepted publickey for core from 139.178.68.195 port 50118 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0 Mar 25 01:48:03.437570 sshd-session[5698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:48:03.445682 systemd-logind[1499]: New session 18 of user core. Mar 25 01:48:03.455595 systemd[1]: Started session-18.scope - Session 18 of User core. Mar 25 01:48:03.604787 containerd[1510]: time="2025-03-25T01:48:03.604136560Z" level=warning msg="container event discarded" container=582b8601727ae6f43513464f6abf6daab9103acce51c4e316c13b0347f058fe0 type=CONTAINER_STOPPED_EVENT Mar 25 01:48:03.682849 containerd[1510]: time="2025-03-25T01:48:03.682809477Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"6eda5762367f222eacf6128ddc01599871aa37e1c4b1afb507d2fb346df33182\" pid:5720 exited_at:{seconds:1742867283 nanos:682144173}" Mar 25 01:48:03.711707 sshd[5701]: Invalid user killer from 187.110.238.50 port 39062 Mar 25 01:48:03.898857 sshd[5701]: Received disconnect from 187.110.238.50 port 39062:11: Bye Bye [preauth] Mar 25 01:48:03.898857 sshd[5701]: Disconnected from invalid user killer 187.110.238.50 port 39062 [preauth] Mar 25 01:48:03.901022 systemd[1]: sshd@32-37.27.205.216:22-187.110.238.50:39062.service: Deactivated successfully. 
Mar 25 01:48:04.264499 sshd[5706]: Connection closed by 139.178.68.195 port 50118 Mar 25 01:48:04.266291 sshd-session[5698]: pam_unix(sshd:session): session closed for user core Mar 25 01:48:04.269986 systemd[1]: sshd@31-37.27.205.216:22-139.178.68.195:50118.service: Deactivated successfully. Mar 25 01:48:04.273592 systemd[1]: session-18.scope: Deactivated successfully. Mar 25 01:48:04.277386 systemd-logind[1499]: Session 18 logged out. Waiting for processes to exit. Mar 25 01:48:04.279690 systemd-logind[1499]: Removed session 18. Mar 25 01:48:08.300654 containerd[1510]: time="2025-03-25T01:48:08.300567227Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"6eba8eeda05bd14d871b99cd70db5cf23db346b737ff38ce83380d4605f83d03\" pid:5754 exited_at:{seconds:1742867288 nanos:299259140}" Mar 25 01:48:09.449325 systemd[1]: Started sshd@33-37.27.205.216:22-139.178.68.195:51622.service - OpenSSH per-connection server daemon (139.178.68.195:51622). Mar 25 01:48:10.519849 sshd[5764]: Accepted publickey for core from 139.178.68.195 port 51622 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0 Mar 25 01:48:10.523526 sshd-session[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:48:10.537646 systemd-logind[1499]: New session 19 of user core. Mar 25 01:48:10.544675 systemd[1]: Started session-19.scope - Session 19 of User core. Mar 25 01:48:11.352085 sshd[5766]: Connection closed by 139.178.68.195 port 51622 Mar 25 01:48:11.353212 sshd-session[5764]: pam_unix(sshd:session): session closed for user core Mar 25 01:48:11.359547 systemd[1]: sshd@33-37.27.205.216:22-139.178.68.195:51622.service: Deactivated successfully. Mar 25 01:48:11.364101 systemd[1]: session-19.scope: Deactivated successfully. Mar 25 01:48:11.366521 systemd-logind[1499]: Session 19 logged out. Waiting for processes to exit. 
Mar 25 01:48:11.368492 systemd-logind[1499]: Removed session 19. Mar 25 01:48:12.102696 containerd[1510]: time="2025-03-25T01:48:12.102439904Z" level=warning msg="container event discarded" container=60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48 type=CONTAINER_CREATED_EVENT Mar 25 01:48:12.426598 containerd[1510]: time="2025-03-25T01:48:12.426367526Z" level=warning msg="container event discarded" container=60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48 type=CONTAINER_STARTED_EVENT Mar 25 01:48:16.527585 systemd[1]: Started sshd@34-37.27.205.216:22-139.178.68.195:59672.service - OpenSSH per-connection server daemon (139.178.68.195:59672). Mar 25 01:48:16.598461 containerd[1510]: time="2025-03-25T01:48:16.598366203Z" level=warning msg="container event discarded" container=d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77 type=CONTAINER_CREATED_EVENT Mar 25 01:48:16.598461 containerd[1510]: time="2025-03-25T01:48:16.598448768Z" level=warning msg="container event discarded" container=d84225317892a4339812ef473801e9a5240054b330c77734ee9b8cf5d32c8d77 type=CONTAINER_STARTED_EVENT Mar 25 01:48:16.642759 containerd[1510]: time="2025-03-25T01:48:16.642671634Z" level=warning msg="container event discarded" container=65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f type=CONTAINER_CREATED_EVENT Mar 25 01:48:16.732172 containerd[1510]: time="2025-03-25T01:48:16.732084895Z" level=warning msg="container event discarded" container=65f1b58e60ebadeb7928ba627c521abc8a7c60af60cebca7c7f3efa98e9cb70f type=CONTAINER_STARTED_EVENT Mar 25 01:48:17.303347 containerd[1510]: time="2025-03-25T01:48:17.303105685Z" level=warning msg="container event discarded" container=0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542 type=CONTAINER_CREATED_EVENT Mar 25 01:48:17.303347 containerd[1510]: time="2025-03-25T01:48:17.303225570Z" level=warning msg="container event discarded" 
container=0f7e2369524f26a6d6083790d312b2d2c77100d15cb88a04f40f1c894306e542 type=CONTAINER_STARTED_EVENT Mar 25 01:48:17.532764 sshd[5780]: Accepted publickey for core from 139.178.68.195 port 59672 ssh2: RSA SHA256:fkhakX93KrTwVnKGj5rOvcSkG5Y68tRFKO8AS7J5lC0 Mar 25 01:48:17.536078 sshd-session[5780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Mar 25 01:48:17.547569 systemd-logind[1499]: New session 20 of user core. Mar 25 01:48:17.551678 systemd[1]: Started session-20.scope - Session 20 of User core. Mar 25 01:48:18.245863 containerd[1510]: time="2025-03-25T01:48:18.245726401Z" level=warning msg="container event discarded" container=e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a type=CONTAINER_CREATED_EVENT Mar 25 01:48:18.245863 containerd[1510]: time="2025-03-25T01:48:18.245804346Z" level=warning msg="container event discarded" container=e1e0d1c5fb3a2f164bda46257cbace035df11a1c0b3faf17a636d021779fc99a type=CONTAINER_STARTED_EVENT Mar 25 01:48:18.282860 containerd[1510]: time="2025-03-25T01:48:18.282748905Z" level=warning msg="container event discarded" container=b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e type=CONTAINER_CREATED_EVENT Mar 25 01:48:18.293914 sshd[5782]: Connection closed by 139.178.68.195 port 59672 Mar 25 01:48:18.294749 sshd-session[5780]: pam_unix(sshd:session): session closed for user core Mar 25 01:48:18.300123 systemd[1]: sshd@34-37.27.205.216:22-139.178.68.195:59672.service: Deactivated successfully. Mar 25 01:48:18.303355 systemd[1]: session-20.scope: Deactivated successfully. Mar 25 01:48:18.304367 systemd-logind[1499]: Session 20 logged out. Waiting for processes to exit. Mar 25 01:48:18.305465 systemd-logind[1499]: Removed session 20. 
Mar 25 01:48:18.336299 containerd[1510]: time="2025-03-25T01:48:18.336155461Z" level=warning msg="container event discarded" container=b51d66e187a7bfc64a26d144cb085d300a14bfef74cc8384e3038346a5cd2f5e type=CONTAINER_STARTED_EVENT Mar 25 01:48:19.330147 containerd[1510]: time="2025-03-25T01:48:19.330038851Z" level=warning msg="container event discarded" container=f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b type=CONTAINER_CREATED_EVENT Mar 25 01:48:19.330147 containerd[1510]: time="2025-03-25T01:48:19.330094406Z" level=warning msg="container event discarded" container=f638298fa8589574341426be9e3bd3a78c4b5a764c8bf79a78574a44f300960b type=CONTAINER_STARTED_EVENT Mar 25 01:48:19.386585 containerd[1510]: time="2025-03-25T01:48:19.386485655Z" level=warning msg="container event discarded" container=a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f type=CONTAINER_CREATED_EVENT Mar 25 01:48:19.386585 containerd[1510]: time="2025-03-25T01:48:19.386537603Z" level=warning msg="container event discarded" container=a9f3963214a06889610d1ba768c306af30a14d13076c27e39f54e7d073d0261f type=CONTAINER_STARTED_EVENT Mar 25 01:48:19.695819 containerd[1510]: time="2025-03-25T01:48:19.695753843Z" level=warning msg="container event discarded" container=dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c type=CONTAINER_CREATED_EVENT Mar 25 01:48:19.780686 containerd[1510]: time="2025-03-25T01:48:19.780516993Z" level=warning msg="container event discarded" container=dd5584e36142935b2b66ee735296b0eed4e1f13a90302f9147c30cc5936af19c type=CONTAINER_STARTED_EVENT Mar 25 01:48:20.312516 containerd[1510]: time="2025-03-25T01:48:20.312404347Z" level=warning msg="container event discarded" container=e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552 type=CONTAINER_CREATED_EVENT Mar 25 01:48:20.312516 containerd[1510]: time="2025-03-25T01:48:20.312479878Z" level=warning msg="container event discarded" 
container=e7d83d648f2ae7ff7c684a1e6b18cdd0c73eaba73def8883778af536557e2552 type=CONTAINER_STARTED_EVENT Mar 25 01:48:20.451637 containerd[1510]: time="2025-03-25T01:48:20.451586075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"60a361e49ee2512a46f6f281a8dd5f303f88d280ffac9fd2a107dce473f76a48\" id:\"82c88fe503c20d7ca23e4d533f9ffebcc729f813a43c69901ae27f4e0ab28501\" pid:5807 exited_at:{seconds:1742867300 nanos:451011951}" Mar 25 01:48:23.296259 containerd[1510]: time="2025-03-25T01:48:23.296166722Z" level=warning msg="container event discarded" container=4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4 type=CONTAINER_CREATED_EVENT Mar 25 01:48:23.384761 containerd[1510]: time="2025-03-25T01:48:23.384629520Z" level=warning msg="container event discarded" container=4555255cd0517e012eb84dbfba06a1b7c54d8bd603782893ea6cf790bc18c5d4 type=CONTAINER_STARTED_EVENT Mar 25 01:48:26.775388 containerd[1510]: time="2025-03-25T01:48:26.775219101Z" level=warning msg="container event discarded" container=db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9 type=CONTAINER_CREATED_EVENT Mar 25 01:48:26.866225 containerd[1510]: time="2025-03-25T01:48:26.866120099Z" level=warning msg="container event discarded" container=db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9 type=CONTAINER_STARTED_EVENT Mar 25 01:48:30.059311 containerd[1510]: time="2025-03-25T01:48:30.059167567Z" level=warning msg="container event discarded" container=35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7 type=CONTAINER_CREATED_EVENT Mar 25 01:48:30.143374 containerd[1510]: time="2025-03-25T01:48:30.143245007Z" level=warning msg="container event discarded" container=35f8fa92cecc65739bcd4eeca4474ad9cf025819e29cd92f8ffb5cc82f72e9e7 type=CONTAINER_STARTED_EVENT Mar 25 01:48:30.613989 containerd[1510]: time="2025-03-25T01:48:30.613885068Z" level=warning msg="container event discarded" 
container=94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385 type=CONTAINER_CREATED_EVENT Mar 25 01:48:30.727349 containerd[1510]: time="2025-03-25T01:48:30.727286999Z" level=warning msg="container event discarded" container=94ac75002e2175793209cca08bb4749fd8b1ace91694731428cde308baf03385 type=CONTAINER_STARTED_EVENT Mar 25 01:48:32.921675 systemd[1]: cri-containerd-5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847.scope: Deactivated successfully. Mar 25 01:48:32.922773 systemd[1]: cri-containerd-5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847.scope: Consumed 6.015s CPU time, 65.9M memory peak, 35.5M read from disk. Mar 25 01:48:32.961223 containerd[1510]: time="2025-03-25T01:48:32.961131155Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847\" id:\"5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847\" pid:3165 exit_status:1 exited_at:{seconds:1742867312 nanos:960335366}" Mar 25 01:48:32.995335 containerd[1510]: time="2025-03-25T01:48:32.992166144Z" level=info msg="received exit event container_id:\"5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847\" id:\"5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847\" pid:3165 exit_status:1 exited_at:{seconds:1742867312 nanos:960335366}" Mar 25 01:48:33.114709 systemd[1]: cri-containerd-4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871.scope: Deactivated successfully. Mar 25 01:48:33.115830 systemd[1]: cri-containerd-4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871.scope: Consumed 6.898s CPU time, 82.9M memory peak, 62.4M read from disk. 
Mar 25 01:48:33.122855 containerd[1510]: time="2025-03-25T01:48:33.122674544Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871\" id:\"4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871\" pid:2667 exit_status:1 exited_at:{seconds:1742867313 nanos:122102112}" Mar 25 01:48:33.123297 containerd[1510]: time="2025-03-25T01:48:33.123057680Z" level=info msg="received exit event container_id:\"4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871\" id:\"4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871\" pid:2667 exit_status:1 exited_at:{seconds:1742867313 nanos:122102112}" Mar 25 01:48:33.177133 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847-rootfs.mount: Deactivated successfully. Mar 25 01:48:33.225998 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871-rootfs.mount: Deactivated successfully. 
Mar 25 01:48:33.338074 kubelet[2808]: E0325 01:48:33.338008 2808 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:45668->10.0.0.2:2379: read: connection timed out" Mar 25 01:48:33.469072 kubelet[2808]: I0325 01:48:33.468673 2808 scope.go:117] "RemoveContainer" containerID="4223d5b2978251a2813abe1182ea273ba3e217179a9115ea9a86ad4f6a67f871" Mar 25 01:48:33.474335 kubelet[2808]: I0325 01:48:33.473263 2808 scope.go:117] "RemoveContainer" containerID="5e93f55ab361b57e046d59cced7c1cf4f15235f712f35eddeb09576206986847" Mar 25 01:48:33.482919 containerd[1510]: time="2025-03-25T01:48:33.482837761Z" level=info msg="CreateContainer within sandbox \"562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Mar 25 01:48:33.485031 containerd[1510]: time="2025-03-25T01:48:33.484642037Z" level=info msg="CreateContainer within sandbox \"9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Mar 25 01:48:33.567831 containerd[1510]: time="2025-03-25T01:48:33.567786990Z" level=info msg="Container 41e637a557d23cfd38c8c05d5d92039749852bb9809d574df6fca40ae890ae48: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:48:33.573683 containerd[1510]: time="2025-03-25T01:48:33.572378851Z" level=info msg="Container 88c18f47efaa7209cca44157eb293bea83232d6d1b4767ddac42d95aee89155f: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:48:33.595638 containerd[1510]: time="2025-03-25T01:48:33.595604621Z" level=info msg="CreateContainer within sandbox \"9ed62a39cd25d8b996df411704454f064fecc0a1a8bcad2b63c5130cc222e590\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"88c18f47efaa7209cca44157eb293bea83232d6d1b4767ddac42d95aee89155f\"" Mar 25 01:48:33.596780 containerd[1510]: time="2025-03-25T01:48:33.596762938Z" level=info 
msg="CreateContainer within sandbox \"562b23990bdbcbceae8c5df4d9df819eeadfae554872eb1ab18cdd95bec4f5bb\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"41e637a557d23cfd38c8c05d5d92039749852bb9809d574df6fca40ae890ae48\"" Mar 25 01:48:33.600876 containerd[1510]: time="2025-03-25T01:48:33.600823734Z" level=info msg="StartContainer for \"88c18f47efaa7209cca44157eb293bea83232d6d1b4767ddac42d95aee89155f\"" Mar 25 01:48:33.602429 containerd[1510]: time="2025-03-25T01:48:33.600827922Z" level=info msg="StartContainer for \"41e637a557d23cfd38c8c05d5d92039749852bb9809d574df6fca40ae890ae48\"" Mar 25 01:48:33.614062 containerd[1510]: time="2025-03-25T01:48:33.612819235Z" level=info msg="connecting to shim 41e637a557d23cfd38c8c05d5d92039749852bb9809d574df6fca40ae890ae48" address="unix:///run/containerd/s/2fc1518a171544630245ecb0f5d571d71f69d21f9bb42d6e5843b484985370d5" protocol=ttrpc version=3 Mar 25 01:48:33.618230 containerd[1510]: time="2025-03-25T01:48:33.618169264Z" level=info msg="connecting to shim 88c18f47efaa7209cca44157eb293bea83232d6d1b4767ddac42d95aee89155f" address="unix:///run/containerd/s/97b96248d1e7f896cc9eba9b3517fc395f67a0e1dab544540d79c7d6b457a50e" protocol=ttrpc version=3 Mar 25 01:48:33.634411 systemd[1]: Started cri-containerd-41e637a557d23cfd38c8c05d5d92039749852bb9809d574df6fca40ae890ae48.scope - libcontainer container 41e637a557d23cfd38c8c05d5d92039749852bb9809d574df6fca40ae890ae48. Mar 25 01:48:33.644500 systemd[1]: Started cri-containerd-88c18f47efaa7209cca44157eb293bea83232d6d1b4767ddac42d95aee89155f.scope - libcontainer container 88c18f47efaa7209cca44157eb293bea83232d6d1b4767ddac42d95aee89155f. 
Mar 25 01:48:33.702703 containerd[1510]: time="2025-03-25T01:48:33.702560108Z" level=info msg="StartContainer for \"41e637a557d23cfd38c8c05d5d92039749852bb9809d574df6fca40ae890ae48\" returns successfully" Mar 25 01:48:33.704207 containerd[1510]: time="2025-03-25T01:48:33.703437820Z" level=info msg="StartContainer for \"88c18f47efaa7209cca44157eb293bea83232d6d1b4767ddac42d95aee89155f\" returns successfully" Mar 25 01:48:38.227617 containerd[1510]: time="2025-03-25T01:48:38.227549186Z" level=info msg="TaskExit event in podsandbox handler container_id:\"db2b888672fab16c0a7cda0009690ee16758eca5e22b4e6f315b4c5bd055c4f9\" id:\"be35a72bf0a354a3ff4e797735f0bc76c79c41f701bf5e40cd4c240cd7d12d71\" pid:5922 exit_status:1 exited_at:{seconds:1742867318 nanos:227114513}" Mar 25 01:48:38.333419 kubelet[2808]: E0325 01:48:38.325044 2808 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:45514->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4284-0-0-2-22d395eace.182fe8951718cb04 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4284-0-0-2-22d395eace,UID:197df92524af76987bea640189ae4987,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-2-22d395eace,},FirstTimestamp:2025-03-25 01:48:27.850943236 +0000 UTC m=+350.097984655,LastTimestamp:2025-03-25 01:48:27.850943236 +0000 UTC m=+350.097984655,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-2-22d395eace,}" Mar 25 01:48:39.579079 systemd[1]: cri-containerd-605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd.scope: Deactivated 
successfully. Mar 25 01:48:39.580556 systemd[1]: cri-containerd-605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd.scope: Consumed 2.319s CPU time, 41.4M memory peak, 34.7M read from disk. Mar 25 01:48:39.586705 containerd[1510]: time="2025-03-25T01:48:39.586628979Z" level=info msg="received exit event container_id:\"605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd\" id:\"605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd\" pid:2660 exit_status:1 exited_at:{seconds:1742867319 nanos:586168198}" Mar 25 01:48:39.587480 containerd[1510]: time="2025-03-25T01:48:39.587424288Z" level=info msg="TaskExit event in podsandbox handler container_id:\"605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd\" id:\"605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd\" pid:2660 exit_status:1 exited_at:{seconds:1742867319 nanos:586168198}" Mar 25 01:48:39.669132 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd-rootfs.mount: Deactivated successfully. 
Mar 25 01:48:40.498619 kubelet[2808]: I0325 01:48:40.498582 2808 scope.go:117] "RemoveContainer" containerID="605c443f23fe4a316c1c2d2d537317a1eb6192d489d5597b46df202973e005dd" Mar 25 01:48:40.500692 containerd[1510]: time="2025-03-25T01:48:40.500646517Z" level=info msg="CreateContainer within sandbox \"11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Mar 25 01:48:40.515784 containerd[1510]: time="2025-03-25T01:48:40.514887038Z" level=info msg="Container 73fee50ba6c198500b6989fe9db10a80641f33f19ae66b535851668f8499836d: CDI devices from CRI Config.CDIDevices: []" Mar 25 01:48:40.526296 containerd[1510]: time="2025-03-25T01:48:40.526233935Z" level=info msg="CreateContainer within sandbox \"11aa8d9457861f7f8ba9791f55a3f65d4b34ff7670c942c7cbe0602bfb15d63e\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"73fee50ba6c198500b6989fe9db10a80641f33f19ae66b535851668f8499836d\"" Mar 25 01:48:40.527093 containerd[1510]: time="2025-03-25T01:48:40.526829749Z" level=info msg="StartContainer for \"73fee50ba6c198500b6989fe9db10a80641f33f19ae66b535851668f8499836d\"" Mar 25 01:48:40.528592 containerd[1510]: time="2025-03-25T01:48:40.528545308Z" level=info msg="connecting to shim 73fee50ba6c198500b6989fe9db10a80641f33f19ae66b535851668f8499836d" address="unix:///run/containerd/s/b5c881daa7c290a37da4bc8c9ba91a462b44c04c6d2e1c328bc1f31529897a52" protocol=ttrpc version=3 Mar 25 01:48:40.553435 systemd[1]: Started cri-containerd-73fee50ba6c198500b6989fe9db10a80641f33f19ae66b535851668f8499836d.scope - libcontainer container 73fee50ba6c198500b6989fe9db10a80641f33f19ae66b535851668f8499836d. Mar 25 01:48:40.624498 containerd[1510]: time="2025-03-25T01:48:40.624239896Z" level=info msg="StartContainer for \"73fee50ba6c198500b6989fe9db10a80641f33f19ae66b535851668f8499836d\" returns successfully"