May 15 12:30:39.794124 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu May 15 10:42:41 -00 2025
May 15 12:30:39.794145 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c
May 15 12:30:39.794152 kernel: BIOS-provided physical RAM map:
May 15 12:30:39.794158 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
May 15 12:30:39.794162 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
May 15 12:30:39.794167 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
May 15 12:30:39.794174 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
May 15 12:30:39.794179 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
May 15 12:30:39.794184 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
May 15 12:30:39.794188 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
May 15 12:30:39.794193 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 15 12:30:39.794198 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
May 15 12:30:39.794202 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 15 12:30:39.794207 kernel: NX (Execute Disable) protection: active
May 15 12:30:39.794215 kernel: APIC: Static calls initialized
May 15 12:30:39.794220 kernel: SMBIOS 3.0.0 present.
May 15 12:30:39.794225 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
May 15 12:30:39.794230 kernel: DMI: Memory slots populated: 1/1
May 15 12:30:39.794235 kernel: Hypervisor detected: KVM
May 15 12:30:39.794240 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 15 12:30:39.794245 kernel: kvm-clock: using sched offset of 4571997542 cycles
May 15 12:30:39.794251 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 15 12:30:39.794258 kernel: tsc: Detected 2445.404 MHz processor
May 15 12:30:39.794263 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 15 12:30:39.794269 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 15 12:30:39.794274 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
May 15 12:30:39.794279 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
May 15 12:30:39.794285 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 15 12:30:39.794290 kernel: Using GB pages for direct mapping
May 15 12:30:39.794295 kernel: ACPI: Early table checksum verification disabled
May 15 12:30:39.794300 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
May 15 12:30:39.794307 kernel: ACPI: RSDT 0x000000007CFE265D 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 15 12:30:39.794312 kernel: ACPI: FACP 0x000000007CFE244D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 15 12:30:39.794317 kernel: ACPI: DSDT 0x000000007CFE0040 00240D (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 15 12:30:39.794322 kernel: ACPI: FACS 0x000000007CFE0000 000040
May 15 12:30:39.794328 kernel: ACPI: APIC 0x000000007CFE2541 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 15 12:30:39.794333 kernel: ACPI: HPET 0x000000007CFE25C1 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 15 12:30:39.794338 kernel: ACPI: MCFG 0x000000007CFE25F9 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 15 12:30:39.794343 kernel: ACPI: WAET 0x000000007CFE2635 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 15 12:30:39.794349 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe244d-0x7cfe2540]
May 15 12:30:39.794357 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe244c]
May 15 12:30:39.794362 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
May 15 12:30:39.794368 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2541-0x7cfe25c0]
May 15 12:30:39.794373 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25c1-0x7cfe25f8]
May 15 12:30:39.794379 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe25f9-0x7cfe2634]
May 15 12:30:39.794386 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe2635-0x7cfe265c]
May 15 12:30:39.794391 kernel: No NUMA configuration found
May 15 12:30:39.794397 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
May 15 12:30:39.794402 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
May 15 12:30:39.794408 kernel: Zone ranges:
May 15 12:30:39.794413 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 15 12:30:39.794419 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
May 15 12:30:39.794424 kernel: Normal empty
May 15 12:30:39.794430 kernel: Device empty
May 15 12:30:39.794436 kernel: Movable zone start for each node
May 15 12:30:39.794442 kernel: Early memory node ranges
May 15 12:30:39.794447 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
May 15 12:30:39.794453 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
May 15 12:30:39.794458 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
May 15 12:30:39.794464 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 15 12:30:39.794469 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
May 15 12:30:39.794475 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
May 15 12:30:39.794480 kernel: ACPI: PM-Timer IO Port: 0x608
May 15 12:30:39.794486 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 15 12:30:39.794493 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 15 12:30:39.794498 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 15 12:30:39.794504 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 15 12:30:39.794509 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 15 12:30:39.794515 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 15 12:30:39.794520 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 15 12:30:39.794526 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 15 12:30:39.794531 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 15 12:30:39.794537 kernel: CPU topo: Max. logical packages: 1
May 15 12:30:39.794543 kernel: CPU topo: Max. logical dies: 1
May 15 12:30:39.794548 kernel: CPU topo: Max. dies per package: 1
May 15 12:30:39.794554 kernel: CPU topo: Max. threads per core: 1
May 15 12:30:39.794559 kernel: CPU topo: Num. cores per package: 2
May 15 12:30:39.794565 kernel: CPU topo: Num. threads per package: 2
May 15 12:30:39.794570 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
May 15 12:30:39.794576 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 15 12:30:39.794581 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
May 15 12:30:39.794586 kernel: Booting paravirtualized kernel on KVM
May 15 12:30:39.794594 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 15 12:30:39.794599 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
May 15 12:30:39.794605 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
May 15 12:30:39.794610 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
May 15 12:30:39.794616 kernel: pcpu-alloc: [0] 0 1
May 15 12:30:39.794621 kernel: kvm-guest: PV spinlocks disabled, no host support
May 15 12:30:39.794628 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c
May 15 12:30:39.794634 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 15 12:30:39.794640 kernel: random: crng init done
May 15 12:30:39.794646 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 15 12:30:39.794652 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
May 15 12:30:39.794657 kernel: Fallback order for Node 0: 0
May 15 12:30:39.794663 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
May 15 12:30:39.794668 kernel: Policy zone: DMA32
May 15 12:30:39.794673 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 15 12:30:39.794679 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 15 12:30:39.794685 kernel: ftrace: allocating 40065 entries in 157 pages
May 15 12:30:39.794691 kernel: ftrace: allocated 157 pages with 5 groups
May 15 12:30:39.794696 kernel: Dynamic Preempt: voluntary
May 15 12:30:39.794702 kernel: rcu: Preemptible hierarchical RCU implementation.
May 15 12:30:39.794708 kernel: rcu: RCU event tracing is enabled.
May 15 12:30:39.794714 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 15 12:30:39.794720 kernel: Trampoline variant of Tasks RCU enabled.
May 15 12:30:39.794725 kernel: Rude variant of Tasks RCU enabled.
May 15 12:30:39.794731 kernel: Tracing variant of Tasks RCU enabled.
May 15 12:30:39.794736 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 15 12:30:39.794742 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 15 12:30:39.794749 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 15 12:30:39.796625 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 15 12:30:39.796633 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 15 12:30:39.796639 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
May 15 12:30:39.796644 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 15 12:30:39.796655 kernel: Console: colour VGA+ 80x25
May 15 12:30:39.796660 kernel: printk: legacy console [tty0] enabled
May 15 12:30:39.796666 kernel: printk: legacy console [ttyS0] enabled
May 15 12:30:39.796672 kernel: ACPI: Core revision 20240827
May 15 12:30:39.796685 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 15 12:30:39.796691 kernel: APIC: Switch to symmetric I/O mode setup
May 15 12:30:39.796698 kernel: x2apic enabled
May 15 12:30:39.796704 kernel: APIC: Switched APIC routing to: physical x2apic
May 15 12:30:39.796709 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 15 12:30:39.796715 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns
May 15 12:30:39.796721 kernel: Calibrating delay loop (skipped) preset value.. 4890.80 BogoMIPS (lpj=2445404)
May 15 12:30:39.796727 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 15 12:30:39.796733 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
May 15 12:30:39.796740 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
May 15 12:30:39.796746 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 15 12:30:39.796763 kernel: Spectre V2 : Mitigation: Retpolines
May 15 12:30:39.796769 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
May 15 12:30:39.796774 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
May 15 12:30:39.796780 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
May 15 12:30:39.796786 kernel: RETBleed: Mitigation: untrained return thunk
May 15 12:30:39.796794 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 15 12:30:39.796799 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 15 12:30:39.796805 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 15 12:30:39.796811 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 15 12:30:39.796817 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 15 12:30:39.796823 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 15 12:30:39.796829 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 15 12:30:39.796834 kernel: Freeing SMP alternatives memory: 32K
May 15 12:30:39.796840 kernel: pid_max: default: 32768 minimum: 301
May 15 12:30:39.796847 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 15 12:30:39.796853 kernel: landlock: Up and running.
May 15 12:30:39.796858 kernel: SELinux: Initializing.
May 15 12:30:39.796864 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 15 12:30:39.796870 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 15 12:30:39.796876 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
May 15 12:30:39.796882 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
May 15 12:30:39.796888 kernel: ... version: 0
May 15 12:30:39.796893 kernel: ... bit width: 48
May 15 12:30:39.796900 kernel: ... generic registers: 6
May 15 12:30:39.796906 kernel: ... value mask: 0000ffffffffffff
May 15 12:30:39.796912 kernel: ... max period: 00007fffffffffff
May 15 12:30:39.796917 kernel: ... fixed-purpose events: 0
May 15 12:30:39.796923 kernel: ... event mask: 000000000000003f
May 15 12:30:39.796929 kernel: signal: max sigframe size: 1776
May 15 12:30:39.796935 kernel: rcu: Hierarchical SRCU implementation.
May 15 12:30:39.796941 kernel: rcu: Max phase no-delay instances is 400.
May 15 12:30:39.796947 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 15 12:30:39.796954 kernel: smp: Bringing up secondary CPUs ...
May 15 12:30:39.796960 kernel: smpboot: x86: Booting SMP configuration:
May 15 12:30:39.796965 kernel: .... node #0, CPUs: #1
May 15 12:30:39.796971 kernel: smp: Brought up 1 node, 2 CPUs
May 15 12:30:39.796977 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS)
May 15 12:30:39.796984 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54416K init, 2544K bss, 125140K reserved, 0K cma-reserved)
May 15 12:30:39.796989 kernel: devtmpfs: initialized
May 15 12:30:39.796995 kernel: x86/mm: Memory block size: 128MB
May 15 12:30:39.797001 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 15 12:30:39.797008 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 15 12:30:39.797014 kernel: pinctrl core: initialized pinctrl subsystem
May 15 12:30:39.797020 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 15 12:30:39.797026 kernel: audit: initializing netlink subsys (disabled)
May 15 12:30:39.797032 kernel: audit: type=2000 audit(1747312237.224:1): state=initialized audit_enabled=0 res=1
May 15 12:30:39.797037 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 15 12:30:39.797043 kernel: thermal_sys: Registered thermal governor 'user_space'
May 15 12:30:39.797049 kernel: cpuidle: using governor menu
May 15 12:30:39.797054 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 15 12:30:39.797061 kernel: dca service started, version 1.12.1
May 15 12:30:39.797067 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
May 15 12:30:39.797073 kernel: PCI: Using configuration type 1 for base access
May 15 12:30:39.797079 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 15 12:30:39.797085 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 15 12:30:39.797090 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 15 12:30:39.797096 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 15 12:30:39.797102 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 15 12:30:39.797117 kernel: ACPI: Added _OSI(Module Device)
May 15 12:30:39.797124 kernel: ACPI: Added _OSI(Processor Device)
May 15 12:30:39.797130 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 15 12:30:39.797135 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 15 12:30:39.797141 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 15 12:30:39.797147 kernel: ACPI: Interpreter enabled
May 15 12:30:39.797153 kernel: ACPI: PM: (supports S0 S5)
May 15 12:30:39.797158 kernel: ACPI: Using IOAPIC for interrupt routing
May 15 12:30:39.797164 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 15 12:30:39.797170 kernel: PCI: Using E820 reservations for host bridge windows
May 15 12:30:39.797177 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 15 12:30:39.797183 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 15 12:30:39.797290 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 15 12:30:39.797384 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
May 15 12:30:39.797444 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
May 15 12:30:39.797453 kernel: PCI host bridge to bus 0000:00
May 15 12:30:39.797521 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 15 12:30:39.797618 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 15 12:30:39.797697 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 15 12:30:39.798640 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
May 15 12:30:39.798703 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
May 15 12:30:39.798790 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
May 15 12:30:39.799859 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 15 12:30:39.799943 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
May 15 12:30:39.800023 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
May 15 12:30:39.800086 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
May 15 12:30:39.800160 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
May 15 12:30:39.800219 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
May 15 12:30:39.800277 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
May 15 12:30:39.800334 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 15 12:30:39.800406 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.800465 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
May 15 12:30:39.800521 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
May 15 12:30:39.800577 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
May 15 12:30:39.800633 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
May 15 12:30:39.800700 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.803012 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
May 15 12:30:39.803097 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
May 15 12:30:39.803172 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
May 15 12:30:39.803231 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
May 15 12:30:39.803296 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.803355 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
May 15 12:30:39.803413 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
May 15 12:30:39.803469 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
May 15 12:30:39.803529 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
May 15 12:30:39.803592 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.803650 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
May 15 12:30:39.803705 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
May 15 12:30:39.803789 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
May 15 12:30:39.803852 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
May 15 12:30:39.803919 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.803983 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
May 15 12:30:39.804039 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
May 15 12:30:39.804096 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
May 15 12:30:39.804166 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
May 15 12:30:39.804230 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.804288 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
May 15 12:30:39.804345 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
May 15 12:30:39.804405 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
May 15 12:30:39.804462 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
May 15 12:30:39.804525 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.804582 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
May 15 12:30:39.804638 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
May 15 12:30:39.804693 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
May 15 12:30:39.804766 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
May 15 12:30:39.804907 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.804969 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
May 15 12:30:39.805027 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
May 15 12:30:39.805084 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
May 15 12:30:39.805153 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
May 15 12:30:39.805219 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 15 12:30:39.805282 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
May 15 12:30:39.805339 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
May 15 12:30:39.805396 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
May 15 12:30:39.805452 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
May 15 12:30:39.805515 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
May 15 12:30:39.806779 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 15 12:30:39.806873 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
May 15 12:30:39.806940 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
May 15 12:30:39.807000 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
May 15 12:30:39.807064 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
May 15 12:30:39.807138 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
May 15 12:30:39.807211 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
May 15 12:30:39.807271 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
May 15 12:30:39.807330 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
May 15 12:30:39.807393 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
May 15 12:30:39.807452 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
May 15 12:30:39.807518 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
May 15 12:30:39.807577 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
May 15 12:30:39.807633 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
May 15 12:30:39.807698 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
May 15 12:30:39.808806 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
May 15 12:30:39.808883 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
May 15 12:30:39.808944 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
May 15 12:30:39.809014 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
May 15 12:30:39.809078 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
May 15 12:30:39.809148 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
May 15 12:30:39.809218 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
May 15 12:30:39.809284 kernel: pci 0000:05:00.0: BAR 1 [mem 0xfe000000-0xfe000fff]
May 15 12:30:39.809343 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
May 15 12:30:39.809400 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
May 15 12:30:39.809467 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
May 15 12:30:39.809527 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
May 15 12:30:39.809586 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
May 15 12:30:39.809643 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
May 15 12:30:39.809654 kernel: acpiphp: Slot [0] registered
May 15 12:30:39.809720 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
May 15 12:30:39.810676 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
May 15 12:30:39.810748 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
May 15 12:30:39.810844 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
May 15 12:30:39.810904 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
May 15 12:30:39.810913 kernel: acpiphp: Slot [0-2] registered
May 15 12:30:39.810974 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
May 15 12:30:39.810983 kernel: acpiphp: Slot [0-3] registered
May 15 12:30:39.811039 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
May 15 12:30:39.811047 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 15 12:30:39.811054 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 15 12:30:39.811060 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 15 12:30:39.811066 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 15 12:30:39.811072 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 15 12:30:39.811078 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 15 12:30:39.811086 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 15 12:30:39.811092 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 15 12:30:39.811098 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 15 12:30:39.811104 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 15 12:30:39.811124 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 15 12:30:39.811130 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 15 12:30:39.811136 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 15 12:30:39.811142 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 15 12:30:39.811147 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 15 12:30:39.811155 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 15 12:30:39.811161 kernel: iommu: Default domain type: Translated
May 15 12:30:39.811167 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 15 12:30:39.811173 kernel: PCI: Using ACPI for IRQ routing
May 15 12:30:39.811179 kernel: PCI: pci_cache_line_size set to 64 bytes
May 15 12:30:39.811185 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
May 15 12:30:39.811191 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
May 15 12:30:39.811252 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 15 12:30:39.811333 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 15 12:30:39.811394 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 15 12:30:39.811402 kernel: vgaarb: loaded
May 15 12:30:39.811409 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 15 12:30:39.811415 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 15 12:30:39.811422 kernel: clocksource: Switched to clocksource kvm-clock
May 15 12:30:39.811428 kernel: VFS: Disk quotas dquot_6.6.0
May 15 12:30:39.811434 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 15 12:30:39.811440 kernel: pnp: PnP ACPI init
May 15 12:30:39.811510 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
May 15 12:30:39.811520 kernel: pnp: PnP ACPI: found 5 devices
May 15 12:30:39.811527 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 15 12:30:39.811533 kernel: NET: Registered PF_INET protocol family
May 15 12:30:39.811539 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 15 12:30:39.811545 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
May 15 12:30:39.811551 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 15 12:30:39.811557 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 15 12:30:39.811565 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
May 15 12:30:39.811571 kernel: TCP: Hash tables configured (established 16384 bind 16384)
May 15 12:30:39.811577 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 15 12:30:39.811583 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 15 12:30:39.811589 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 15 12:30:39.811595 kernel: NET: Registered PF_XDP protocol family
May 15 12:30:39.811653 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
May 15 12:30:39.811711 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
May 15 12:30:39.811798 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
May 15 12:30:39.811865 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
May 15 12:30:39.811934 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
May 15 12:30:39.811995 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
May 15 12:30:39.812053 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
May 15 12:30:39.812121 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
May 15 12:30:39.812181 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
May 15 12:30:39.812239 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
May 15 12:30:39.812297 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
May 15 12:30:39.812356 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
May 15 12:30:39.813604 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
May 15 12:30:39.813675 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
May 15 12:30:39.813737 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
May 15 12:30:39.813825 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
May 15 12:30:39.813888 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
May 15 12:30:39.813954 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
May 15 12:30:39.814016 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
May 15 12:30:39.814078 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
May 15 12:30:39.814149 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
May 15 12:30:39.814208 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
May 15 12:30:39.814266 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
May 15 12:30:39.814323 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
May 15 12:30:39.814379 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
May 15 12:30:39.814436 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
May 15 12:30:39.814497 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
May 15 12:30:39.814622 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
May 15 12:30:39.814698 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
May 15 12:30:39.814785 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
May 15 12:30:39.814885 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
May 15 12:30:39.814972 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
May 15 12:30:39.815033 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
May 15 12:30:39.815095 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
May 15 12:30:39.815168 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
May 15 12:30:39.815227 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
May 15 12:30:39.815283 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 15 12:30:39.815333 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 15 12:30:39.815383 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 15 12:30:39.815433 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window]
May 15 12:30:39.815482 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
May 15 12:30:39.815536 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
May 15 12:30:39.815596 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff]
May 15 12:30:39.815650 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref]
May 15 12:30:39.815711 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff]
May 15 12:30:39.816029 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
May 15 12:30:39.816103 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff]
May 15 12:30:39.816181 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
May 15 12:30:39.816246 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff]
May 15 12:30:39.816302 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
May 15 12:30:39.816361 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff]
May 15 12:30:39.816417 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
May 15 12:30:39.816478 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff]
May 15 12:30:39.816532 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
May 15 12:30:39.816594 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff]
May 15 12:30:39.816648 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff]
May 15 12:30:39.816700 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
May 15 12:30:39.816781 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff]
May 15 12:30:39.816841 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff]
May 15 12:30:39.816899 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
May 15 12:30:39.817008 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff]
May 15 12:30:39.817097 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff]
May 15 12:30:39.817165 kernel: pci_bus 0000:09: resource 2 [mem 
0xfc000000-0xfc1fffff 64bit pref] May 15 12:30:39.817176 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 May 15 12:30:39.817183 kernel: PCI: CLS 0 bytes, default 64 May 15 12:30:39.817190 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns May 15 12:30:39.817196 kernel: Initialise system trusted keyrings May 15 12:30:39.817202 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 May 15 12:30:39.817211 kernel: Key type asymmetric registered May 15 12:30:39.817218 kernel: Asymmetric key parser 'x509' registered May 15 12:30:39.817224 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 15 12:30:39.817230 kernel: io scheduler mq-deadline registered May 15 12:30:39.817236 kernel: io scheduler kyber registered May 15 12:30:39.817243 kernel: io scheduler bfq registered May 15 12:30:39.817303 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 May 15 12:30:39.817362 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 May 15 12:30:39.817421 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 May 15 12:30:39.817483 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 May 15 12:30:39.817543 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 May 15 12:30:39.817600 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 May 15 12:30:39.817656 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 May 15 12:30:39.817713 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 May 15 12:30:39.817819 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 May 15 12:30:39.817882 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 May 15 12:30:39.817940 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 May 15 12:30:39.818001 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 May 15 12:30:39.818059 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 May 15 12:30:39.818128 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 May 15 
12:30:39.818187 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 May 15 12:30:39.818244 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 May 15 12:30:39.818253 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 May 15 12:30:39.818307 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 May 15 12:30:39.818367 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 May 15 12:30:39.818377 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 15 12:30:39.818384 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 May 15 12:30:39.818390 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 15 12:30:39.818397 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 15 12:30:39.818403 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 15 12:30:39.818410 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 15 12:30:39.818416 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 15 12:30:39.818424 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 15 12:30:39.818484 kernel: rtc_cmos 00:03: RTC can wake from S4 May 15 12:30:39.818538 kernel: rtc_cmos 00:03: registered as rtc0 May 15 12:30:39.818589 kernel: rtc_cmos 00:03: setting system clock to 2025-05-15T12:30:39 UTC (1747312239) May 15 12:30:39.818640 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs May 15 12:30:39.818649 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 15 12:30:39.818656 kernel: NET: Registered PF_INET6 protocol family May 15 12:30:39.818664 kernel: Segment Routing with IPv6 May 15 12:30:39.818671 kernel: In-situ OAM (IOAM) with IPv6 May 15 12:30:39.818677 kernel: NET: Registered PF_PACKET protocol family May 15 12:30:39.818683 kernel: Key type dns_resolver registered May 15 12:30:39.818690 kernel: IPI shorthand broadcast: enabled May 15 12:30:39.818696 kernel: sched_clock: Marking stable (2914009326, 
144232332)->(3065276528, -7034870) May 15 12:30:39.818702 kernel: registered taskstats version 1 May 15 12:30:39.818709 kernel: Loading compiled-in X.509 certificates May 15 12:30:39.818715 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 05e05785144663be6df1db78301487421c4773b6' May 15 12:30:39.818723 kernel: Demotion targets for Node 0: null May 15 12:30:39.818729 kernel: Key type .fscrypt registered May 15 12:30:39.818735 kernel: Key type fscrypt-provisioning registered May 15 12:30:39.818742 kernel: ima: No TPM chip found, activating TPM-bypass! May 15 12:30:39.818748 kernel: ima: Allocated hash algorithm: sha1 May 15 12:30:39.818771 kernel: ima: No architecture policies found May 15 12:30:39.818777 kernel: clk: Disabling unused clocks May 15 12:30:39.818783 kernel: Warning: unable to open an initial console. May 15 12:30:39.818790 kernel: Freeing unused kernel image (initmem) memory: 54416K May 15 12:30:39.818798 kernel: Write protecting the kernel read-only data: 24576k May 15 12:30:39.818804 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K May 15 12:30:39.818811 kernel: Run /init as init process May 15 12:30:39.818817 kernel: with arguments: May 15 12:30:39.818823 kernel: /init May 15 12:30:39.818829 kernel: with environment: May 15 12:30:39.818836 kernel: HOME=/ May 15 12:30:39.818842 kernel: TERM=linux May 15 12:30:39.818848 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 15 12:30:39.818857 systemd[1]: Successfully made /usr/ read-only. May 15 12:30:39.818867 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 15 12:30:39.818875 systemd[1]: Detected virtualization kvm. 
May 15 12:30:39.818881 systemd[1]: Detected architecture x86-64. May 15 12:30:39.818888 systemd[1]: Running in initrd. May 15 12:30:39.818895 systemd[1]: No hostname configured, using default hostname. May 15 12:30:39.818902 systemd[1]: Hostname set to . May 15 12:30:39.818909 systemd[1]: Initializing machine ID from VM UUID. May 15 12:30:39.818916 systemd[1]: Queued start job for default target initrd.target. May 15 12:30:39.818933 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 12:30:39.818941 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 12:30:39.818972 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 15 12:30:39.818980 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 15 12:30:39.818987 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 15 12:30:39.818996 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 15 12:30:39.819004 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 15 12:30:39.819011 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 15 12:30:39.819018 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 12:30:39.819025 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 15 12:30:39.819032 systemd[1]: Reached target paths.target - Path Units. May 15 12:30:39.819038 systemd[1]: Reached target slices.target - Slice Units. May 15 12:30:39.819045 systemd[1]: Reached target swap.target - Swaps. May 15 12:30:39.819053 systemd[1]: Reached target timers.target - Timer Units. 
May 15 12:30:39.819060 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 15 12:30:39.819067 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 15 12:30:39.819074 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 15 12:30:39.819081 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 15 12:30:39.819088 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 15 12:30:39.819095 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 15 12:30:39.819101 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 15 12:30:39.819118 systemd[1]: Reached target sockets.target - Socket Units. May 15 12:30:39.819126 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 15 12:30:39.819133 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 15 12:30:39.819140 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 15 12:30:39.819147 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 15 12:30:39.819154 systemd[1]: Starting systemd-fsck-usr.service... May 15 12:30:39.819161 systemd[1]: Starting systemd-journald.service - Journal Service... May 15 12:30:39.819168 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 15 12:30:39.819175 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:30:39.819183 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 15 12:30:39.819209 systemd-journald[215]: Collecting audit messages is disabled. May 15 12:30:39.819230 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 15 12:30:39.819237 systemd[1]: Finished systemd-fsck-usr.service. 
May 15 12:30:39.819244 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 15 12:30:39.819252 systemd-journald[215]: Journal started May 15 12:30:39.819270 systemd-journald[215]: Runtime Journal (/run/log/journal/bc974141c40a4822843726a3fcc54c67) is 4.8M, max 38.6M, 33.7M free. May 15 12:30:39.793357 systemd-modules-load[217]: Inserted module 'overlay' May 15 12:30:39.857349 systemd[1]: Started systemd-journald.service - Journal Service. May 15 12:30:39.857373 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 15 12:30:39.857383 kernel: Bridge firewalling registered May 15 12:30:39.825377 systemd-modules-load[217]: Inserted module 'br_netfilter' May 15 12:30:39.857430 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 12:30:39.858453 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:30:39.859460 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 15 12:30:39.862554 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 15 12:30:39.864133 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 15 12:30:39.869283 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 15 12:30:39.872282 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 15 12:30:39.878979 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 15 12:30:39.880196 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 12:30:39.886147 systemd-tmpfiles[236]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. 
May 15 12:30:39.888267 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 12:30:39.889544 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 12:30:39.891043 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 15 12:30:39.892936 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 15 12:30:39.908204 dracut-cmdline[253]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c May 15 12:30:39.925715 systemd-resolved[254]: Positive Trust Anchors: May 15 12:30:39.926373 systemd-resolved[254]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 15 12:30:39.926401 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 15 12:30:39.928693 systemd-resolved[254]: Defaulting to hostname 'linux'. May 15 12:30:39.929440 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 15 12:30:39.931966 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. 
May 15 12:30:39.973790 kernel: SCSI subsystem initialized May 15 12:30:39.980782 kernel: Loading iSCSI transport class v2.0-870. May 15 12:30:39.989777 kernel: iscsi: registered transport (tcp) May 15 12:30:40.005920 kernel: iscsi: registered transport (qla4xxx) May 15 12:30:40.005959 kernel: QLogic iSCSI HBA Driver May 15 12:30:40.019293 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 15 12:30:40.041164 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 15 12:30:40.042843 systemd[1]: Reached target network-pre.target - Preparation for Network. May 15 12:30:40.068901 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 15 12:30:40.070256 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 15 12:30:40.117775 kernel: raid6: avx2x4 gen() 37253 MB/s May 15 12:30:40.134776 kernel: raid6: avx2x2 gen() 36673 MB/s May 15 12:30:40.151853 kernel: raid6: avx2x1 gen() 24997 MB/s May 15 12:30:40.151878 kernel: raid6: using algorithm avx2x4 gen() 37253 MB/s May 15 12:30:40.169956 kernel: raid6: .... xor() 4812 MB/s, rmw enabled May 15 12:30:40.169989 kernel: raid6: using avx2x2 recovery algorithm May 15 12:30:40.186778 kernel: xor: automatically using best checksumming function avx May 15 12:30:40.299794 kernel: Btrfs loaded, zoned=no, fsverity=no May 15 12:30:40.304486 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 15 12:30:40.306280 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 12:30:40.333445 systemd-udevd[463]: Using default interface naming scheme 'v255'. May 15 12:30:40.337151 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 12:30:40.341247 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... 
May 15 12:30:40.357894 dracut-pre-trigger[469]: rd.md=0: removing MD RAID activation May 15 12:30:40.374661 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 15 12:30:40.377380 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 15 12:30:40.421017 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 15 12:30:40.425417 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 15 12:30:40.485777 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues May 15 12:30:40.494413 kernel: scsi host0: Virtio SCSI HBA May 15 12:30:40.494518 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 May 15 12:30:40.515787 kernel: cryptd: max_cpu_qlen set to 1000 May 15 12:30:40.524954 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:30:40.527404 kernel: AES CTR mode by8 optimization enabled May 15 12:30:40.525055 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:30:40.526788 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:30:40.529731 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:30:40.571127 kernel: ACPI: bus type USB registered May 15 12:30:40.571208 kernel: usbcore: registered new interface driver usbfs May 15 12:30:40.575768 kernel: usbcore: registered new interface driver hub May 15 12:30:40.575818 kernel: usbcore: registered new device driver usb May 15 12:30:40.578776 kernel: libata version 3.00 loaded. 
May 15 12:30:40.582870 kernel: sd 0:0:0:0: Power-on or device reset occurred May 15 12:30:40.591059 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) May 15 12:30:40.591175 kernel: sd 0:0:0:0: [sda] Write Protect is off May 15 12:30:40.591254 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 May 15 12:30:40.591329 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA May 15 12:30:40.591401 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 15 12:30:40.591414 kernel: GPT:17805311 != 80003071 May 15 12:30:40.591422 kernel: GPT:Alternate GPT header not at the end of the disk. May 15 12:30:40.591429 kernel: GPT:17805311 != 80003071 May 15 12:30:40.591436 kernel: GPT: Use GNU Parted to correct GPT errors. May 15 12:30:40.591443 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 15 12:30:40.591451 kernel: sd 0:0:0:0: [sda] Attached SCSI disk May 15 12:30:40.599119 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 May 15 12:30:40.612776 kernel: ahci 0000:00:1f.2: version 3.0 May 15 12:30:40.623460 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 May 15 12:30:40.623512 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode May 15 12:30:40.625031 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) May 15 12:30:40.625205 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only May 15 12:30:40.625286 kernel: scsi host1: ahci May 15 12:30:40.625366 kernel: scsi host2: ahci May 15 12:30:40.625486 kernel: scsi host3: ahci May 15 12:30:40.625625 kernel: scsi host4: ahci May 15 12:30:40.625882 kernel: scsi host5: ahci May 15 12:30:40.625962 kernel: scsi host6: ahci May 15 12:30:40.626033 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 48 lpm-pol 0 May 15 12:30:40.626042 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 48 lpm-pol 0 May 15 12:30:40.626050 
kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 48 lpm-pol 0 May 15 12:30:40.626057 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 48 lpm-pol 0 May 15 12:30:40.626068 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 48 lpm-pol 0 May 15 12:30:40.626075 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 48 lpm-pol 0 May 15 12:30:40.682350 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. May 15 12:30:40.687477 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:30:40.696577 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. May 15 12:30:40.704941 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 15 12:30:40.711150 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. May 15 12:30:40.715181 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. May 15 12:30:40.717513 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 15 12:30:40.738071 disk-uuid[626]: Primary Header is updated. May 15 12:30:40.738071 disk-uuid[626]: Secondary Entries is updated. May 15 12:30:40.738071 disk-uuid[626]: Secondary Header is updated. 
May 15 12:30:40.758438 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 15 12:30:40.781787 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 15 12:30:40.934773 kernel: ata3: SATA link down (SStatus 0 SControl 300) May 15 12:30:40.934834 kernel: ata2: SATA link down (SStatus 0 SControl 300) May 15 12:30:40.934844 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) May 15 12:30:40.934852 kernel: ata6: SATA link down (SStatus 0 SControl 300) May 15 12:30:40.936795 kernel: ata4: SATA link down (SStatus 0 SControl 300) May 15 12:30:40.936823 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 May 15 12:30:40.938774 kernel: ata1.00: applying bridge limits May 15 12:30:40.939788 kernel: ata5: SATA link down (SStatus 0 SControl 300) May 15 12:30:40.940776 kernel: ata1.00: configured for UDMA/100 May 15 12:30:40.941783 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 May 15 12:30:40.961130 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 15 12:30:40.973808 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 May 15 12:30:40.973953 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 May 15 12:30:40.974069 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 15 12:30:40.974176 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 May 15 12:30:40.974306 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed May 15 12:30:40.974402 kernel: hub 1-0:1.0: USB hub found May 15 12:30:40.974554 kernel: hub 1-0:1.0: 4 ports detected May 15 12:30:40.974648 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
May 15 12:30:40.974809 kernel: hub 2-0:1.0: USB hub found May 15 12:30:40.974908 kernel: hub 2-0:1.0: 4 ports detected May 15 12:30:40.981002 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray May 15 12:30:41.005566 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 15 12:30:41.005583 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 May 15 12:30:41.206847 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd May 15 12:30:41.311330 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 15 12:30:41.313005 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 15 12:30:41.314380 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 12:30:41.316343 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 15 12:30:41.320082 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 15 12:30:41.339193 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 15 12:30:41.354793 kernel: hid: raw HID events driver (C) Jiri Kosina May 15 12:30:41.362028 kernel: usbcore: registered new interface driver usbhid May 15 12:30:41.362056 kernel: usbhid: USB HID core driver May 15 12:30:41.371823 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 May 15 12:30:41.371856 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 May 15 12:30:41.777789 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 15 12:30:41.778145 disk-uuid[627]: The operation has completed successfully. May 15 12:30:41.833424 systemd[1]: disk-uuid.service: Deactivated successfully. May 15 12:30:41.833531 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 15 12:30:41.866275 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
May 15 12:30:41.885816 sh[661]: Success May 15 12:30:41.902966 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 15 12:30:41.903017 kernel: device-mapper: uevent: version 1.0.3 May 15 12:30:41.903028 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 15 12:30:41.913794 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 15 12:30:41.962881 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 15 12:30:41.966838 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 15 12:30:41.985360 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 15 12:30:41.997802 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 15 12:30:41.997854 kernel: BTRFS: device fsid 2d504097-db49-4d66-a0d5-eeb665b21004 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (673) May 15 12:30:42.004364 kernel: BTRFS info (device dm-0): first mount of filesystem 2d504097-db49-4d66-a0d5-eeb665b21004 May 15 12:30:42.004421 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 15 12:30:42.004433 kernel: BTRFS info (device dm-0): using free-space-tree May 15 12:30:42.014241 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 15 12:30:42.015220 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 15 12:30:42.016136 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 15 12:30:42.016839 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 15 12:30:42.019899 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
May 15 12:30:42.039790 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (708) May 15 12:30:42.043687 kernel: BTRFS info (device sda6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:30:42.043716 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 15 12:30:42.043730 kernel: BTRFS info (device sda6): using free-space-tree May 15 12:30:42.056779 kernel: BTRFS info (device sda6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:30:42.057362 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 15 12:30:42.059858 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 15 12:30:42.104951 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 15 12:30:42.108852 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 15 12:30:42.152545 systemd-networkd[843]: lo: Link UP May 15 12:30:42.152558 systemd-networkd[843]: lo: Gained carrier May 15 12:30:42.154563 systemd-networkd[843]: Enumeration completed May 15 12:30:42.154641 systemd[1]: Started systemd-networkd.service - Network Configuration. May 15 12:30:42.155518 systemd-networkd[843]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:30:42.155521 systemd-networkd[843]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:30:42.155789 systemd[1]: Reached target network.target - Network. May 15 12:30:42.156823 systemd-networkd[843]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:30:42.156826 systemd-networkd[843]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 15 12:30:42.157639 systemd-networkd[843]: eth0: Link UP
May 15 12:30:42.157642 systemd-networkd[843]: eth0: Gained carrier
May 15 12:30:42.157648 systemd-networkd[843]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:30:42.163143 systemd-networkd[843]: eth1: Link UP
May 15 12:30:42.163146 systemd-networkd[843]: eth1: Gained carrier
May 15 12:30:42.163154 systemd-networkd[843]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:30:42.176826 ignition[781]: Ignition 2.21.0
May 15 12:30:42.176840 ignition[781]: Stage: fetch-offline
May 15 12:30:42.176866 ignition[781]: no configs at "/usr/lib/ignition/base.d"
May 15 12:30:42.176873 ignition[781]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 15 12:30:42.178968 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 15 12:30:42.176945 ignition[781]: parsed url from cmdline: ""
May 15 12:30:42.180869 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
May 15 12:30:42.176948 ignition[781]: no config URL provided
May 15 12:30:42.176951 ignition[781]: reading system config file "/usr/lib/ignition/user.ign"
May 15 12:30:42.176956 ignition[781]: no config at "/usr/lib/ignition/user.ign"
May 15 12:30:42.176960 ignition[781]: failed to fetch config: resource requires networking
May 15 12:30:42.177324 ignition[781]: Ignition finished successfully
May 15 12:30:42.196904 systemd-networkd[843]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
May 15 12:30:42.209122 ignition[852]: Ignition 2.21.0
May 15 12:30:42.209136 ignition[852]: Stage: fetch
May 15 12:30:42.209262 ignition[852]: no configs at "/usr/lib/ignition/base.d"
May 15 12:30:42.209271 ignition[852]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 15 12:30:42.209331 ignition[852]: parsed url from cmdline: ""
May 15 12:30:42.209334 ignition[852]: no config URL provided
May 15 12:30:42.209337 ignition[852]: reading system config file "/usr/lib/ignition/user.ign"
May 15 12:30:42.209343 ignition[852]: no config at "/usr/lib/ignition/user.ign"
May 15 12:30:42.209372 ignition[852]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
May 15 12:30:42.209485 ignition[852]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
May 15 12:30:42.228810 systemd-networkd[843]: eth0: DHCPv4 address 37.27.185.109/32, gateway 172.31.1.1 acquired from 172.31.1.1
May 15 12:30:42.410333 ignition[852]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
May 15 12:30:42.417558 ignition[852]: GET result: OK
May 15 12:30:42.417638 ignition[852]: parsing config with SHA512: 7b84fcf629a41403eb3bed01f1e628a79a71a60c59437f5690364caa80ab68795c5c21b9086756f9ac542480845695a989bf9f6c6acd54120bd983583cc502b7
May 15 12:30:42.423370 unknown[852]: fetched base config from "system"
May 15 12:30:42.423381 unknown[852]: fetched base config from "system"
May 15 12:30:42.423641 ignition[852]: fetch: fetch complete
May 15 12:30:42.423385 unknown[852]: fetched user config from "hetzner"
May 15 12:30:42.423646 ignition[852]: fetch: fetch passed
May 15 12:30:42.425736 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
May 15 12:30:42.423700 ignition[852]: Ignition finished successfully
May 15 12:30:42.427262 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 15 12:30:42.470496 ignition[860]: Ignition 2.21.0
May 15 12:30:42.471059 ignition[860]: Stage: kargs
May 15 12:30:42.471238 ignition[860]: no configs at "/usr/lib/ignition/base.d"
May 15 12:30:42.471247 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 15 12:30:42.472471 ignition[860]: kargs: kargs passed
May 15 12:30:42.472513 ignition[860]: Ignition finished successfully
May 15 12:30:42.473716 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 15 12:30:42.475901 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 15 12:30:42.502684 ignition[867]: Ignition 2.21.0
May 15 12:30:42.502701 ignition[867]: Stage: disks
May 15 12:30:42.502942 ignition[867]: no configs at "/usr/lib/ignition/base.d"
May 15 12:30:42.502956 ignition[867]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 15 12:30:42.504379 ignition[867]: disks: disks passed
May 15 12:30:42.506053 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 15 12:30:42.504437 ignition[867]: Ignition finished successfully
May 15 12:30:42.507329 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 15 12:30:42.508134 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 15 12:30:42.509308 systemd[1]: Reached target local-fs.target - Local File Systems.
May 15 12:30:42.510590 systemd[1]: Reached target sysinit.target - System Initialization.
May 15 12:30:42.511894 systemd[1]: Reached target basic.target - Basic System.
May 15 12:30:42.513796 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 15 12:30:42.535321 systemd-fsck[876]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
May 15 12:30:42.536862 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 15 12:30:42.539118 systemd[1]: Mounting sysroot.mount - /sysroot...
May 15 12:30:42.652807 kernel: EXT4-fs (sda9): mounted filesystem f7dea4bd-2644-4592-b85b-330f322c4d2b r/w with ordered data mode. Quota mode: none.
May 15 12:30:42.653536 systemd[1]: Mounted sysroot.mount - /sysroot.
May 15 12:30:42.654336 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 15 12:30:42.656327 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 15 12:30:42.658812 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 15 12:30:42.660849 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
May 15 12:30:42.662286 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 15 12:30:42.663277 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 15 12:30:42.667206 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 15 12:30:42.669309 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 15 12:30:42.682777 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (884)
May 15 12:30:42.686638 kernel: BTRFS info (device sda6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:30:42.686667 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 15 12:30:42.689332 kernel: BTRFS info (device sda6): using free-space-tree
May 15 12:30:42.696889 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 15 12:30:42.715236 coreos-metadata[886]: May 15 12:30:42.715 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
May 15 12:30:42.717921 coreos-metadata[886]: May 15 12:30:42.717 INFO Fetch successful
May 15 12:30:42.719289 coreos-metadata[886]: May 15 12:30:42.718 INFO wrote hostname ci-4334-0-0-a-dce95649a9 to /sysroot/etc/hostname
May 15 12:30:42.720432 initrd-setup-root[911]: cut: /sysroot/etc/passwd: No such file or directory
May 15 12:30:42.721365 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 15 12:30:42.724737 initrd-setup-root[919]: cut: /sysroot/etc/group: No such file or directory
May 15 12:30:42.728450 initrd-setup-root[926]: cut: /sysroot/etc/shadow: No such file or directory
May 15 12:30:42.733955 initrd-setup-root[933]: cut: /sysroot/etc/gshadow: No such file or directory
May 15 12:30:42.804700 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 15 12:30:42.806269 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 15 12:30:42.807650 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 15 12:30:42.820790 kernel: BTRFS info (device sda6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:30:42.832575 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 15 12:30:42.838461 ignition[1002]: INFO : Ignition 2.21.0
May 15 12:30:42.838461 ignition[1002]: INFO : Stage: mount
May 15 12:30:42.839632 ignition[1002]: INFO : no configs at "/usr/lib/ignition/base.d"
May 15 12:30:42.839632 ignition[1002]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 15 12:30:42.839632 ignition[1002]: INFO : mount: mount passed
May 15 12:30:42.839632 ignition[1002]: INFO : Ignition finished successfully
May 15 12:30:42.840352 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 15 12:30:42.842308 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 15 12:30:42.997014 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 15 12:30:42.998375 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 15 12:30:43.024786 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (1013)
May 15 12:30:43.028044 kernel: BTRFS info (device sda6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5
May 15 12:30:43.028072 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
May 15 12:30:43.030630 kernel: BTRFS info (device sda6): using free-space-tree
May 15 12:30:43.036507 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 15 12:30:43.061974 ignition[1029]: INFO : Ignition 2.21.0
May 15 12:30:43.061974 ignition[1029]: INFO : Stage: files
May 15 12:30:43.063432 ignition[1029]: INFO : no configs at "/usr/lib/ignition/base.d"
May 15 12:30:43.063432 ignition[1029]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 15 12:30:43.063432 ignition[1029]: DEBUG : files: compiled without relabeling support, skipping
May 15 12:30:43.066350 ignition[1029]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 15 12:30:43.066350 ignition[1029]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 15 12:30:43.069366 ignition[1029]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 15 12:30:43.070253 ignition[1029]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 15 12:30:43.070253 ignition[1029]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 15 12:30:43.069804 unknown[1029]: wrote ssh authorized keys file for user: core
May 15 12:30:43.073357 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 15 12:30:43.073357 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
May 15 12:30:43.420305 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 15 12:30:43.508874 systemd-networkd[843]: eth0: Gained IPv6LL
May 15 12:30:43.636920 systemd-networkd[843]: eth1: Gained IPv6LL
May 15 12:30:45.690295 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 15 12:30:45.692247 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 15 12:30:45.699582 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 15 12:30:45.699582 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 15 12:30:45.699582 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:30:45.699582 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:30:45.699582 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:30:45.699582 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
May 15 12:30:46.439682 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 15 12:30:48.344327 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
May 15 12:30:48.345907 ignition[1029]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
May 15 12:30:48.346995 ignition[1029]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 15 12:30:48.348619 ignition[1029]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 15 12:30:48.348619 ignition[1029]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
May 15 12:30:48.348619 ignition[1029]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
May 15 12:30:48.353253 ignition[1029]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
May 15 12:30:48.353253 ignition[1029]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
May 15 12:30:48.353253 ignition[1029]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
May 15 12:30:48.353253 ignition[1029]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
May 15 12:30:48.353253 ignition[1029]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
May 15 12:30:48.353253 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
May 15 12:30:48.353253 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 15 12:30:48.353253 ignition[1029]: INFO : files: files passed
May 15 12:30:48.353253 ignition[1029]: INFO : Ignition finished successfully
May 15 12:30:48.350488 systemd[1]: Finished ignition-files.service - Ignition (files).
May 15 12:30:48.354898 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 15 12:30:48.363665 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 15 12:30:48.366743 systemd[1]: ignition-quench.service: Deactivated successfully.
May 15 12:30:48.366862 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 15 12:30:48.374783 initrd-setup-root-after-ignition[1059]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 15 12:30:48.374783 initrd-setup-root-after-ignition[1059]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 15 12:30:48.377240 initrd-setup-root-after-ignition[1063]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 15 12:30:48.378741 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 15 12:30:48.380323 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 15 12:30:48.382433 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 15 12:30:48.427049 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 15 12:30:48.427152 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 15 12:30:48.428536 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 15 12:30:48.429441 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 15 12:30:48.430451 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 15 12:30:48.431091 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 15 12:30:48.445504 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 15 12:30:48.448183 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 15 12:30:48.464257 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 15 12:30:48.465936 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 15 12:30:48.467285 systemd[1]: Stopped target timers.target - Timer Units.
May 15 12:30:48.468366 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 15 12:30:48.468601 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 15 12:30:48.470155 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 15 12:30:48.470899 systemd[1]: Stopped target basic.target - Basic System.
May 15 12:30:48.471734 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 15 12:30:48.472681 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 15 12:30:48.473907 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 15 12:30:48.475085 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 15 12:30:48.476302 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 15 12:30:48.477497 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 15 12:30:48.478670 systemd[1]: Stopped target sysinit.target - System Initialization.
May 15 12:30:48.479950 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 15 12:30:48.481115 systemd[1]: Stopped target swap.target - Swaps.
May 15 12:30:48.482041 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 15 12:30:48.482267 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 15 12:30:48.483393 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 15 12:30:48.484227 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 15 12:30:48.485367 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 15 12:30:48.485611 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 15 12:30:48.486566 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 15 12:30:48.486780 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 15 12:30:48.488237 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 15 12:30:48.488439 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 15 12:30:48.489620 systemd[1]: ignition-files.service: Deactivated successfully.
May 15 12:30:48.489814 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 15 12:30:48.490561 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
May 15 12:30:48.490770 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 15 12:30:48.496861 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 15 12:30:48.497412 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 15 12:30:48.497549 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 15 12:30:48.500850 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 15 12:30:48.503456 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 15 12:30:48.503667 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 15 12:30:48.504945 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 15 12:30:48.505041 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 15 12:30:48.509896 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 15 12:30:48.509958 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 15 12:30:48.521964 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 15 12:30:48.525114 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 15 12:30:48.525177 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 15 12:30:48.526732 ignition[1083]: INFO : Ignition 2.21.0
May 15 12:30:48.526732 ignition[1083]: INFO : Stage: umount
May 15 12:30:48.526732 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d"
May 15 12:30:48.526732 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 15 12:30:48.528994 ignition[1083]: INFO : umount: umount passed
May 15 12:30:48.528994 ignition[1083]: INFO : Ignition finished successfully
May 15 12:30:48.528494 systemd[1]: ignition-mount.service: Deactivated successfully.
May 15 12:30:48.528599 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 15 12:30:48.529907 systemd[1]: ignition-disks.service: Deactivated successfully.
May 15 12:30:48.529989 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 15 12:30:48.530889 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 15 12:30:48.530927 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 15 12:30:48.531725 systemd[1]: ignition-fetch.service: Deactivated successfully.
May 15 12:30:48.531815 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
May 15 12:30:48.532748 systemd[1]: Stopped target network.target - Network.
May 15 12:30:48.533607 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 15 12:30:48.533645 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 15 12:30:48.534570 systemd[1]: Stopped target paths.target - Path Units.
May 15 12:30:48.535450 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 15 12:30:48.538829 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 15 12:30:48.539911 systemd[1]: Stopped target slices.target - Slice Units.
May 15 12:30:48.541006 systemd[1]: Stopped target sockets.target - Socket Units.
May 15 12:30:48.541878 systemd[1]: iscsid.socket: Deactivated successfully.
May 15 12:30:48.541918 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 15 12:30:48.542884 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 15 12:30:48.542926 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 15 12:30:48.543705 systemd[1]: ignition-setup.service: Deactivated successfully.
May 15 12:30:48.543770 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 15 12:30:48.544583 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 15 12:30:48.544618 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 15 12:30:48.545427 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 15 12:30:48.545464 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 15 12:30:48.546458 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 15 12:30:48.547345 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 15 12:30:48.553268 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 15 12:30:48.553354 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 15 12:30:48.556311 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 15 12:30:48.556509 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 15 12:30:48.556590 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 15 12:30:48.558276 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 15 12:30:48.558679 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 15 12:30:48.559345 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 15 12:30:48.559371 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 15 12:30:48.561821 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 15 12:30:48.562618 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 15 12:30:48.562656 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 15 12:30:48.563562 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 15 12:30:48.563593 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 15 12:30:48.566668 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 15 12:30:48.566703 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 15 12:30:48.567357 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 15 12:30:48.567387 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 15 12:30:48.568862 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 15 12:30:48.571057 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 15 12:30:48.571117 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 15 12:30:48.574117 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 15 12:30:48.574263 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 15 12:30:48.575368 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 15 12:30:48.575449 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 15 12:30:48.576850 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 15 12:30:48.576876 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 15 12:30:48.578938 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 15 12:30:48.578974 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 15 12:30:48.580417 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 15 12:30:48.580452 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 15 12:30:48.581451 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 15 12:30:48.581488 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 15 12:30:48.583832 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 15 12:30:48.584683 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 15 12:30:48.584723 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 15 12:30:48.587866 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 15 12:30:48.587899 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 15 12:30:48.590894 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 15 12:30:48.590935 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:30:48.597565 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 15 12:30:48.597601 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 15 12:30:48.597629 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 15 12:30:48.597898 systemd[1]: network-cleanup.service: Deactivated successfully.
May 15 12:30:48.597959 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 15 12:30:48.598587 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 15 12:30:48.598640 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 15 12:30:48.600008 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 15 12:30:48.601398 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 15 12:30:48.620488 systemd[1]: Switching root.
May 15 12:30:48.655128 systemd-journald[215]: Journal stopped
May 15 12:30:49.456834 systemd-journald[215]: Received SIGTERM from PID 1 (systemd).
May 15 12:30:49.456878 kernel: SELinux: policy capability network_peer_controls=1
May 15 12:30:49.456893 kernel: SELinux: policy capability open_perms=1
May 15 12:30:49.456906 kernel: SELinux: policy capability extended_socket_class=1
May 15 12:30:49.456914 kernel: SELinux: policy capability always_check_network=0
May 15 12:30:49.456923 kernel: SELinux: policy capability cgroup_seclabel=1
May 15 12:30:49.456931 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 15 12:30:49.456939 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 15 12:30:49.456946 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 15 12:30:49.456957 kernel: SELinux: policy capability userspace_initial_context=0
May 15 12:30:49.456965 kernel: audit: type=1403 audit(1747312248.775:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 15 12:30:49.456979 systemd[1]: Successfully loaded SELinux policy in 44.625ms.
May 15 12:30:49.456995 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 14.995ms.
May 15 12:30:49.457006 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 15 12:30:49.457015 systemd[1]: Detected virtualization kvm.
May 15 12:30:49.457023 systemd[1]: Detected architecture x86-64.
May 15 12:30:49.457031 systemd[1]: Detected first boot.
May 15 12:30:49.457040 systemd[1]: Hostname set to .
May 15 12:30:49.457048 systemd[1]: Initializing machine ID from VM UUID.
May 15 12:30:49.457057 zram_generator::config[1127]: No configuration found.
May 15 12:30:49.457085 kernel: Guest personality initialized and is inactive
May 15 12:30:49.457095 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 15 12:30:49.457103 kernel: Initialized host personality
May 15 12:30:49.457111 kernel: NET: Registered PF_VSOCK protocol family
May 15 12:30:49.457119 systemd[1]: Populated /etc with preset unit settings.
May 15 12:30:49.457128 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 15 12:30:49.457136 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 15 12:30:49.457144 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 15 12:30:49.457152 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 15 12:30:49.457160 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 15 12:30:49.457170 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 15 12:30:49.457179 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 15 12:30:49.457187 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 15 12:30:49.457197 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 15 12:30:49.457205 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 15 12:30:49.457213 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 15 12:30:49.457221 systemd[1]: Created slice user.slice - User and Session Slice.
May 15 12:30:49.457234 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 15 12:30:49.457244 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 15 12:30:49.457253 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 15 12:30:49.457261 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 15 12:30:49.457270 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 15 12:30:49.457279 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 15 12:30:49.457288 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 15 12:30:49.457296 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 15 12:30:49.457304 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 15 12:30:49.457313 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 15 12:30:49.457321 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 15 12:30:49.457329 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 15 12:30:49.457338 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 15 12:30:49.457347 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 15 12:30:49.457356 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 15 12:30:49.457364 systemd[1]: Reached target slices.target - Slice Units.
May 15 12:30:49.457373 systemd[1]: Reached target swap.target - Swaps.
May 15 12:30:49.457382 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 15 12:30:49.457390 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 15 12:30:49.457398 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 15 12:30:49.457407 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 15 12:30:49.457415 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 15 12:30:49.457424 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 15 12:30:49.457433 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 15 12:30:49.457442 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 15 12:30:49.457450 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 15 12:30:49.457459 systemd[1]: Mounting media.mount - External Media Directory...
May 15 12:30:49.457470 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:30:49.457479 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 15 12:30:49.457488 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 15 12:30:49.457496 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 15 12:30:49.457504 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 15 12:30:49.457514 systemd[1]: Reached target machines.target - Containers.
May 15 12:30:49.457522 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 15 12:30:49.457531 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 15 12:30:49.457540 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 15 12:30:49.457548 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 15 12:30:49.457556 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 15 12:30:49.457565 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 15 12:30:49.457573 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 15 12:30:49.457583 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 15 12:30:49.457591 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 15 12:30:49.457600 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 15 12:30:49.457609 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 15 12:30:49.457617 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 15 12:30:49.457626 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 15 12:30:49.457634 systemd[1]: Stopped systemd-fsck-usr.service.
May 15 12:30:49.457642 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 15 12:30:49.457652 systemd[1]: Starting systemd-journald.service - Journal Service...
May 15 12:30:49.457660 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 15 12:30:49.457668 kernel: fuse: init (API version 7.41)
May 15 12:30:49.457676 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 15 12:30:49.457684 kernel: ACPI: bus type drm_connector registered
May 15 12:30:49.457692 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 15 12:30:49.457700 kernel: loop: module loaded
May 15 12:30:49.457708 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 15 12:30:49.457718 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 15 12:30:49.457727 systemd[1]: verity-setup.service: Deactivated successfully.
May 15 12:30:49.457735 systemd[1]: Stopped verity-setup.service.
May 15 12:30:49.457745 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:30:49.457878 systemd-journald[1215]: Collecting audit messages is disabled.
May 15 12:30:49.457917 systemd-journald[1215]: Journal started
May 15 12:30:49.457937 systemd-journald[1215]: Runtime Journal (/run/log/journal/bc974141c40a4822843726a3fcc54c67) is 4.8M, max 38.6M, 33.7M free.
May 15 12:30:49.214914 systemd[1]: Queued start job for default target multi-user.target.
May 15 12:30:49.460816 systemd[1]: Started systemd-journald.service - Journal Service.
May 15 12:30:49.223786 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
May 15 12:30:49.224169 systemd[1]: systemd-journald.service: Deactivated successfully.
May 15 12:30:49.461183 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 15 12:30:49.463045 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 15 12:30:49.463571 systemd[1]: Mounted media.mount - External Media Directory.
May 15 12:30:49.464137 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 15 12:30:49.464646 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 15 12:30:49.465199 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 15 12:30:49.465988 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 15 12:30:49.466782 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 15 12:30:49.469211 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 15 12:30:49.469342 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 15 12:30:49.469978 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 15 12:30:49.470108 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 15 12:30:49.471119 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 15 12:30:49.471232 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 15 12:30:49.472100 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 15 12:30:49.472925 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 15 12:30:49.473560 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 15 12:30:49.473671 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 15 12:30:49.474447 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 15 12:30:49.474909 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 15 12:30:49.475536 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 15 12:30:49.477186 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 15 12:30:49.478047 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 15 12:30:49.484515 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 15 12:30:49.488248 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 15 12:30:49.490823 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 15 12:30:49.496105 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 15 12:30:49.496584 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 15 12:30:49.496608 systemd[1]: Reached target local-fs.target - Local File Systems.
May 15 12:30:49.498956 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 15 12:30:49.503853 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 15 12:30:49.505353 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 15 12:30:49.506889 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 15 12:30:49.509477 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 15 12:30:49.510846 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 15 12:30:49.513130 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 15 12:30:49.514828 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 15 12:30:49.516869 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 15 12:30:49.520394 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 15 12:30:49.522895 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 15 12:30:49.524850 systemd-journald[1215]: Time spent on flushing to /var/log/journal/bc974141c40a4822843726a3fcc54c67 is 23.767ms for 1161 entries.
May 15 12:30:49.524850 systemd-journald[1215]: System Journal (/var/log/journal/bc974141c40a4822843726a3fcc54c67) is 8M, max 584.8M, 576.8M free.
May 15 12:30:49.563400 systemd-journald[1215]: Received client request to flush runtime journal.
May 15 12:30:49.563439 kernel: loop0: detected capacity change from 0 to 113872
May 15 12:30:49.526705 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 15 12:30:49.528963 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 15 12:30:49.530549 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 15 12:30:49.548021 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 15 12:30:49.553266 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 15 12:30:49.553856 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 15 12:30:49.556205 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 15 12:30:49.565322 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 15 12:30:49.577847 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 15 12:30:49.589541 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 15 12:30:49.597815 kernel: loop1: detected capacity change from 0 to 146240
May 15 12:30:49.603101 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 15 12:30:49.605232 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 15 12:30:49.627855 systemd-tmpfiles[1270]: ACLs are not supported, ignoring.
May 15 12:30:49.627872 systemd-tmpfiles[1270]: ACLs are not supported, ignoring.
May 15 12:30:49.632671 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 15 12:30:49.635177 kernel: loop2: detected capacity change from 0 to 8
May 15 12:30:49.650787 kernel: loop3: detected capacity change from 0 to 210664
May 15 12:30:49.687879 kernel: loop4: detected capacity change from 0 to 113872
May 15 12:30:49.703793 kernel: loop5: detected capacity change from 0 to 146240
May 15 12:30:49.725779 kernel: loop6: detected capacity change from 0 to 8
May 15 12:30:49.728883 kernel: loop7: detected capacity change from 0 to 210664
May 15 12:30:49.751160 (sd-merge)[1276]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
May 15 12:30:49.751834 (sd-merge)[1276]: Merged extensions into '/usr'.
May 15 12:30:49.755188 systemd[1]: Reload requested from client PID 1252 ('systemd-sysext') (unit systemd-sysext.service)...
May 15 12:30:49.755272 systemd[1]: Reloading...
May 15 12:30:49.830790 zram_generator::config[1302]: No configuration found.
May 15 12:30:49.943017 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 15 12:30:49.978777 ldconfig[1247]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 15 12:30:50.016997 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 15 12:30:50.017558 systemd[1]: Reloading finished in 261 ms.
May 15 12:30:50.040864 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 15 12:30:50.041688 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 15 12:30:50.050853 systemd[1]: Starting ensure-sysext.service...
May 15 12:30:50.053664 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 15 12:30:50.061938 systemd[1]: Reload requested from client PID 1345 ('systemctl') (unit ensure-sysext.service)...
May 15 12:30:50.062020 systemd[1]: Reloading...
May 15 12:30:50.084016 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 15 12:30:50.084533 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 15 12:30:50.086134 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 15 12:30:50.086371 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 15 12:30:50.086968 systemd-tmpfiles[1346]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 15 12:30:50.087236 systemd-tmpfiles[1346]: ACLs are not supported, ignoring.
May 15 12:30:50.087327 systemd-tmpfiles[1346]: ACLs are not supported, ignoring.
May 15 12:30:50.092318 systemd-tmpfiles[1346]: Detected autofs mount point /boot during canonicalization of boot.
May 15 12:30:50.092401 systemd-tmpfiles[1346]: Skipping /boot
May 15 12:30:50.103845 systemd-tmpfiles[1346]: Detected autofs mount point /boot during canonicalization of boot.
May 15 12:30:50.103957 systemd-tmpfiles[1346]: Skipping /boot
May 15 12:30:50.113775 zram_generator::config[1368]: No configuration found.
May 15 12:30:50.191313 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 15 12:30:50.259961 systemd[1]: Reloading finished in 197 ms.
May 15 12:30:50.280133 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 15 12:30:50.284374 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 15 12:30:50.289451 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 15 12:30:50.291284 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 15 12:30:50.293644 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 15 12:30:50.297108 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 15 12:30:50.302952 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 15 12:30:50.304498 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 15 12:30:50.312863 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:30:50.312991 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 15 12:30:50.316995 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 15 12:30:50.320115 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 15 12:30:50.322988 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 15 12:30:50.323526 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 15 12:30:50.323618 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 15 12:30:50.323701 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:30:50.333623 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 15 12:30:50.343735 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 15 12:30:50.345531 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 15 12:30:50.346872 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 15 12:30:50.352082 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 15 12:30:50.352090 systemd-udevd[1427]: Using default interface naming scheme 'v255'.
May 15 12:30:50.352683 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 15 12:30:50.354738 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 15 12:30:50.355277 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 15 12:30:50.362172 systemd[1]: Finished ensure-sysext.service.
May 15 12:30:50.364853 augenrules[1450]: No rules
May 15 12:30:50.366081 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:30:50.366231 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 15 12:30:50.367704 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 15 12:30:50.369725 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 15 12:30:50.371913 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 15 12:30:50.375001 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 15 12:30:50.376890 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 15 12:30:50.376926 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 15 12:30:50.378882 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 15 12:30:50.381880 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 15 12:30:50.386190 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 15 12:30:50.386622 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 15 12:30:50.386861 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 15 12:30:50.387508 systemd[1]: audit-rules.service: Deactivated successfully.
May 15 12:30:50.389986 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 15 12:30:50.390734 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 15 12:30:50.391404 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 15 12:30:50.391536 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 15 12:30:50.398131 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 15 12:30:50.398594 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 15 12:30:50.398837 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 15 12:30:50.398973 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 15 12:30:50.399641 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 15 12:30:50.399849 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 15 12:30:50.403570 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 15 12:30:50.403626 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 15 12:30:50.404906 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 15 12:30:50.405024 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 15 12:30:50.413202 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 15 12:30:50.466383 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 15 12:30:50.491977 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 15 12:30:50.548795 kernel: mousedev: PS/2 mouse device common for all mice
May 15 12:30:50.611781 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
May 15 12:30:50.632778 kernel: ACPI: button: Power Button [PWRF]
May 15 12:30:50.636377 systemd-networkd[1471]: lo: Link UP
May 15 12:30:50.636386 systemd-networkd[1471]: lo: Gained carrier
May 15 12:30:50.639929 systemd-networkd[1471]: Enumeration completed
May 15 12:30:50.640016 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 15 12:30:50.640233 systemd-networkd[1471]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:30:50.640236 systemd-networkd[1471]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 15 12:30:50.640615 systemd-networkd[1471]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:30:50.640618 systemd-networkd[1471]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
May 15 12:30:50.640935 systemd-networkd[1471]: eth0: Link UP
May 15 12:30:50.641034 systemd-networkd[1471]: eth1: Link UP
May 15 12:30:50.641167 systemd-networkd[1471]: eth0: Gained carrier
May 15 12:30:50.641179 systemd-networkd[1471]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:30:50.643211 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 15 12:30:50.645026 systemd-networkd[1471]: eth1: Gained carrier
May 15 12:30:50.645040 systemd-networkd[1471]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 15 12:30:50.647848 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 15 12:30:50.667627 systemd-resolved[1421]: Positive Trust Anchors:
May 15 12:30:50.667643 systemd-resolved[1421]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 15 12:30:50.667668 systemd-resolved[1421]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 15 12:30:50.668329 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 15 12:30:50.668361 systemd-networkd[1471]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
May 15 12:30:50.668896 systemd-timesyncd[1460]: Network configuration changed, trying to establish connection.
May 15 12:30:50.669197 systemd[1]: Reached target time-set.target - System Time Set.
May 15 12:30:50.674779 systemd-resolved[1421]: Using system hostname 'ci-4334-0-0-a-dce95649a9'.
May 15 12:30:50.676134 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 15 12:30:50.676839 systemd[1]: Reached target network.target - Network.
May 15 12:30:50.677737 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 15 12:30:50.678325 systemd[1]: Reached target sysinit.target - System Initialization.
May 15 12:30:50.679128 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 15 12:30:50.685153 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 15 12:30:50.685810 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
May 15 12:30:50.686872 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 15 12:30:50.687879 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 15 12:30:50.688806 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 15 12:30:50.689373 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 15 12:30:50.689455 systemd[1]: Reached target paths.target - Path Units.
May 15 12:30:50.690154 systemd[1]: Reached target timers.target - Timer Units.
May 15 12:30:50.692570 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 15 12:30:50.694465 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
May 15 12:30:50.694507 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
May 15 12:30:50.703823 systemd-networkd[1471]: eth0: DHCPv4 address 37.27.185.109/32, gateway 172.31.1.1 acquired from 172.31.1.1
May 15 12:30:50.704870 systemd-timesyncd[1460]: Network configuration changed, trying to establish connection.
May 15 12:30:50.708043 kernel: Console: switching to colour dummy device 80x25
May 15 12:30:50.708093 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
May 15 12:30:50.708105 kernel: [drm] features: -context_init
May 15 12:30:50.710943 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
May 15 12:30:50.711116 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
May 15 12:30:50.712867 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 15 12:30:50.716563 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 15 12:30:50.716720 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 15 12:30:50.716803 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 15 12:30:50.718843 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 15 12:30:50.719225 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 15 12:30:50.721811 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 15 12:30:50.721998 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 15 12:30:50.724171 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
May 15 12:30:50.725988 systemd[1]: Reached target sockets.target - Socket Units.
May 15 12:30:50.726055 systemd[1]: Reached target basic.target - Basic System.
May 15 12:30:50.726165 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 15 12:30:50.726184 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 15 12:30:50.727175 systemd[1]: Starting containerd.service - containerd container runtime...
May 15 12:30:50.729330 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 15 12:30:50.731907 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 15 12:30:50.734872 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 15 12:30:50.742280 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 15 12:30:50.744924 kernel: [drm] number of scanouts: 1
May 15 12:30:50.745288 kernel: [drm] number of cap sets: 0
May 15 12:30:50.745480 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 15 12:30:50.745597 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 15 12:30:50.748540 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
May 15 12:30:50.756807 jq[1542]: false
May 15 12:30:50.754815 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 15 12:30:50.758206 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 15 12:30:50.766947 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
May 15 12:30:50.769953 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 15 12:30:50.773896 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 15 12:30:50.775984 google_oslogin_nss_cache[1546]: oslogin_cache_refresh[1546]: Refreshing passwd entry cache
May 15 12:30:50.776357 oslogin_cache_refresh[1546]: Refreshing passwd entry cache
May 15 12:30:50.777738 coreos-metadata[1537]: May 15 12:30:50.777 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
May 15 12:30:50.778895 systemd[1]: Starting systemd-logind.service - User Login Management...
May 15 12:30:50.779662 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 15 12:30:50.780137 coreos-metadata[1537]: May 15 12:30:50.779 INFO Fetch successful
May 15 12:30:50.780429 coreos-metadata[1537]: May 15 12:30:50.780 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
May 15 12:30:50.780978 coreos-metadata[1537]: May 15 12:30:50.780 INFO Fetch successful
May 15 12:30:50.782910 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 15 12:30:50.783159 google_oslogin_nss_cache[1546]: oslogin_cache_refresh[1546]: Failure getting users, quitting
May 15 12:30:50.783159 google_oslogin_nss_cache[1546]: oslogin_cache_refresh[1546]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 15 12:30:50.783159 google_oslogin_nss_cache[1546]: oslogin_cache_refresh[1546]: Refreshing group entry cache
May 15 12:30:50.782312 oslogin_cache_refresh[1546]: Failure getting users, quitting
May 15 12:30:50.782326 oslogin_cache_refresh[1546]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 15 12:30:50.782361 oslogin_cache_refresh[1546]: Refreshing group entry cache
May 15 12:30:50.785598 google_oslogin_nss_cache[1546]: oslogin_cache_refresh[1546]: Failure getting groups, quitting
May 15 12:30:50.785598 google_oslogin_nss_cache[1546]: oslogin_cache_refresh[1546]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 15 12:30:50.784903 systemd[1]: Starting update-engine.service - Update Engine...
May 15 12:30:50.783537 oslogin_cache_refresh[1546]: Failure getting groups, quitting
May 15 12:30:50.783544 oslogin_cache_refresh[1546]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 15 12:30:50.796430 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 15 12:30:50.818778 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
May 15 12:30:50.821633 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 15 12:30:50.822026 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 15 12:30:50.822185 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 15 12:30:50.822394 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
May 15 12:30:50.822526 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
May 15 12:30:50.823983 systemd[1]: motdgen.service: Deactivated successfully.
May 15 12:30:50.824126 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 15 12:30:50.825575 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 15 12:30:50.825713 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 15 12:30:50.830037 jq[1563]: true
May 15 12:30:50.831392 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
May 15 12:30:50.836506 extend-filesystems[1545]: Found loop4
May 15 12:30:50.836941 extend-filesystems[1545]: Found loop5
May 15 12:30:50.836941 extend-filesystems[1545]: Found loop6
May 15 12:30:50.836941 extend-filesystems[1545]: Found loop7
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda1
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda2
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda3
May 15 12:30:50.836941 extend-filesystems[1545]: Found usr
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda4
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda6
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda7
May 15 12:30:50.836941 extend-filesystems[1545]: Found sda9
May 15 12:30:50.840535 extend-filesystems[1545]: Checking size of /dev/sda9
May 15 12:30:50.845317 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 15 12:30:50.848870 update_engine[1557]: I20250515 12:30:50.847005  1557 main.cc:92] Flatcar Update Engine starting
May 15 12:30:50.855047 tar[1570]: linux-amd64/helm
May 15 12:30:50.853178 (ntainerd)[1571]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 15 12:30:50.873818 extend-filesystems[1545]: Resized partition /dev/sda9
May 15 12:30:50.880107 jq[1572]: true
May 15 12:30:50.883967 extend-filesystems[1591]: resize2fs 1.47.2 (1-Jan-2025)
May 15 12:30:50.897008 kernel: EDAC MC: Ver: 3.0.0
May 15 12:30:50.903134 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
May 15 12:30:50.903450 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 15 12:30:50.924462 dbus-daemon[1539]: [system] SELinux support is enabled
May 15 12:30:50.924582 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 15 12:30:50.929233 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 15 12:30:50.929263 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 15 12:30:50.929830 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 15 12:30:50.929846 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 15 12:30:50.951012 systemd[1]: Started update-engine.service - Update Engine.
May 15 12:30:50.952009 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 15 12:30:50.952551 update_engine[1557]: I20250515 12:30:50.952355  1557 update_check_scheduler.cc:74] Next update check in 10m42s
May 15 12:30:50.993095 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
May 15 12:30:50.993302 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 15 12:30:51.019791 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:30:51.019940 bash[1621]: Updated "/home/core/.ssh/authorized_keys"
May 15 12:30:51.020574 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 15 12:30:51.031994 systemd[1]: Starting sshkeys.service...
May 15 12:30:51.039775 kernel: EXT4-fs (sda9): resized filesystem to 9393147
May 15 12:30:51.057560 extend-filesystems[1591]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
May 15 12:30:51.057560 extend-filesystems[1591]: old_desc_blocks = 1, new_desc_blocks = 5
May 15 12:30:51.057560 extend-filesystems[1591]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
May 15 12:30:51.057798 extend-filesystems[1545]: Resized filesystem in /dev/sda9
May 15 12:30:51.057798 extend-filesystems[1545]: Found sr0
May 15 12:30:51.060536 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 15 12:30:51.061645 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 15 12:30:51.073438 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
May 15 12:30:51.079965 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
May 15 12:30:51.082800 containerd[1571]: time="2025-05-15T12:30:51Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 15 12:30:51.093368 containerd[1571]: time="2025-05-15T12:30:51.093325139Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 15 12:30:51.131431 containerd[1571]: time="2025-05-15T12:30:51.130574450Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.948µs"
May 15 12:30:51.131431 containerd[1571]: time="2025-05-15T12:30:51.130601962Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 15 12:30:51.131431 containerd[1571]: time="2025-05-15T12:30:51.130617782Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 15 12:30:51.131431 containerd[1571]: time="2025-05-15T12:30:51.130736014Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 15 12:30:51.133449 containerd[1571]: time="2025-05-15T12:30:51.133428373Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 15 12:30:51.133564 containerd[1571]: time="2025-05-15T12:30:51.133551093Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 15 12:30:51.133869 containerd[1571]: time="2025-05-15T12:30:51.133853611Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 15 12:30:51.134153 containerd[1571]: time="2025-05-15T12:30:51.134140509Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 15 12:30:51.139133 containerd[1571]: time="2025-05-15T12:30:51.139114447Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 15 12:30:51.139428 containerd[1571]: time="2025-05-15T12:30:51.139414440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 15 12:30:51.139491 containerd[1571]: time="2025-05-15T12:30:51.139477187Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 15 12:30:51.139718 containerd[1571]: time="2025-05-15T12:30:51.139707339Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 15 12:30:51.139860 containerd[1571]: time="2025-05-15T12:30:51.139844487Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 15 12:30:51.140349 containerd[1571]: time="2025-05-15T12:30:51.140333364Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 15 12:30:51.141035 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 15 12:30:51.141152 containerd[1571]: time="2025-05-15T12:30:51.141135628Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 15 12:30:51.141197 containerd[1571]: time="2025-05-15T12:30:51.141187024Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 15 12:30:51.141274 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:30:51.141473 containerd[1571]: time="2025-05-15T12:30:51.141458193Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 15 12:30:51.142330 containerd[1571]: time="2025-05-15T12:30:51.142315250Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 15 12:30:51.142423 containerd[1571]: time="2025-05-15T12:30:51.142410429Z" level=info msg="metadata content store policy set" policy=shared
May 15 12:30:51.146331 containerd[1571]: time="2025-05-15T12:30:51.146309642Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146801254Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146818326Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146829988Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146839305Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146873730Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146887937Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146897435Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146906080Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146913995Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146920608Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.146931688Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.147020254Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.147038349Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 15 12:30:51.147209 containerd[1571]: time="2025-05-15T12:30:51.147049480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147072112Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147081149Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147089315Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147098612Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147106256Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147114341Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147121976Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147129710Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147179333Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 15 12:30:51.147437 containerd[1571]: time="2025-05-15T12:30:51.147190023Z" level=info msg="Start snapshots syncer"
May 15 12:30:51.148493 containerd[1571]: time="2025-05-15T12:30:51.147881160Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 15 12:30:51.148493 containerd[1571]: time="2025-05-15T12:30:51.148084701Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 15 12:30:51.148601 containerd[1571]: time="2025-05-15T12:30:51.148122292Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 15 12:30:51.149824 containerd[1571]: time="2025-05-15T12:30:51.149805178Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 15 12:30:51.150210 containerd[1571]: time="2025-05-15T12:30:51.150191533Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 15 12:30:51.150274 containerd[1571]: time="2025-05-15T12:30:51.150262005Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 15 12:30:51.150321 containerd[1571]: time="2025-05-15T12:30:51.150310495Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 15 12:30:51.150365 containerd[1571]: time="2025-05-15T12:30:51.150355389Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 15 12:30:51.150424 containerd[1571]: time="2025-05-15T12:30:51.150410934Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151032680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151052958Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151106569Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151124743Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151135292Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151177441Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151199082Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151206416Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151214431Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151220211Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151274624Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151284402Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151297287Z" level=info msg="runtime interface created"
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151301304Z" level=info msg="created NRI interface"
May 15 12:30:51.151945 containerd[1571]: time="2025-05-15T12:30:51.151307485Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 15 12:30:51.152180 containerd[1571]: time="2025-05-15T12:30:51.151317995Z" level=info msg="Connect containerd service"
May 15 12:30:51.152180 containerd[1571]: time="2025-05-15T12:30:51.151340187Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 15 12:30:51.152345 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 15 12:30:51.153455 containerd[1571]: time="2025-05-15T12:30:51.153005861Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 15 12:30:51.160001 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 15 12:30:51.174860 coreos-metadata[1631]: May 15 12:30:51.173 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
May 15 12:30:51.174860 coreos-metadata[1631]: May 15 12:30:51.174 INFO Fetch successful
May 15 12:30:51.177173 unknown[1631]: wrote ssh authorized keys file for user: core
May 15 12:30:51.243876 systemd-logind[1554]: New seat seat0.
May 15 12:30:51.248695 update-ssh-keys[1648]: Updated "/home/core/.ssh/authorized_keys"
May 15 12:30:51.250690 systemd-logind[1554]: Watching system buttons on /dev/input/event3 (Power Button)
May 15 12:30:51.250708 systemd-logind[1554]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 15 12:30:51.250991 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
May 15 12:30:51.251280 systemd[1]: Started systemd-logind.service - User Login Management.
May 15 12:30:51.253995 systemd[1]: Finished sshkeys.service.
May 15 12:30:51.258494 locksmithd[1607]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 15 12:30:51.298490 containerd[1571]: time="2025-05-15T12:30:51.298105076Z" level=info msg="Start subscribing containerd event"
May 15 12:30:51.298490 containerd[1571]: time="2025-05-15T12:30:51.298150431Z" level=info msg="Start recovering state"
May 15 12:30:51.298779 containerd[1571]: time="2025-05-15T12:30:51.298631784Z" level=info msg="Start event monitor"
May 15 12:30:51.298779 containerd[1571]: time="2025-05-15T12:30:51.298679172Z" level=info msg="Start cni network conf syncer for default"
May 15 12:30:51.298779 containerd[1571]: time="2025-05-15T12:30:51.298724818Z" level=info msg="Start streaming server"
May 15 12:30:51.298779 containerd[1571]: time="2025-05-15T12:30:51.298733474Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
May 15 12:30:51.298779 containerd[1571]: time="2025-05-15T12:30:51.298739656Z" level=info msg="runtime interface starting up..."
May 15 12:30:51.298779 containerd[1571]: time="2025-05-15T12:30:51.298743794Z" level=info msg="starting plugins..."
May 15 12:30:51.298984 containerd[1571]: time="2025-05-15T12:30:51.298658774Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
May 15 12:30:51.299105 containerd[1571]: time="2025-05-15T12:30:51.299074263Z" level=info msg=serving... address=/run/containerd/containerd.sock
May 15 12:30:51.299229 containerd[1571]: time="2025-05-15T12:30:51.298927478Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
May 15 12:30:51.299581 systemd[1]: Started containerd.service - containerd container runtime.
May 15 12:30:51.299958 containerd[1571]: time="2025-05-15T12:30:51.299945518Z" level=info msg="containerd successfully booted in 0.225607s"
May 15 12:30:51.344612 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 15 12:30:51.417037 sshd_keygen[1569]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 15 12:30:51.448906 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 15 12:30:51.452909 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 15 12:30:51.465114 systemd[1]: issuegen.service: Deactivated successfully.
May 15 12:30:51.465447 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 15 12:30:51.469131 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 15 12:30:51.480684 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 15 12:30:51.484028 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 15 12:30:51.485287 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 15 12:30:51.485931 systemd[1]: Reached target getty.target - Login Prompts.
May 15 12:30:51.585425 tar[1570]: linux-amd64/LICENSE
May 15 12:30:51.585520 tar[1570]: linux-amd64/README.md
May 15 12:30:51.599531 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 15 12:30:51.956960 systemd-networkd[1471]: eth1: Gained IPv6LL
May 15 12:30:51.957560 systemd-timesyncd[1460]: Network configuration changed, trying to establish connection.
May 15 12:30:51.959610 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 15 12:30:51.960198 systemd[1]: Reached target network-online.target - Network is Online.
May 15 12:30:51.962492 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:30:51.964950 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 15 12:30:51.993654 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 15 12:30:52.276976 systemd-networkd[1471]: eth0: Gained IPv6LL
May 15 12:30:52.277455 systemd-timesyncd[1460]: Network configuration changed, trying to establish connection.
May 15 12:30:52.727683 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:30:52.728204 systemd[1]: Reached target multi-user.target - Multi-User System.
May 15 12:30:52.729816 systemd[1]: Startup finished in 2.984s (kernel) + 9.147s (initrd) + 3.997s (userspace) = 16.129s.
May 15 12:30:52.731041 (kubelet)[1703]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:30:53.330510 kubelet[1703]: E0515 12:30:53.330435    1703 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:30:53.332731 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:30:53.332874 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:30:53.333131 systemd[1]: kubelet.service: Consumed 803ms CPU time, 243.1M memory peak.
May 15 12:31:03.335106 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 15 12:31:03.337242 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:31:03.450368 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:31:03.457063 (kubelet)[1724]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:31:03.508202 kubelet[1724]: E0515 12:31:03.508135    1724 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:31:03.513087 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:31:03.513272 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:31:03.513620 systemd[1]: kubelet.service: Consumed 138ms CPU time, 94.6M memory peak.
May 15 12:31:13.584904 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 15 12:31:13.586304 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:31:13.687276 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:31:13.689716 (kubelet)[1740]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:31:13.723005 kubelet[1740]: E0515 12:31:13.722946    1740 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:31:13.724672 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:31:13.724825 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:31:13.725101 systemd[1]: kubelet.service: Consumed 106ms CPU time, 96.5M memory peak.
May 15 12:31:23.254617 systemd-timesyncd[1460]: Contacted time server 217.79.189.239:123 (2.flatcar.pool.ntp.org).
May 15 12:31:23.254673 systemd-timesyncd[1460]: Initial clock synchronization to Thu 2025-05-15 12:31:23.254470 UTC.
May 15 12:31:23.254758 systemd-resolved[1421]: Clock change detected. Flushing caches.
May 15 12:31:24.394984 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 15 12:31:24.398195 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:31:24.536985 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:31:24.539232 (kubelet)[1756]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:31:24.567635 kubelet[1756]: E0515 12:31:24.567588 1756 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:31:24.569795 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:31:24.569910 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:31:24.570155 systemd[1]: kubelet.service: Consumed 125ms CPU time, 95.7M memory peak.
May 15 12:31:27.892040 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 15 12:31:27.893578 systemd[1]: Started sshd@0-37.27.185.109:22-39.109.116.40:34120.service - OpenSSH per-connection server daemon (39.109.116.40:34120).
May 15 12:31:29.573425 sshd[1765]: Invalid user bb from 39.109.116.40 port 34120
May 15 12:31:29.896788 sshd[1765]: Received disconnect from 39.109.116.40 port 34120:11: Bye Bye [preauth]
May 15 12:31:29.896788 sshd[1765]: Disconnected from invalid user bb 39.109.116.40 port 34120 [preauth]
May 15 12:31:29.898880 systemd[1]: sshd@0-37.27.185.109:22-39.109.116.40:34120.service: Deactivated successfully.
May 15 12:31:34.644507 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
May 15 12:31:34.645941 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:31:34.736119 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:31:34.745682 (kubelet)[1777]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:31:34.785909 kubelet[1777]: E0515 12:31:34.785858 1777 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:31:34.787933 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:31:34.788044 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:31:34.788270 systemd[1]: kubelet.service: Consumed 109ms CPU time, 96.1M memory peak.
May 15 12:31:37.196567 update_engine[1557]: I20250515 12:31:37.196466 1557 update_attempter.cc:509] Updating boot flags...
May 15 12:31:44.894442 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
May 15 12:31:44.895797 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:31:45.049293 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:31:45.058653 (kubelet)[1817]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:31:45.095894 kubelet[1817]: E0515 12:31:45.095824 1817 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:31:45.098142 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:31:45.098254 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:31:45.098666 systemd[1]: kubelet.service: Consumed 145ms CPU time, 95.4M memory peak.
May 15 12:31:55.144625 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
May 15 12:31:55.146169 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:31:55.240174 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:31:55.245667 (kubelet)[1833]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:31:55.281528 kubelet[1833]: E0515 12:31:55.281465 1833 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:31:55.283749 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:31:55.283951 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:31:55.284274 systemd[1]: kubelet.service: Consumed 110ms CPU time, 95.5M memory peak.
May 15 12:32:05.394424 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
May 15 12:32:05.396106 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:32:05.511901 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:32:05.530655 (kubelet)[1849]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:32:05.573888 kubelet[1849]: E0515 12:32:05.573824 1849 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:32:05.575919 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:32:05.576054 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:32:05.576341 systemd[1]: kubelet.service: Consumed 124ms CPU time, 94.3M memory peak.
May 15 12:32:15.644734 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8.
May 15 12:32:15.647280 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:32:15.740002 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:32:15.748560 (kubelet)[1865]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:32:15.781059 kubelet[1865]: E0515 12:32:15.781003 1865 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:32:15.783104 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:32:15.783215 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:32:15.783568 systemd[1]: kubelet.service: Consumed 108ms CPU time, 95.7M memory peak.
May 15 12:32:25.894484 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9.
May 15 12:32:25.896106 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:32:26.020850 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:32:26.030652 (kubelet)[1881]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:32:26.060046 kubelet[1881]: E0515 12:32:26.059988 1881 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:32:26.061790 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:32:26.061903 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:32:26.062135 systemd[1]: kubelet.service: Consumed 127ms CPU time, 95.5M memory peak.
May 15 12:32:36.144555 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10.
May 15 12:32:36.146367 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:32:36.263271 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:32:36.268508 (kubelet)[1897]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:32:36.296289 kubelet[1897]: E0515 12:32:36.296260 1897 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:32:36.299059 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:32:36.299256 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:32:36.299583 systemd[1]: kubelet.service: Consumed 118ms CPU time, 95.7M memory peak.
May 15 12:32:46.395131 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11.
May 15 12:32:46.397397 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:32:46.491192 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:32:46.494181 (kubelet)[1913]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:32:46.533114 kubelet[1913]: E0515 12:32:46.533065 1913 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:32:46.534472 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:32:46.534583 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:32:46.534822 systemd[1]: kubelet.service: Consumed 110ms CPU time, 95.5M memory peak.
May 15 12:32:56.645060 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12.
May 15 12:32:56.647689 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:32:56.772072 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:32:56.777529 (kubelet)[1929]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:32:56.818202 kubelet[1929]: E0515 12:32:56.818163 1929 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:32:56.820030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:32:56.820236 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:32:56.820759 systemd[1]: kubelet.service: Consumed 117ms CPU time, 95.9M memory peak.
May 15 12:33:06.894400 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13.
May 15 12:33:06.895984 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:33:06.995234 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:33:07.000637 (kubelet)[1946]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:33:07.036388 kubelet[1946]: E0515 12:33:07.036293 1946 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:33:07.038511 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:33:07.038639 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:33:07.038882 systemd[1]: kubelet.service: Consumed 109ms CPU time, 95.2M memory peak.
May 15 12:33:17.144482 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 14.
May 15 12:33:17.146295 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:33:17.289369 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:33:17.301592 (kubelet)[1962]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:33:17.345254 kubelet[1962]: E0515 12:33:17.345180 1962 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:33:17.348243 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:33:17.348491 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:33:17.348893 systemd[1]: kubelet.service: Consumed 151ms CPU time, 95.6M memory peak.
May 15 12:33:27.394510 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 15.
May 15 12:33:27.396374 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:33:27.526263 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:33:27.531666 (kubelet)[1977]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:33:27.563944 kubelet[1977]: E0515 12:33:27.563885 1977 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:33:27.565898 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:33:27.566055 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:33:27.566407 systemd[1]: kubelet.service: Consumed 108ms CPU time, 95.8M memory peak.
May 15 12:33:37.644499 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 16.
May 15 12:33:37.646074 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:33:37.767152 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:33:37.775517 (kubelet)[1993]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:33:37.807498 kubelet[1993]: E0515 12:33:37.807421 1993 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:33:37.810979 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:33:37.811102 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:33:37.811427 systemd[1]: kubelet.service: Consumed 116ms CPU time, 95.6M memory peak.
May 15 12:33:47.894499 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 17.
May 15 12:33:47.895932 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:33:48.012842 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:33:48.019549 (kubelet)[2009]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:33:48.050894 kubelet[2009]: E0515 12:33:48.050824 2009 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:33:48.053070 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:33:48.053189 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:33:48.053473 systemd[1]: kubelet.service: Consumed 126ms CPU time, 95.3M memory peak.
May 15 12:33:58.144432 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 18.
May 15 12:33:58.146493 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:33:58.256125 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:33:58.262566 (kubelet)[2025]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:33:58.297028 kubelet[2025]: E0515 12:33:58.296970 2025 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:33:58.299175 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:33:58.299287 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:33:58.299621 systemd[1]: kubelet.service: Consumed 109ms CPU time, 95.4M memory peak.
May 15 12:34:08.394647 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 19.
May 15 12:34:08.396699 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:34:08.504513 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:34:08.513864 (kubelet)[2041]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:34:08.559109 kubelet[2041]: E0515 12:34:08.559041 2041 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:34:08.561668 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:34:08.561863 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:34:08.562160 systemd[1]: kubelet.service: Consumed 121ms CPU time, 95.7M memory peak.
May 15 12:34:18.644504 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 20.
May 15 12:34:18.646286 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:34:18.749293 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:34:18.753589 (kubelet)[2057]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:34:18.785953 kubelet[2057]: E0515 12:34:18.785897 2057 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:34:18.788153 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:34:18.788300 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:34:18.788587 systemd[1]: kubelet.service: Consumed 107ms CPU time, 95.6M memory peak.
May 15 12:34:28.894365 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 21.
May 15 12:34:28.895748 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:34:29.016879 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:34:29.027627 (kubelet)[2073]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:34:29.062696 kubelet[2073]: E0515 12:34:29.062637 2073 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:34:29.064891 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:34:29.065002 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:34:29.065261 systemd[1]: kubelet.service: Consumed 115ms CPU time, 94M memory peak.
May 15 12:34:39.144574 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 22.
May 15 12:34:39.146025 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:34:39.240117 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:34:39.245541 (kubelet)[2090]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:34:39.278806 kubelet[2090]: E0515 12:34:39.278743 2090 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:34:39.281212 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:34:39.281364 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:34:39.281688 systemd[1]: kubelet.service: Consumed 111ms CPU time, 95.5M memory peak.
May 15 12:34:40.013944 systemd[1]: Started sshd@1-37.27.185.109:22-147.75.109.163:42448.service - OpenSSH per-connection server daemon (147.75.109.163:42448).
May 15 12:34:40.996553 sshd[2099]: Accepted publickey for core from 147.75.109.163 port 42448 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:34:40.998656 sshd-session[2099]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:34:41.008747 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 15 12:34:41.010667 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 15 12:34:41.024418 systemd-logind[1554]: New session 1 of user core.
May 15 12:34:41.032091 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 15 12:34:41.038020 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 15 12:34:41.060901 (systemd)[2103]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 15 12:34:41.064487 systemd-logind[1554]: New session c1 of user core.
May 15 12:34:41.211064 systemd[2103]: Queued start job for default target default.target.
May 15 12:34:41.217073 systemd[2103]: Created slice app.slice - User Application Slice.
May 15 12:34:41.217099 systemd[2103]: Reached target paths.target - Paths.
May 15 12:34:41.217134 systemd[2103]: Reached target timers.target - Timers.
May 15 12:34:41.218115 systemd[2103]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 15 12:34:41.227141 systemd[2103]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 15 12:34:41.227182 systemd[2103]: Reached target sockets.target - Sockets.
May 15 12:34:41.227219 systemd[2103]: Reached target basic.target - Basic System.
May 15 12:34:41.227247 systemd[2103]: Reached target default.target - Main User Target.
May 15 12:34:41.227275 systemd[2103]: Startup finished in 152ms.
May 15 12:34:41.227353 systemd[1]: Started user@500.service - User Manager for UID 500.
May 15 12:34:41.232488 systemd[1]: Started session-1.scope - Session 1 of User core.
May 15 12:34:41.918702 systemd[1]: Started sshd@2-37.27.185.109:22-147.75.109.163:42450.service - OpenSSH per-connection server daemon (147.75.109.163:42450).
May 15 12:34:42.906162 sshd[2114]: Accepted publickey for core from 147.75.109.163 port 42450 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:34:42.907406 sshd-session[2114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:34:42.912795 systemd-logind[1554]: New session 2 of user core.
May 15 12:34:42.922527 systemd[1]: Started session-2.scope - Session 2 of User core.
May 15 12:34:43.578572 sshd[2116]: Connection closed by 147.75.109.163 port 42450
May 15 12:34:43.579564 sshd-session[2114]: pam_unix(sshd:session): session closed for user core
May 15 12:34:43.584184 systemd[1]: sshd@2-37.27.185.109:22-147.75.109.163:42450.service: Deactivated successfully.
May 15 12:34:43.586973 systemd[1]: session-2.scope: Deactivated successfully.
May 15 12:34:43.590503 systemd-logind[1554]: Session 2 logged out. Waiting for processes to exit.
May 15 12:34:43.592403 systemd-logind[1554]: Removed session 2.
May 15 12:34:43.750743 systemd[1]: Started sshd@3-37.27.185.109:22-147.75.109.163:42452.service - OpenSSH per-connection server daemon (147.75.109.163:42452).
May 15 12:34:44.742999 sshd[2122]: Accepted publickey for core from 147.75.109.163 port 42452 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:34:44.744216 sshd-session[2122]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:34:44.749063 systemd-logind[1554]: New session 3 of user core.
May 15 12:34:44.755474 systemd[1]: Started session-3.scope - Session 3 of User core.
May 15 12:34:45.414501 sshd[2124]: Connection closed by 147.75.109.163 port 42452
May 15 12:34:45.415292 sshd-session[2122]: pam_unix(sshd:session): session closed for user core
May 15 12:34:45.420702 systemd-logind[1554]: Session 3 logged out. Waiting for processes to exit.
May 15 12:34:45.421106 systemd[1]: sshd@3-37.27.185.109:22-147.75.109.163:42452.service: Deactivated successfully.
May 15 12:34:45.423615 systemd[1]: session-3.scope: Deactivated successfully.
May 15 12:34:45.426015 systemd-logind[1554]: Removed session 3.
May 15 12:34:45.588779 systemd[1]: Started sshd@4-37.27.185.109:22-147.75.109.163:42466.service - OpenSSH per-connection server daemon (147.75.109.163:42466).
May 15 12:34:46.562306 sshd[2130]: Accepted publickey for core from 147.75.109.163 port 42466 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:34:46.564027 sshd-session[2130]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:34:46.570863 systemd-logind[1554]: New session 4 of user core.
May 15 12:34:46.576518 systemd[1]: Started session-4.scope - Session 4 of User core.
May 15 12:34:47.235690 sshd[2132]: Connection closed by 147.75.109.163 port 42466
May 15 12:34:47.236227 sshd-session[2130]: pam_unix(sshd:session): session closed for user core
May 15 12:34:47.239418 systemd[1]: sshd@4-37.27.185.109:22-147.75.109.163:42466.service: Deactivated successfully.
May 15 12:34:47.241612 systemd[1]: session-4.scope: Deactivated successfully.
May 15 12:34:47.242646 systemd-logind[1554]: Session 4 logged out. Waiting for processes to exit.
May 15 12:34:47.243894 systemd-logind[1554]: Removed session 4.
May 15 12:34:47.416148 systemd[1]: Started sshd@5-37.27.185.109:22-147.75.109.163:42474.service - OpenSSH per-connection server daemon (147.75.109.163:42474).
May 15 12:34:48.401458 sshd[2138]: Accepted publickey for core from 147.75.109.163 port 42474 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:34:48.402928 sshd-session[2138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:34:48.407991 systemd-logind[1554]: New session 5 of user core.
May 15 12:34:48.417529 systemd[1]: Started session-5.scope - Session 5 of User core.
May 15 12:34:48.928095 sudo[2141]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 15 12:34:48.928411 sudo[2141]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 15 12:34:48.937710 sudo[2141]: pam_unix(sudo:session): session closed for user root
May 15 12:34:49.095592 sshd[2140]: Connection closed by 147.75.109.163 port 42474
May 15 12:34:49.096274 sshd-session[2138]: pam_unix(sshd:session): session closed for user core
May 15 12:34:49.099750 systemd[1]: sshd@5-37.27.185.109:22-147.75.109.163:42474.service: Deactivated successfully.
May 15 12:34:49.101404 systemd[1]: session-5.scope: Deactivated successfully.
May 15 12:34:49.102590 systemd-logind[1554]: Session 5 logged out. Waiting for processes to exit.
May 15 12:34:49.104165 systemd-logind[1554]: Removed session 5.
May 15 12:34:49.273811 systemd[1]: Started sshd@6-37.27.185.109:22-147.75.109.163:44204.service - OpenSSH per-connection server daemon (147.75.109.163:44204).
May 15 12:34:49.283312 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 23.
May 15 12:34:49.286402 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 15 12:34:49.398252 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 15 12:34:49.404661 (kubelet)[2157]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 15 12:34:49.440508 kubelet[2157]: E0515 12:34:49.440452 2157 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 15 12:34:49.442481 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 15 12:34:49.442591 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 15 12:34:49.442999 systemd[1]: kubelet.service: Consumed 112ms CPU time, 93.9M memory peak.
May 15 12:34:50.250871 sshd[2147]: Accepted publickey for core from 147.75.109.163 port 44204 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:34:50.252189 sshd-session[2147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:34:50.257493 systemd-logind[1554]: New session 6 of user core.
May 15 12:34:50.263477 systemd[1]: Started session-6.scope - Session 6 of User core.
May 15 12:34:50.766254 sudo[2167]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 15 12:34:50.766548 sudo[2167]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 15 12:34:50.770910 sudo[2167]: pam_unix(sudo:session): session closed for user root
May 15 12:34:50.775543 sudo[2166]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 15 12:34:50.775811 sudo[2166]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 15 12:34:50.784983 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 15 12:34:50.820804 augenrules[2189]: No rules May 15 12:34:50.821367 systemd[1]: audit-rules.service: Deactivated successfully. May 15 12:34:50.821567 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 15 12:34:50.822522 sudo[2166]: pam_unix(sudo:session): session closed for user root May 15 12:34:50.979745 sshd[2165]: Connection closed by 147.75.109.163 port 44204 May 15 12:34:50.980246 sshd-session[2147]: pam_unix(sshd:session): session closed for user core May 15 12:34:50.982722 systemd[1]: sshd@6-37.27.185.109:22-147.75.109.163:44204.service: Deactivated successfully. May 15 12:34:50.984126 systemd[1]: session-6.scope: Deactivated successfully. May 15 12:34:50.984993 systemd-logind[1554]: Session 6 logged out. Waiting for processes to exit. May 15 12:34:50.986190 systemd-logind[1554]: Removed session 6. May 15 12:34:51.149823 systemd[1]: Started sshd@7-37.27.185.109:22-147.75.109.163:44206.service - OpenSSH per-connection server daemon (147.75.109.163:44206). May 15 12:34:52.124582 sshd[2198]: Accepted publickey for core from 147.75.109.163 port 44206 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 12:34:52.125815 sshd-session[2198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 12:34:52.131389 systemd-logind[1554]: New session 7 of user core. May 15 12:34:52.139487 systemd[1]: Started session-7.scope - Session 7 of User core. May 15 12:34:52.639173 sudo[2201]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 15 12:34:52.639457 sudo[2201]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 12:34:52.895888 systemd[1]: Starting docker.service - Docker Application Container Engine... 
May 15 12:34:52.909687 (dockerd)[2219]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 15 12:34:53.100920 dockerd[2219]: time="2025-05-15T12:34:53.100863468Z" level=info msg="Starting up" May 15 12:34:53.102734 dockerd[2219]: time="2025-05-15T12:34:53.102704383Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 15 12:34:53.127482 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2233978542-merged.mount: Deactivated successfully. May 15 12:34:53.157376 dockerd[2219]: time="2025-05-15T12:34:53.156969703Z" level=info msg="Loading containers: start." May 15 12:34:53.169369 kernel: Initializing XFRM netlink socket May 15 12:34:53.378862 systemd-networkd[1471]: docker0: Link UP May 15 12:34:53.382940 dockerd[2219]: time="2025-05-15T12:34:53.382890170Z" level=info msg="Loading containers: done." May 15 12:34:53.395554 dockerd[2219]: time="2025-05-15T12:34:53.395504233Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 15 12:34:53.395672 dockerd[2219]: time="2025-05-15T12:34:53.395589011Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 15 12:34:53.395696 dockerd[2219]: time="2025-05-15T12:34:53.395678680Z" level=info msg="Initializing buildkit" May 15 12:34:53.416647 dockerd[2219]: time="2025-05-15T12:34:53.416245828Z" level=info msg="Completed buildkit initialization" May 15 12:34:53.424360 dockerd[2219]: time="2025-05-15T12:34:53.424093015Z" level=info msg="Daemon has completed initialization" May 15 12:34:53.424360 dockerd[2219]: time="2025-05-15T12:34:53.424158558Z" level=info msg="API listen on /run/docker.sock"
May 15 12:34:53.424293 systemd[1]: Started docker.service - Docker Application Container Engine. May 15 12:34:54.548608 containerd[1571]: time="2025-05-15T12:34:54.548461732Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\"" May 15 12:34:55.131531 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2773645956.mount: Deactivated successfully. May 15 12:34:56.338821 containerd[1571]: time="2025-05-15T12:34:56.338768170Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:56.339845 containerd[1571]: time="2025-05-15T12:34:56.339810154Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.12: active requests=0, bytes read=32674967" May 15 12:34:56.340817 containerd[1571]: time="2025-05-15T12:34:56.340779604Z" level=info msg="ImageCreate event name:\"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:56.343105 containerd[1571]: time="2025-05-15T12:34:56.343071576Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:56.344070 containerd[1571]: time="2025-05-15T12:34:56.343915279Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.12\" with image id \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\", size \"32671673\" in 1.795307723s" May 15 12:34:56.344070 containerd[1571]: time="2025-05-15T12:34:56.343942339Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\" returns image reference \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\"" May 15
12:34:56.356889 containerd[1571]: time="2025-05-15T12:34:56.356854401Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\"" May 15 12:34:57.775995 containerd[1571]: time="2025-05-15T12:34:57.775935388Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:57.776853 containerd[1571]: time="2025-05-15T12:34:57.776828244Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.12: active requests=0, bytes read=29617556" May 15 12:34:57.777596 containerd[1571]: time="2025-05-15T12:34:57.777563764Z" level=info msg="ImageCreate event name:\"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:57.779672 containerd[1571]: time="2025-05-15T12:34:57.779633277Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:57.780283 containerd[1571]: time="2025-05-15T12:34:57.780149095Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.12\" with image id \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\", size \"31105907\" in 1.423264047s" May 15 12:34:57.780283 containerd[1571]: time="2025-05-15T12:34:57.780171608Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\" returns image reference \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\""
May 15 12:34:57.794661 containerd[1571]: time="2025-05-15T12:34:57.794560882Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\"" May 15 12:34:59.282271 containerd[1571]: time="2025-05-15T12:34:59.282219202Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:59.283291 containerd[1571]: time="2025-05-15T12:34:59.283255828Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.12: active requests=0, bytes read=17903704" May 15 12:34:59.284361 containerd[1571]: time="2025-05-15T12:34:59.284308844Z" level=info msg="ImageCreate event name:\"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:59.289342 containerd[1571]: time="2025-05-15T12:34:59.288887016Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:34:59.290017 containerd[1571]: time="2025-05-15T12:34:59.289995135Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.12\" with image id \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\", size \"19392073\" in 1.495333013s" May 15 12:34:59.290094 containerd[1571]: time="2025-05-15T12:34:59.290082078Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\" returns image reference \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\"" May 15 12:34:59.305825 containerd[1571]: time="2025-05-15T12:34:59.305750222Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\"" May 15 12:34:59.644736 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 24.
May 15 12:34:59.647198 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:34:59.746940 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:34:59.757088 (kubelet)[2522]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:34:59.799396 kubelet[2522]: E0515 12:34:59.799341 2522 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:34:59.801665 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:34:59.801780 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:34:59.802149 systemd[1]: kubelet.service: Consumed 133ms CPU time, 95.8M memory peak. May 15 12:35:00.219614 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3765913381.mount: Deactivated successfully. 
May 15 12:35:00.507012 containerd[1571]: time="2025-05-15T12:35:00.506892946Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:00.508097 containerd[1571]: time="2025-05-15T12:35:00.507915945Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.12: active requests=0, bytes read=29185845" May 15 12:35:00.508820 containerd[1571]: time="2025-05-15T12:35:00.508791599Z" level=info msg="ImageCreate event name:\"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:00.510294 containerd[1571]: time="2025-05-15T12:35:00.510267418Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:00.510750 containerd[1571]: time="2025-05-15T12:35:00.510721070Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.12\" with image id \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\", repo tag \"registry.k8s.io/kube-proxy:v1.30.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\", size \"29184836\" in 1.204907269s" May 15 12:35:00.511065 containerd[1571]: time="2025-05-15T12:35:00.510803605Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\" returns image reference \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\"" May 15 12:35:00.524347 containerd[1571]: time="2025-05-15T12:35:00.524317886Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 15 12:35:01.062098 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3023552833.mount: Deactivated successfully. 
May 15 12:35:01.706634 containerd[1571]: time="2025-05-15T12:35:01.706588426Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:01.707671 containerd[1571]: time="2025-05-15T12:35:01.707643076Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185843" May 15 12:35:01.708656 containerd[1571]: time="2025-05-15T12:35:01.708359770Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:01.710300 containerd[1571]: time="2025-05-15T12:35:01.710279494Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:01.710994 containerd[1571]: time="2025-05-15T12:35:01.710969758Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.186495159s" May 15 12:35:01.711040 containerd[1571]: time="2025-05-15T12:35:01.710995987Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 15 12:35:01.723923 containerd[1571]: time="2025-05-15T12:35:01.723865880Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" May 15 12:35:02.176269 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3320661948.mount: Deactivated successfully. 
May 15 12:35:02.184284 containerd[1571]: time="2025-05-15T12:35:02.184234784Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:02.184983 containerd[1571]: time="2025-05-15T12:35:02.184946440Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322312" May 15 12:35:02.185995 containerd[1571]: time="2025-05-15T12:35:02.185953148Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:02.187789 containerd[1571]: time="2025-05-15T12:35:02.187748728Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:02.188469 containerd[1571]: time="2025-05-15T12:35:02.188298831Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 464.256039ms" May 15 12:35:02.188469 containerd[1571]: time="2025-05-15T12:35:02.188346060Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" May 15 12:35:02.202673 containerd[1571]: time="2025-05-15T12:35:02.202645123Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" May 15 12:35:02.734617 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount998128923.mount: Deactivated successfully. 
May 15 12:35:04.482965 containerd[1571]: time="2025-05-15T12:35:04.482910911Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:04.483947 containerd[1571]: time="2025-05-15T12:35:04.483678532Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238653" May 15 12:35:04.484815 containerd[1571]: time="2025-05-15T12:35:04.484783815Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:04.486837 containerd[1571]: time="2025-05-15T12:35:04.486805550Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:04.487666 containerd[1571]: time="2025-05-15T12:35:04.487637771Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 2.284816586s" May 15 12:35:04.487710 containerd[1571]: time="2025-05-15T12:35:04.487668940Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" May 15 12:35:07.268733 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:35:07.268933 systemd[1]: kubelet.service: Consumed 133ms CPU time, 95.8M memory peak. May 15 12:35:07.273484 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:35:07.286941 systemd[1]: Reload requested from client PID 2738 ('systemctl') (unit session-7.scope)... 
May 15 12:35:07.286957 systemd[1]: Reloading... May 15 12:35:07.391389 zram_generator::config[2783]: No configuration found. May 15 12:35:07.464918 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:35:07.559910 systemd[1]: Reloading finished in 272 ms. May 15 12:35:07.607711 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 15 12:35:07.607778 systemd[1]: kubelet.service: Failed with result 'signal'. May 15 12:35:07.607998 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:35:07.608035 systemd[1]: kubelet.service: Consumed 67ms CPU time, 83.6M memory peak. May 15 12:35:07.609544 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:35:07.697670 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:35:07.703532 (kubelet)[2837]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 12:35:07.747370 kubelet[2837]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 12:35:07.747370 kubelet[2837]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 15 12:35:07.747370 kubelet[2837]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 15 12:35:07.747702 kubelet[2837]: I0515 12:35:07.747398 2837 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 12:35:07.986304 kubelet[2837]: I0515 12:35:07.986264 2837 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 15 12:35:07.986304 kubelet[2837]: I0515 12:35:07.986289 2837 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 12:35:07.986516 kubelet[2837]: I0515 12:35:07.986492 2837 server.go:927] "Client rotation is on, will bootstrap in background" May 15 12:35:08.015248 kubelet[2837]: I0515 12:35:08.015198 2837 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 12:35:08.017036 kubelet[2837]: E0515 12:35:08.016951 2837 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://37.27.185.109:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.028403 kubelet[2837]: I0515 12:35:08.028363 2837 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 15 12:35:08.031172 kubelet[2837]: I0515 12:35:08.030809 2837 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 12:35:08.032219 kubelet[2837]: I0515 12:35:08.030995 2837 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334-0-0-a-dce95649a9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 15 12:35:08.032832 kubelet[2837]: I0515 12:35:08.032799 2837 topology_manager.go:138] "Creating topology manager with none policy" May 
15 12:35:08.032832 kubelet[2837]: I0515 12:35:08.032820 2837 container_manager_linux.go:301] "Creating device plugin manager" May 15 12:35:08.032913 kubelet[2837]: I0515 12:35:08.032898 2837 state_mem.go:36] "Initialized new in-memory state store" May 15 12:35:08.033517 kubelet[2837]: I0515 12:35:08.033498 2837 kubelet.go:400] "Attempting to sync node with API server" May 15 12:35:08.033517 kubelet[2837]: I0515 12:35:08.033514 2837 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 12:35:08.034360 kubelet[2837]: I0515 12:35:08.034022 2837 kubelet.go:312] "Adding apiserver pod source" May 15 12:35:08.034360 kubelet[2837]: I0515 12:35:08.034043 2837 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 12:35:08.034360 kubelet[2837]: W0515 12:35:08.034111 2837 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://37.27.185.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334-0-0-a-dce95649a9&limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.034360 kubelet[2837]: E0515 12:35:08.034180 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://37.27.185.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334-0-0-a-dce95649a9&limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.036153 kubelet[2837]: W0515 12:35:08.036117 2837 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://37.27.185.109:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.036203 kubelet[2837]: E0515 12:35:08.036158 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
"https://37.27.185.109:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.036618 kubelet[2837]: I0515 12:35:08.036596 2837 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 12:35:08.039032 kubelet[2837]: I0515 12:35:08.038001 2837 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 12:35:08.039032 kubelet[2837]: W0515 12:35:08.038042 2837 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 15 12:35:08.039032 kubelet[2837]: I0515 12:35:08.038475 2837 server.go:1264] "Started kubelet" May 15 12:35:08.042094 kubelet[2837]: I0515 12:35:08.041864 2837 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 15 12:35:08.043956 kubelet[2837]: I0515 12:35:08.043485 2837 server.go:455] "Adding debug handlers to kubelet server" May 15 12:35:08.045242 kubelet[2837]: I0515 12:35:08.045178 2837 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 12:35:08.045474 kubelet[2837]: I0515 12:35:08.045449 2837 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 12:35:08.045948 kubelet[2837]: E0515 12:35:08.045582 2837 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://37.27.185.109:6443/api/v1/namespaces/default/events\": dial tcp 37.27.185.109:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334-0-0-a-dce95649a9.183fb377e93ab0f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334-0-0-a-dce95649a9,UID:ci-4334-0-0-a-dce95649a9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting 
kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334-0-0-a-dce95649a9,},FirstTimestamp:2025-05-15 12:35:08.038459633 +0000 UTC m=+0.331102463,LastTimestamp:2025-05-15 12:35:08.038459633 +0000 UTC m=+0.331102463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334-0-0-a-dce95649a9,}" May 15 12:35:08.047825 kubelet[2837]: I0515 12:35:08.047719 2837 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 12:35:08.049296 kubelet[2837]: E0515 12:35:08.049271 2837 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334-0-0-a-dce95649a9\" not found" May 15 12:35:08.051933 kubelet[2837]: I0515 12:35:08.051522 2837 volume_manager.go:291] "Starting Kubelet Volume Manager" May 15 12:35:08.051933 kubelet[2837]: I0515 12:35:08.051591 2837 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 12:35:08.051933 kubelet[2837]: I0515 12:35:08.051622 2837 reconciler.go:26] "Reconciler: start to sync state" May 15 12:35:08.051933 kubelet[2837]: W0515 12:35:08.051810 2837 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://37.27.185.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.051933 kubelet[2837]: E0515 12:35:08.051837 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://37.27.185.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.054577 kubelet[2837]: E0515 12:35:08.053284 2837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://37.27.185.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334-0-0-a-dce95649a9?timeout=10s\": 
dial tcp 37.27.185.109:6443: connect: connection refused" interval="200ms" May 15 12:35:08.057167 kubelet[2837]: I0515 12:35:08.057154 2837 factory.go:221] Registration of the systemd container factory successfully May 15 12:35:08.057405 kubelet[2837]: I0515 12:35:08.057391 2837 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 12:35:08.058554 kubelet[2837]: E0515 12:35:08.058542 2837 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 12:35:08.058762 kubelet[2837]: I0515 12:35:08.058750 2837 factory.go:221] Registration of the containerd container factory successfully May 15 12:35:08.061428 kubelet[2837]: I0515 12:35:08.061196 2837 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 15 12:35:08.062017 kubelet[2837]: I0515 12:35:08.061991 2837 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 15 12:35:08.064846 kubelet[2837]: I0515 12:35:08.062020 2837 status_manager.go:217] "Starting to sync pod status with apiserver" May 15 12:35:08.064846 kubelet[2837]: I0515 12:35:08.062037 2837 kubelet.go:2337] "Starting kubelet main sync loop" May 15 12:35:08.064846 kubelet[2837]: E0515 12:35:08.062069 2837 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 12:35:08.069061 kubelet[2837]: W0515 12:35:08.069015 2837 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://37.27.185.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.069106 kubelet[2837]: E0515 12:35:08.069082 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://37.27.185.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.086277 kubelet[2837]: I0515 12:35:08.086249 2837 cpu_manager.go:214] "Starting CPU manager" policy="none" May 15 12:35:08.086277 kubelet[2837]: I0515 12:35:08.086263 2837 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 15 12:35:08.086484 kubelet[2837]: I0515 12:35:08.086378 2837 state_mem.go:36] "Initialized new in-memory state store" May 15 12:35:08.088461 kubelet[2837]: I0515 12:35:08.088432 2837 policy_none.go:49] "None policy: Start" May 15 12:35:08.089275 kubelet[2837]: I0515 12:35:08.089250 2837 memory_manager.go:170] "Starting memorymanager" policy="None" May 15 12:35:08.089275 kubelet[2837]: I0515 12:35:08.089276 2837 state_mem.go:35] "Initializing new in-memory state store" May 15 12:35:08.100511 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 15 12:35:08.109615 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 15 12:35:08.112538 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 15 12:35:08.121116 kubelet[2837]: I0515 12:35:08.121084 2837 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 12:35:08.121549 kubelet[2837]: I0515 12:35:08.121258 2837 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 12:35:08.121549 kubelet[2837]: I0515 12:35:08.121366 2837 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 12:35:08.123586 kubelet[2837]: E0515 12:35:08.123556 2837 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4334-0-0-a-dce95649a9\" not found" May 15 12:35:08.154255 kubelet[2837]: I0515 12:35:08.154210 2837 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:08.154679 kubelet[2837]: E0515 12:35:08.154649 2837 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://37.27.185.109:6443/api/v1/nodes\": dial tcp 37.27.185.109:6443: connect: connection refused" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:08.163035 kubelet[2837]: I0515 12:35:08.162980 2837 topology_manager.go:215] "Topology Admit Handler" podUID="0a1b449f5386d2bcc340ff278fe6764c" podNamespace="kube-system" podName="kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.164387 kubelet[2837]: I0515 12:35:08.164351 2837 topology_manager.go:215] "Topology Admit Handler" podUID="6776df8d3e08d4d04cd3eac2728945b2" podNamespace="kube-system" podName="kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.165357 kubelet[2837]: I0515 12:35:08.165341 2837 topology_manager.go:215] "Topology Admit Handler" 
podUID="7383362cc9a988636a98fdee5b48e515" podNamespace="kube-system" podName="kube-scheduler-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.171273 systemd[1]: Created slice kubepods-burstable-pod0a1b449f5386d2bcc340ff278fe6764c.slice - libcontainer container kubepods-burstable-pod0a1b449f5386d2bcc340ff278fe6764c.slice. May 15 12:35:08.196210 systemd[1]: Created slice kubepods-burstable-pod7383362cc9a988636a98fdee5b48e515.slice - libcontainer container kubepods-burstable-pod7383362cc9a988636a98fdee5b48e515.slice. May 15 12:35:08.200896 systemd[1]: Created slice kubepods-burstable-pod6776df8d3e08d4d04cd3eac2728945b2.slice - libcontainer container kubepods-burstable-pod6776df8d3e08d4d04cd3eac2728945b2.slice. May 15 12:35:08.254323 kubelet[2837]: E0515 12:35:08.253657 2837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://37.27.185.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334-0-0-a-dce95649a9?timeout=10s\": dial tcp 37.27.185.109:6443: connect: connection refused" interval="400ms" May 15 12:35:08.354404 kubelet[2837]: I0515 12:35:08.354282 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354404 kubelet[2837]: I0515 12:35:08.354346 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0a1b449f5386d2bcc340ff278fe6764c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" (UID: \"0a1b449f5386d2bcc340ff278fe6764c\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354404 kubelet[2837]: 
I0515 12:35:08.354367 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-flexvolume-dir\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354404 kubelet[2837]: I0515 12:35:08.354394 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-kubeconfig\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354404 kubelet[2837]: I0515 12:35:08.354407 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-k8s-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354872 kubelet[2837]: I0515 12:35:08.354421 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7383362cc9a988636a98fdee5b48e515-kubeconfig\") pod \"kube-scheduler-ci-4334-0-0-a-dce95649a9\" (UID: \"7383362cc9a988636a98fdee5b48e515\") " pod="kube-system/kube-scheduler-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354872 kubelet[2837]: I0515 12:35:08.354433 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0a1b449f5386d2bcc340ff278fe6764c-ca-certs\") pod \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" (UID: 
\"0a1b449f5386d2bcc340ff278fe6764c\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354872 kubelet[2837]: I0515 12:35:08.354445 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0a1b449f5386d2bcc340ff278fe6764c-k8s-certs\") pod \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" (UID: \"0a1b449f5386d2bcc340ff278fe6764c\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.354872 kubelet[2837]: I0515 12:35:08.354457 2837 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-ca-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:08.357311 kubelet[2837]: I0515 12:35:08.357276 2837 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:08.357643 kubelet[2837]: E0515 12:35:08.357604 2837 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://37.27.185.109:6443/api/v1/nodes\": dial tcp 37.27.185.109:6443: connect: connection refused" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:08.495302 containerd[1571]: time="2025-05-15T12:35:08.495238374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334-0-0-a-dce95649a9,Uid:0a1b449f5386d2bcc340ff278fe6764c,Namespace:kube-system,Attempt:0,}" May 15 12:35:08.501269 containerd[1571]: time="2025-05-15T12:35:08.501160132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334-0-0-a-dce95649a9,Uid:7383362cc9a988636a98fdee5b48e515,Namespace:kube-system,Attempt:0,}" May 15 12:35:08.504054 containerd[1571]: time="2025-05-15T12:35:08.503955454Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4334-0-0-a-dce95649a9,Uid:6776df8d3e08d4d04cd3eac2728945b2,Namespace:kube-system,Attempt:0,}" May 15 12:35:08.654854 kubelet[2837]: E0515 12:35:08.654763 2837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://37.27.185.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334-0-0-a-dce95649a9?timeout=10s\": dial tcp 37.27.185.109:6443: connect: connection refused" interval="800ms" May 15 12:35:08.759523 kubelet[2837]: I0515 12:35:08.759403 2837 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:08.760266 kubelet[2837]: E0515 12:35:08.760203 2837 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://37.27.185.109:6443/api/v1/nodes\": dial tcp 37.27.185.109:6443: connect: connection refused" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:08.927933 kubelet[2837]: W0515 12:35:08.927732 2837 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://37.27.185.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.927933 kubelet[2837]: E0515 12:35:08.927837 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://37.27.185.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:08.996513 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4188223150.mount: Deactivated successfully. 
May 15 12:35:09.000381 containerd[1571]: time="2025-05-15T12:35:09.000284164Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:35:09.001425 containerd[1571]: time="2025-05-15T12:35:09.001385083Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" May 15 12:35:09.003248 containerd[1571]: time="2025-05-15T12:35:09.003205123Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:35:09.004877 containerd[1571]: time="2025-05-15T12:35:09.004835137Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:35:09.006497 containerd[1571]: time="2025-05-15T12:35:09.006433870Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 15 12:35:09.008289 containerd[1571]: time="2025-05-15T12:35:09.008207924Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:35:09.008917 containerd[1571]: time="2025-05-15T12:35:09.008870229Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 506.821266ms" May 15 12:35:09.009670 containerd[1571]: 
time="2025-05-15T12:35:09.009542613Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 15 12:35:09.009888 containerd[1571]: time="2025-05-15T12:35:09.009860149Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 12:35:09.013095 containerd[1571]: time="2025-05-15T12:35:09.013035114Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 499.781427ms" May 15 12:35:09.014932 containerd[1571]: time="2025-05-15T12:35:09.014848793Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 508.285859ms" May 15 12:35:09.088521 containerd[1571]: time="2025-05-15T12:35:09.088458984Z" level=info msg="connecting to shim 93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1" address="unix:///run/containerd/s/94c6bed3e700ed41d1b9d21c64b43a89fff4839b4637ebe9bd28f031af5d014c" namespace=k8s.io protocol=ttrpc version=3 May 15 12:35:09.094637 containerd[1571]: time="2025-05-15T12:35:09.094427199Z" level=info msg="connecting to shim 917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b" address="unix:///run/containerd/s/88b5e5f248e809be7b24b73355e47d65f0f5950bd0ff510890737c95f6125085" namespace=k8s.io protocol=ttrpc version=3 May 15 12:35:09.098363 containerd[1571]: time="2025-05-15T12:35:09.097439349Z" level=info msg="connecting to shim 
5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529" address="unix:///run/containerd/s/5aafde2757077152b68bbf2473e9fb357467636ced6096d493a1a18506ecb59e" namespace=k8s.io protocol=ttrpc version=3 May 15 12:35:09.166479 systemd[1]: Started cri-containerd-5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529.scope - libcontainer container 5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529. May 15 12:35:09.167504 systemd[1]: Started cri-containerd-917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b.scope - libcontainer container 917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b. May 15 12:35:09.168386 systemd[1]: Started cri-containerd-93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1.scope - libcontainer container 93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1. May 15 12:35:09.226158 containerd[1571]: time="2025-05-15T12:35:09.226059670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334-0-0-a-dce95649a9,Uid:6776df8d3e08d4d04cd3eac2728945b2,Namespace:kube-system,Attempt:0,} returns sandbox id \"93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1\"" May 15 12:35:09.235238 containerd[1571]: time="2025-05-15T12:35:09.234895002Z" level=info msg="CreateContainer within sandbox \"93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 15 12:35:09.235238 containerd[1571]: time="2025-05-15T12:35:09.235172463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334-0-0-a-dce95649a9,Uid:7383362cc9a988636a98fdee5b48e515,Namespace:kube-system,Attempt:0,} returns sandbox id \"917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b\"" May 15 12:35:09.239881 containerd[1571]: time="2025-05-15T12:35:09.239833973Z" level=info msg="CreateContainer within sandbox 
\"917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 15 12:35:09.253916 containerd[1571]: time="2025-05-15T12:35:09.253879466Z" level=info msg="Container e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89: CDI devices from CRI Config.CDIDevices: []" May 15 12:35:09.254430 containerd[1571]: time="2025-05-15T12:35:09.254056979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334-0-0-a-dce95649a9,Uid:0a1b449f5386d2bcc340ff278fe6764c,Namespace:kube-system,Attempt:0,} returns sandbox id \"5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529\"" May 15 12:35:09.255162 containerd[1571]: time="2025-05-15T12:35:09.255112773Z" level=info msg="Container 42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91: CDI devices from CRI Config.CDIDevices: []" May 15 12:35:09.258807 containerd[1571]: time="2025-05-15T12:35:09.258699222Z" level=info msg="CreateContainer within sandbox \"5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 15 12:35:09.260619 containerd[1571]: time="2025-05-15T12:35:09.260599253Z" level=info msg="CreateContainer within sandbox \"93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89\"" May 15 12:35:09.262810 containerd[1571]: time="2025-05-15T12:35:09.261948649Z" level=info msg="StartContainer for \"e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89\"" May 15 12:35:09.262923 containerd[1571]: time="2025-05-15T12:35:09.262905687Z" level=info msg="connecting to shim e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89" address="unix:///run/containerd/s/94c6bed3e700ed41d1b9d21c64b43a89fff4839b4637ebe9bd28f031af5d014c" protocol=ttrpc 
version=3 May 15 12:35:09.264099 containerd[1571]: time="2025-05-15T12:35:09.264065887Z" level=info msg="CreateContainer within sandbox \"917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91\"" May 15 12:35:09.264450 containerd[1571]: time="2025-05-15T12:35:09.264415784Z" level=info msg="StartContainer for \"42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91\"" May 15 12:35:09.267271 containerd[1571]: time="2025-05-15T12:35:09.267243679Z" level=info msg="connecting to shim 42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91" address="unix:///run/containerd/s/88b5e5f248e809be7b24b73355e47d65f0f5950bd0ff510890737c95f6125085" protocol=ttrpc version=3 May 15 12:35:09.274917 containerd[1571]: time="2025-05-15T12:35:09.274874467Z" level=info msg="Container cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f: CDI devices from CRI Config.CDIDevices: []" May 15 12:35:09.280406 containerd[1571]: time="2025-05-15T12:35:09.280379241Z" level=info msg="CreateContainer within sandbox \"5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f\"" May 15 12:35:09.280874 containerd[1571]: time="2025-05-15T12:35:09.280813256Z" level=info msg="StartContainer for \"cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f\"" May 15 12:35:09.281968 containerd[1571]: time="2025-05-15T12:35:09.281872237Z" level=info msg="connecting to shim cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f" address="unix:///run/containerd/s/5aafde2757077152b68bbf2473e9fb357467636ced6096d493a1a18506ecb59e" protocol=ttrpc version=3 May 15 12:35:09.296531 systemd[1]: Started 
cri-containerd-42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91.scope - libcontainer container 42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91. May 15 12:35:09.301367 systemd[1]: Started cri-containerd-cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f.scope - libcontainer container cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f. May 15 12:35:09.302706 systemd[1]: Started cri-containerd-e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89.scope - libcontainer container e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89. May 15 12:35:09.347258 kubelet[2837]: W0515 12:35:09.347202 2837 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://37.27.185.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:09.348362 kubelet[2837]: E0515 12:35:09.347264 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://37.27.185.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:09.357110 containerd[1571]: time="2025-05-15T12:35:09.356963210Z" level=info msg="StartContainer for \"cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f\" returns successfully" May 15 12:35:09.387687 containerd[1571]: time="2025-05-15T12:35:09.387485554Z" level=info msg="StartContainer for \"42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91\" returns successfully" May 15 12:35:09.389787 containerd[1571]: time="2025-05-15T12:35:09.389743017Z" level=info msg="StartContainer for \"e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89\" returns successfully" May 15 12:35:09.440725 kubelet[2837]: W0515 12:35:09.440662 2837 reflector.go:547] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://37.27.185.109:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:09.440725 kubelet[2837]: E0515 12:35:09.440724 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://37.27.185.109:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:09.455903 kubelet[2837]: E0515 12:35:09.455862 2837 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://37.27.185.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334-0-0-a-dce95649a9?timeout=10s\": dial tcp 37.27.185.109:6443: connect: connection refused" interval="1.6s" May 15 12:35:09.545080 kubelet[2837]: W0515 12:35:09.544944 2837 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://37.27.185.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334-0-0-a-dce95649a9&limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:09.545080 kubelet[2837]: E0515 12:35:09.545015 2837 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://37.27.185.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334-0-0-a-dce95649a9&limit=500&resourceVersion=0": dial tcp 37.27.185.109:6443: connect: connection refused May 15 12:35:09.563071 kubelet[2837]: I0515 12:35:09.562843 2837 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:09.564343 kubelet[2837]: E0515 12:35:09.563279 2837 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://37.27.185.109:6443/api/v1/nodes\": dial tcp 37.27.185.109:6443: connect: connection refused" 
node="ci-4334-0-0-a-dce95649a9" May 15 12:35:10.991908 kubelet[2837]: E0515 12:35:10.991855 2837 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "ci-4334-0-0-a-dce95649a9" not found May 15 12:35:11.060312 kubelet[2837]: E0515 12:35:11.060274 2837 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4334-0-0-a-dce95649a9\" not found" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:11.167156 kubelet[2837]: I0515 12:35:11.166793 2837 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:11.183244 kubelet[2837]: I0515 12:35:11.183191 2837 kubelet_node_status.go:76] "Successfully registered node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:11.194079 kubelet[2837]: E0515 12:35:11.194009 2837 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"ci-4334-0-0-a-dce95649a9\" not found" May 15 12:35:12.038275 kubelet[2837]: I0515 12:35:12.038218 2837 apiserver.go:52] "Watching apiserver" May 15 12:35:12.051975 kubelet[2837]: I0515 12:35:12.051932 2837 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 15 12:35:12.636727 systemd[1]: Reload requested from client PID 3112 ('systemctl') (unit session-7.scope)... May 15 12:35:12.636745 systemd[1]: Reloading... May 15 12:35:12.719360 zram_generator::config[3152]: No configuration found. May 15 12:35:12.793272 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:35:12.894855 systemd[1]: Reloading finished in 257 ms. May 15 12:35:12.915165 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... 
May 15 12:35:12.915852 kubelet[2837]: E0515 12:35:12.915604 2837 event.go:319] "Unable to write event (broadcaster is shut down)" event="&Event{ObjectMeta:{ci-4334-0-0-a-dce95649a9.183fb377e93ab0f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334-0-0-a-dce95649a9,UID:ci-4334-0-0-a-dce95649a9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334-0-0-a-dce95649a9,},FirstTimestamp:2025-05-15 12:35:08.038459633 +0000 UTC m=+0.331102463,LastTimestamp:2025-05-15 12:35:08.038459633 +0000 UTC m=+0.331102463,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334-0-0-a-dce95649a9,}" May 15 12:35:12.916123 kubelet[2837]: I0515 12:35:12.916089 2837 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 12:35:12.932177 systemd[1]: kubelet.service: Deactivated successfully. May 15 12:35:12.932379 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:35:12.932421 systemd[1]: kubelet.service: Consumed 599ms CPU time, 112.8M memory peak. May 15 12:35:12.933779 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:35:13.027292 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:35:13.033671 (kubelet)[3207]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 12:35:13.083280 kubelet[3207]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 15 12:35:13.083280 kubelet[3207]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 15 12:35:13.083280 kubelet[3207]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 12:35:13.083280 kubelet[3207]: I0515 12:35:13.082482 3207 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 12:35:13.087932 kubelet[3207]: I0515 12:35:13.087909 3207 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 15 12:35:13.087932 kubelet[3207]: I0515 12:35:13.087927 3207 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 12:35:13.088147 kubelet[3207]: I0515 12:35:13.088125 3207 server.go:927] "Client rotation is on, will bootstrap in background" May 15 12:35:13.090114 kubelet[3207]: I0515 12:35:13.090094 3207 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 15 12:35:13.093258 kubelet[3207]: I0515 12:35:13.092180 3207 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 12:35:13.098070 kubelet[3207]: I0515 12:35:13.098050 3207 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 15 12:35:13.098227 kubelet[3207]: I0515 12:35:13.098198 3207 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 12:35:13.098392 kubelet[3207]: I0515 12:35:13.098222 3207 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334-0-0-a-dce95649a9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 15 12:35:13.098392 kubelet[3207]: I0515 12:35:13.098391 3207 topology_manager.go:138] "Creating topology manager with none policy" May 
15 12:35:13.098602 kubelet[3207]: I0515 12:35:13.098401 3207 container_manager_linux.go:301] "Creating device plugin manager" May 15 12:35:13.098602 kubelet[3207]: I0515 12:35:13.098430 3207 state_mem.go:36] "Initialized new in-memory state store" May 15 12:35:13.098602 kubelet[3207]: I0515 12:35:13.098509 3207 kubelet.go:400] "Attempting to sync node with API server" May 15 12:35:13.098602 kubelet[3207]: I0515 12:35:13.098521 3207 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 12:35:13.098602 kubelet[3207]: I0515 12:35:13.098537 3207 kubelet.go:312] "Adding apiserver pod source" May 15 12:35:13.099861 kubelet[3207]: I0515 12:35:13.099383 3207 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 12:35:13.100055 kubelet[3207]: I0515 12:35:13.100044 3207 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 12:35:13.101478 kubelet[3207]: I0515 12:35:13.101467 3207 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 12:35:13.101843 kubelet[3207]: I0515 12:35:13.101831 3207 server.go:1264] "Started kubelet" May 15 12:35:13.104897 kubelet[3207]: I0515 12:35:13.104871 3207 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 12:35:13.114381 kubelet[3207]: I0515 12:35:13.113841 3207 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 15 12:35:13.114637 kubelet[3207]: I0515 12:35:13.114618 3207 server.go:455] "Adding debug handlers to kubelet server" May 15 12:35:13.115420 kubelet[3207]: I0515 12:35:13.115408 3207 volume_manager.go:291] "Starting Kubelet Volume Manager" May 15 12:35:13.116854 kubelet[3207]: I0515 12:35:13.116809 3207 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 12:35:13.117001 kubelet[3207]: I0515 12:35:13.116981 3207 server.go:227] "Starting to serve the podresources API" 
endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 12:35:13.119215 kubelet[3207]: I0515 12:35:13.119202 3207 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 12:35:13.119519 kubelet[3207]: I0515 12:35:13.119509 3207 reconciler.go:26] "Reconciler: start to sync state" May 15 12:35:13.120860 kubelet[3207]: I0515 12:35:13.120811 3207 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 12:35:13.121861 kubelet[3207]: I0515 12:35:13.121849 3207 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 15 12:35:13.121937 kubelet[3207]: I0515 12:35:13.121928 3207 status_manager.go:217] "Starting to sync pod status with apiserver" May 15 12:35:13.121990 kubelet[3207]: I0515 12:35:13.121984 3207 kubelet.go:2337] "Starting kubelet main sync loop" May 15 12:35:13.122078 kubelet[3207]: E0515 12:35:13.122061 3207 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 12:35:13.125981 kubelet[3207]: I0515 12:35:13.125521 3207 factory.go:221] Registration of the systemd container factory successfully May 15 12:35:13.125981 kubelet[3207]: I0515 12:35:13.125603 3207 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 12:35:13.127382 kubelet[3207]: I0515 12:35:13.127270 3207 factory.go:221] Registration of the containerd container factory successfully May 15 12:35:13.152493 kubelet[3207]: E0515 12:35:13.152382 3207 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 12:35:13.180152 kubelet[3207]: I0515 12:35:13.180116 3207 cpu_manager.go:214] "Starting CPU manager" policy="none" May 15 12:35:13.180152 kubelet[3207]: I0515 12:35:13.180130 3207 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 15 12:35:13.180152 kubelet[3207]: I0515 12:35:13.180144 3207 state_mem.go:36] "Initialized new in-memory state store" May 15 12:35:13.181422 kubelet[3207]: I0515 12:35:13.180251 3207 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 15 12:35:13.181422 kubelet[3207]: I0515 12:35:13.180258 3207 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 15 12:35:13.181422 kubelet[3207]: I0515 12:35:13.180273 3207 policy_none.go:49] "None policy: Start" May 15 12:35:13.181849 kubelet[3207]: I0515 12:35:13.181809 3207 memory_manager.go:170] "Starting memorymanager" policy="None" May 15 12:35:13.181849 kubelet[3207]: I0515 12:35:13.181824 3207 state_mem.go:35] "Initializing new in-memory state store" May 15 12:35:13.182087 kubelet[3207]: I0515 12:35:13.182029 3207 state_mem.go:75] "Updated machine memory state" May 15 12:35:13.185863 kubelet[3207]: I0515 12:35:13.185853 3207 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 12:35:13.186133 kubelet[3207]: I0515 12:35:13.186110 3207 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 12:35:13.186624 kubelet[3207]: I0515 12:35:13.186614 3207 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 12:35:13.220508 kubelet[3207]: I0515 12:35:13.220477 3207 kubelet_node_status.go:73] "Attempting to register node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:13.222894 kubelet[3207]: I0515 12:35:13.222841 3207 topology_manager.go:215] "Topology Admit Handler" podUID="0a1b449f5386d2bcc340ff278fe6764c" podNamespace="kube-system" 
podName="kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.222974 kubelet[3207]: I0515 12:35:13.222943 3207 topology_manager.go:215] "Topology Admit Handler" podUID="6776df8d3e08d4d04cd3eac2728945b2" podNamespace="kube-system" podName="kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.223386 kubelet[3207]: I0515 12:35:13.223353 3207 topology_manager.go:215] "Topology Admit Handler" podUID="7383362cc9a988636a98fdee5b48e515" podNamespace="kube-system" podName="kube-scheduler-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.232184 kubelet[3207]: E0515 12:35:13.232145 3207 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" already exists" pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.234529 kubelet[3207]: I0515 12:35:13.234484 3207 kubelet_node_status.go:112] "Node was previously registered" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:13.234616 kubelet[3207]: I0515 12:35:13.234590 3207 kubelet_node_status.go:76] "Successfully registered node" node="ci-4334-0-0-a-dce95649a9" May 15 12:35:13.421765 kubelet[3207]: I0515 12:35:13.421424 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0a1b449f5386d2bcc340ff278fe6764c-k8s-certs\") pod \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" (UID: \"0a1b449f5386d2bcc340ff278fe6764c\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.421765 kubelet[3207]: I0515 12:35:13.421480 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-ca-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.421765 kubelet[3207]: I0515 12:35:13.421513 3207 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-k8s-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.421765 kubelet[3207]: I0515 12:35:13.421539 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.421765 kubelet[3207]: I0515 12:35:13.421569 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0a1b449f5386d2bcc340ff278fe6764c-ca-certs\") pod \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" (UID: \"0a1b449f5386d2bcc340ff278fe6764c\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.422146 kubelet[3207]: I0515 12:35:13.421594 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0a1b449f5386d2bcc340ff278fe6764c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" (UID: \"0a1b449f5386d2bcc340ff278fe6764c\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.422146 kubelet[3207]: I0515 12:35:13.421626 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-flexvolume-dir\") pod 
\"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.422146 kubelet[3207]: I0515 12:35:13.421652 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6776df8d3e08d4d04cd3eac2728945b2-kubeconfig\") pod \"kube-controller-manager-ci-4334-0-0-a-dce95649a9\" (UID: \"6776df8d3e08d4d04cd3eac2728945b2\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" May 15 12:35:13.422146 kubelet[3207]: I0515 12:35:13.421677 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7383362cc9a988636a98fdee5b48e515-kubeconfig\") pod \"kube-scheduler-ci-4334-0-0-a-dce95649a9\" (UID: \"7383362cc9a988636a98fdee5b48e515\") " pod="kube-system/kube-scheduler-ci-4334-0-0-a-dce95649a9" May 15 12:35:14.100056 kubelet[3207]: I0515 12:35:14.100018 3207 apiserver.go:52] "Watching apiserver" May 15 12:35:14.120456 kubelet[3207]: I0515 12:35:14.120422 3207 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 15 12:35:14.173903 kubelet[3207]: E0515 12:35:14.173859 3207 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4334-0-0-a-dce95649a9\" already exists" pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" May 15 12:35:14.200318 kubelet[3207]: I0515 12:35:14.200264 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4334-0-0-a-dce95649a9" podStartSLOduration=1.200246809 podStartE2EDuration="1.200246809s" podCreationTimestamp="2025-05-15 12:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:35:14.190817332 +0000 
UTC m=+1.150616156" watchObservedRunningTime="2025-05-15 12:35:14.200246809 +0000 UTC m=+1.160045633" May 15 12:35:14.208034 kubelet[3207]: I0515 12:35:14.207920 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4334-0-0-a-dce95649a9" podStartSLOduration=2.207900509 podStartE2EDuration="2.207900509s" podCreationTimestamp="2025-05-15 12:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:35:14.200592397 +0000 UTC m=+1.160391222" watchObservedRunningTime="2025-05-15 12:35:14.207900509 +0000 UTC m=+1.167699333" May 15 12:35:18.150792 sudo[2201]: pam_unix(sudo:session): session closed for user root May 15 12:35:18.314507 sshd[2200]: Connection closed by 147.75.109.163 port 44206 May 15 12:35:18.315453 sshd-session[2198]: pam_unix(sshd:session): session closed for user core May 15 12:35:18.318858 systemd[1]: sshd@7-37.27.185.109:22-147.75.109.163:44206.service: Deactivated successfully. May 15 12:35:18.320500 systemd[1]: session-7.scope: Deactivated successfully. May 15 12:35:18.320702 systemd[1]: session-7.scope: Consumed 4.026s CPU time, 187.2M memory peak. May 15 12:35:18.322094 systemd-logind[1554]: Session 7 logged out. Waiting for processes to exit. May 15 12:35:18.323515 systemd-logind[1554]: Removed session 7. 
May 15 12:35:22.090299 kubelet[3207]: I0515 12:35:22.090203 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4334-0-0-a-dce95649a9" podStartSLOduration=9.090186946 podStartE2EDuration="9.090186946s" podCreationTimestamp="2025-05-15 12:35:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:35:14.208110484 +0000 UTC m=+1.167909308" watchObservedRunningTime="2025-05-15 12:35:22.090186946 +0000 UTC m=+9.049985770" May 15 12:35:27.895927 kubelet[3207]: I0515 12:35:27.895897 3207 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 15 12:35:27.896722 containerd[1571]: time="2025-05-15T12:35:27.896558786Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 15 12:35:27.897270 kubelet[3207]: I0515 12:35:27.897248 3207 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 15 12:35:28.574751 kubelet[3207]: I0515 12:35:28.574670 3207 topology_manager.go:215] "Topology Admit Handler" podUID="3a472aa5-af0c-4a72-a135-699739a0cf6e" podNamespace="kube-system" podName="kube-proxy-dspx2" May 15 12:35:28.588503 systemd[1]: Created slice kubepods-besteffort-pod3a472aa5_af0c_4a72_a135_699739a0cf6e.slice - libcontainer container kubepods-besteffort-pod3a472aa5_af0c_4a72_a135_699739a0cf6e.slice. 
May 15 12:35:28.616286 kubelet[3207]: I0515 12:35:28.616238 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a472aa5-af0c-4a72-a135-699739a0cf6e-lib-modules\") pod \"kube-proxy-dspx2\" (UID: \"3a472aa5-af0c-4a72-a135-699739a0cf6e\") " pod="kube-system/kube-proxy-dspx2" May 15 12:35:28.616286 kubelet[3207]: I0515 12:35:28.616290 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tppm7\" (UniqueName: \"kubernetes.io/projected/3a472aa5-af0c-4a72-a135-699739a0cf6e-kube-api-access-tppm7\") pod \"kube-proxy-dspx2\" (UID: \"3a472aa5-af0c-4a72-a135-699739a0cf6e\") " pod="kube-system/kube-proxy-dspx2" May 15 12:35:28.616504 kubelet[3207]: I0515 12:35:28.616318 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3a472aa5-af0c-4a72-a135-699739a0cf6e-xtables-lock\") pod \"kube-proxy-dspx2\" (UID: \"3a472aa5-af0c-4a72-a135-699739a0cf6e\") " pod="kube-system/kube-proxy-dspx2" May 15 12:35:28.616504 kubelet[3207]: I0515 12:35:28.616386 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3a472aa5-af0c-4a72-a135-699739a0cf6e-kube-proxy\") pod \"kube-proxy-dspx2\" (UID: \"3a472aa5-af0c-4a72-a135-699739a0cf6e\") " pod="kube-system/kube-proxy-dspx2" May 15 12:35:28.897987 containerd[1571]: time="2025-05-15T12:35:28.897897576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dspx2,Uid:3a472aa5-af0c-4a72-a135-699739a0cf6e,Namespace:kube-system,Attempt:0,}" May 15 12:35:28.920902 containerd[1571]: time="2025-05-15T12:35:28.920824574Z" level=info msg="connecting to shim 0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134" 
address="unix:///run/containerd/s/377607c94bf9a039af74391dae6ad6b98a31b8ed45897d9a30d9b99ebfce4185" namespace=k8s.io protocol=ttrpc version=3 May 15 12:35:28.960604 systemd[1]: Started cri-containerd-0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134.scope - libcontainer container 0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134. May 15 12:35:28.992733 containerd[1571]: time="2025-05-15T12:35:28.992616546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dspx2,Uid:3a472aa5-af0c-4a72-a135-699739a0cf6e,Namespace:kube-system,Attempt:0,} returns sandbox id \"0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134\"" May 15 12:35:28.995228 kubelet[3207]: I0515 12:35:28.995197 3207 topology_manager.go:215] "Topology Admit Handler" podUID="582e964a-88e0-4f60-96d1-99730ced53cd" podNamespace="tigera-operator" podName="tigera-operator-797db67f8-pv7m8" May 15 12:35:28.998547 kubelet[3207]: W0515 12:35:28.998523 3207 reflector.go:547] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4334-0-0-a-dce95649a9" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4334-0-0-a-dce95649a9' and this object May 15 12:35:28.998629 kubelet[3207]: E0515 12:35:28.998552 3207 reflector.go:150] object-"tigera-operator"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4334-0-0-a-dce95649a9" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4334-0-0-a-dce95649a9' and this object May 15 12:35:28.998629 kubelet[3207]: W0515 12:35:28.998582 3207 reflector.go:547] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User 
"system:node:ci-4334-0-0-a-dce95649a9" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4334-0-0-a-dce95649a9' and this object May 15 12:35:28.998629 kubelet[3207]: E0515 12:35:28.998589 3207 reflector.go:150] object-"tigera-operator"/"kubernetes-services-endpoint": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4334-0-0-a-dce95649a9" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4334-0-0-a-dce95649a9' and this object May 15 12:35:29.002815 containerd[1571]: time="2025-05-15T12:35:29.002787783Z" level=info msg="CreateContainer within sandbox \"0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 15 12:35:29.005317 systemd[1]: Created slice kubepods-besteffort-pod582e964a_88e0_4f60_96d1_99730ced53cd.slice - libcontainer container kubepods-besteffort-pod582e964a_88e0_4f60_96d1_99730ced53cd.slice. 
May 15 12:35:29.016832 containerd[1571]: time="2025-05-15T12:35:29.016801367Z" level=info msg="Container dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009: CDI devices from CRI Config.CDIDevices: []" May 15 12:35:29.020799 kubelet[3207]: I0515 12:35:29.020752 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/582e964a-88e0-4f60-96d1-99730ced53cd-var-lib-calico\") pod \"tigera-operator-797db67f8-pv7m8\" (UID: \"582e964a-88e0-4f60-96d1-99730ced53cd\") " pod="tigera-operator/tigera-operator-797db67f8-pv7m8" May 15 12:35:29.020957 kubelet[3207]: I0515 12:35:29.020926 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdkk\" (UniqueName: \"kubernetes.io/projected/582e964a-88e0-4f60-96d1-99730ced53cd-kube-api-access-2jdkk\") pod \"tigera-operator-797db67f8-pv7m8\" (UID: \"582e964a-88e0-4f60-96d1-99730ced53cd\") " pod="tigera-operator/tigera-operator-797db67f8-pv7m8" May 15 12:35:29.023227 containerd[1571]: time="2025-05-15T12:35:29.023143139Z" level=info msg="CreateContainer within sandbox \"0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009\"" May 15 12:35:29.023850 containerd[1571]: time="2025-05-15T12:35:29.023812877Z" level=info msg="StartContainer for \"dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009\"" May 15 12:35:29.025578 containerd[1571]: time="2025-05-15T12:35:29.025553137Z" level=info msg="connecting to shim dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009" address="unix:///run/containerd/s/377607c94bf9a039af74391dae6ad6b98a31b8ed45897d9a30d9b99ebfce4185" protocol=ttrpc version=3 May 15 12:35:29.040457 systemd[1]: Started 
cri-containerd-dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009.scope - libcontainer container dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009. May 15 12:35:29.074865 containerd[1571]: time="2025-05-15T12:35:29.074772304Z" level=info msg="StartContainer for \"dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009\" returns successfully" May 15 12:35:29.205052 kubelet[3207]: I0515 12:35:29.204563 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-dspx2" podStartSLOduration=1.204544738 podStartE2EDuration="1.204544738s" podCreationTimestamp="2025-05-15 12:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:35:29.20437013 +0000 UTC m=+16.164168964" watchObservedRunningTime="2025-05-15 12:35:29.204544738 +0000 UTC m=+16.164343572" May 15 12:35:30.130195 kubelet[3207]: E0515 12:35:30.130108 3207 projected.go:294] Couldn't get configMap tigera-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition May 15 12:35:30.130195 kubelet[3207]: E0515 12:35:30.130166 3207 projected.go:200] Error preparing data for projected volume kube-api-access-2jdkk for pod tigera-operator/tigera-operator-797db67f8-pv7m8: failed to sync configmap cache: timed out waiting for the condition May 15 12:35:30.130953 kubelet[3207]: E0515 12:35:30.130261 3207 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/582e964a-88e0-4f60-96d1-99730ced53cd-kube-api-access-2jdkk podName:582e964a-88e0-4f60-96d1-99730ced53cd nodeName:}" failed. No retries permitted until 2025-05-15 12:35:30.630230431 +0000 UTC m=+17.590029296 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-2jdkk" (UniqueName: "kubernetes.io/projected/582e964a-88e0-4f60-96d1-99730ced53cd-kube-api-access-2jdkk") pod "tigera-operator-797db67f8-pv7m8" (UID: "582e964a-88e0-4f60-96d1-99730ced53cd") : failed to sync configmap cache: timed out waiting for the condition May 15 12:35:30.810462 containerd[1571]: time="2025-05-15T12:35:30.810397089Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-pv7m8,Uid:582e964a-88e0-4f60-96d1-99730ced53cd,Namespace:tigera-operator,Attempt:0,}" May 15 12:35:30.837550 containerd[1571]: time="2025-05-15T12:35:30.837448544Z" level=info msg="connecting to shim e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070" address="unix:///run/containerd/s/bc47da3063820745c9afbeb2e834ec1e43cd199b236a7022c9e2cc938278fecb" namespace=k8s.io protocol=ttrpc version=3 May 15 12:35:30.871450 systemd[1]: Started cri-containerd-e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070.scope - libcontainer container e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070. May 15 12:35:30.911573 containerd[1571]: time="2025-05-15T12:35:30.911532287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-pv7m8,Uid:582e964a-88e0-4f60-96d1-99730ced53cd,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070\"" May 15 12:35:30.913243 containerd[1571]: time="2025-05-15T12:35:30.913225317Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 15 12:35:32.856108 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount373810231.mount: Deactivated successfully. 
May 15 12:35:33.254304 containerd[1571]: time="2025-05-15T12:35:33.254264589Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:33.255297 containerd[1571]: time="2025-05-15T12:35:33.255115217Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 15 12:35:33.255945 containerd[1571]: time="2025-05-15T12:35:33.255918316Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:33.257769 containerd[1571]: time="2025-05-15T12:35:33.257741109Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:33.258318 containerd[1571]: time="2025-05-15T12:35:33.258289649Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.344931213s" May 15 12:35:33.258431 containerd[1571]: time="2025-05-15T12:35:33.258416678Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 15 12:35:33.260570 containerd[1571]: time="2025-05-15T12:35:33.260539325Z" level=info msg="CreateContainer within sandbox \"e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 15 12:35:33.268598 containerd[1571]: time="2025-05-15T12:35:33.268096541Z" level=info msg="Container 
216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7: CDI devices from CRI Config.CDIDevices: []" May 15 12:35:33.287038 containerd[1571]: time="2025-05-15T12:35:33.286995891Z" level=info msg="CreateContainer within sandbox \"e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\"" May 15 12:35:33.287866 containerd[1571]: time="2025-05-15T12:35:33.287811783Z" level=info msg="StartContainer for \"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\"" May 15 12:35:33.288856 containerd[1571]: time="2025-05-15T12:35:33.288676608Z" level=info msg="connecting to shim 216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7" address="unix:///run/containerd/s/bc47da3063820745c9afbeb2e834ec1e43cd199b236a7022c9e2cc938278fecb" protocol=ttrpc version=3 May 15 12:35:33.315464 systemd[1]: Started cri-containerd-216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7.scope - libcontainer container 216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7. 
May 15 12:35:33.342116 containerd[1571]: time="2025-05-15T12:35:33.342008115Z" level=info msg="StartContainer for \"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\" returns successfully" May 15 12:35:36.357068 kubelet[3207]: I0515 12:35:36.357016 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-797db67f8-pv7m8" podStartSLOduration=6.010642299 podStartE2EDuration="8.357001153s" podCreationTimestamp="2025-05-15 12:35:28 +0000 UTC" firstStartedPulling="2025-05-15 12:35:30.912806019 +0000 UTC m=+17.872604844" lastFinishedPulling="2025-05-15 12:35:33.259164873 +0000 UTC m=+20.218963698" observedRunningTime="2025-05-15 12:35:34.218003984 +0000 UTC m=+21.177802808" watchObservedRunningTime="2025-05-15 12:35:36.357001153 +0000 UTC m=+23.316799968" May 15 12:35:36.357461 kubelet[3207]: I0515 12:35:36.357197 3207 topology_manager.go:215] "Topology Admit Handler" podUID="6c8535cc-4665-4d1e-a5d8-657a84679149" podNamespace="calico-system" podName="calico-typha-75c8575ff-tc4r7" May 15 12:35:36.364204 systemd[1]: Created slice kubepods-besteffort-pod6c8535cc_4665_4d1e_a5d8_657a84679149.slice - libcontainer container kubepods-besteffort-pod6c8535cc_4665_4d1e_a5d8_657a84679149.slice. 
May 15 12:35:36.368847 kubelet[3207]: I0515 12:35:36.368824 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6c8535cc-4665-4d1e-a5d8-657a84679149-typha-certs\") pod \"calico-typha-75c8575ff-tc4r7\" (UID: \"6c8535cc-4665-4d1e-a5d8-657a84679149\") " pod="calico-system/calico-typha-75c8575ff-tc4r7" May 15 12:35:36.368847 kubelet[3207]: I0515 12:35:36.368853 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8535cc-4665-4d1e-a5d8-657a84679149-tigera-ca-bundle\") pod \"calico-typha-75c8575ff-tc4r7\" (UID: \"6c8535cc-4665-4d1e-a5d8-657a84679149\") " pod="calico-system/calico-typha-75c8575ff-tc4r7" May 15 12:35:36.368942 kubelet[3207]: I0515 12:35:36.368870 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6g7q\" (UniqueName: \"kubernetes.io/projected/6c8535cc-4665-4d1e-a5d8-657a84679149-kube-api-access-k6g7q\") pod \"calico-typha-75c8575ff-tc4r7\" (UID: \"6c8535cc-4665-4d1e-a5d8-657a84679149\") " pod="calico-system/calico-typha-75c8575ff-tc4r7" May 15 12:35:36.487841 kubelet[3207]: I0515 12:35:36.487799 3207 topology_manager.go:215] "Topology Admit Handler" podUID="fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" podNamespace="calico-system" podName="calico-node-prt2t" May 15 12:35:36.507636 systemd[1]: Created slice kubepods-besteffort-podfa5bd1ea_b5f7_4bb5_b6d2_db45304772d1.slice - libcontainer container kubepods-besteffort-podfa5bd1ea_b5f7_4bb5_b6d2_db45304772d1.slice. 
May 15 12:35:36.569938 kubelet[3207]: I0515 12:35:36.569900 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-run-calico\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.569938 kubelet[3207]: I0515 12:35:36.569939 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-bin-dir\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570161 kubelet[3207]: I0515 12:35:36.569954 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-flexvol-driver-host\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570161 kubelet[3207]: I0515 12:35:36.569968 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-lib-calico\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570161 kubelet[3207]: I0515 12:35:36.569982 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-policysync\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570161 kubelet[3207]: I0515 12:35:36.569994 3207 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-lib-modules\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570161 kubelet[3207]: I0515 12:35:36.570005 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-log-dir\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570259 kubelet[3207]: I0515 12:35:36.570020 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-xtables-lock\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570259 kubelet[3207]: I0515 12:35:36.570032 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-net-dir\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570259 kubelet[3207]: I0515 12:35:36.570044 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8whb\" (UniqueName: \"kubernetes.io/projected/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-kube-api-access-s8whb\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570259 kubelet[3207]: I0515 12:35:36.570063 3207 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-node-certs\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.570259 kubelet[3207]: I0515 12:35:36.570075 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-tigera-ca-bundle\") pod \"calico-node-prt2t\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " pod="calico-system/calico-node-prt2t" May 15 12:35:36.601042 kubelet[3207]: I0515 12:35:36.601011 3207 topology_manager.go:215] "Topology Admit Handler" podUID="530e456b-2944-4384-8744-d360e07d8aae" podNamespace="calico-system" podName="csi-node-driver-2gvxv" May 15 12:35:36.601383 kubelet[3207]: E0515 12:35:36.601234 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:36.667201 containerd[1571]: time="2025-05-15T12:35:36.667155486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-75c8575ff-tc4r7,Uid:6c8535cc-4665-4d1e-a5d8-657a84679149,Namespace:calico-system,Attempt:0,}" May 15 12:35:36.673376 kubelet[3207]: I0515 12:35:36.671609 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/530e456b-2944-4384-8744-d360e07d8aae-registration-dir\") pod \"csi-node-driver-2gvxv\" (UID: \"530e456b-2944-4384-8744-d360e07d8aae\") " pod="calico-system/csi-node-driver-2gvxv" May 15 12:35:36.673645 kubelet[3207]: I0515 12:35:36.673523 3207 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/530e456b-2944-4384-8744-d360e07d8aae-kubelet-dir\") pod \"csi-node-driver-2gvxv\" (UID: \"530e456b-2944-4384-8744-d360e07d8aae\") " pod="calico-system/csi-node-driver-2gvxv" May 15 12:35:36.673725 kubelet[3207]: I0515 12:35:36.673713 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6prl\" (UniqueName: \"kubernetes.io/projected/530e456b-2944-4384-8744-d360e07d8aae-kube-api-access-x6prl\") pod \"csi-node-driver-2gvxv\" (UID: \"530e456b-2944-4384-8744-d360e07d8aae\") " pod="calico-system/csi-node-driver-2gvxv" May 15 12:35:36.673907 kubelet[3207]: I0515 12:35:36.673895 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/530e456b-2944-4384-8744-d360e07d8aae-varrun\") pod \"csi-node-driver-2gvxv\" (UID: \"530e456b-2944-4384-8744-d360e07d8aae\") " pod="calico-system/csi-node-driver-2gvxv" May 15 12:35:36.674036 kubelet[3207]: I0515 12:35:36.673974 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/530e456b-2944-4384-8744-d360e07d8aae-socket-dir\") pod \"csi-node-driver-2gvxv\" (UID: \"530e456b-2944-4384-8744-d360e07d8aae\") " pod="calico-system/csi-node-driver-2gvxv" May 15 12:35:36.679766 kubelet[3207]: E0515 12:35:36.679747 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.679766 kubelet[3207]: W0515 12:35:36.679764 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.680786 kubelet[3207]: E0515 
12:35:36.680764 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.699449 kubelet[3207]: E0515 12:35:36.699429 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.699449 kubelet[3207]: W0515 12:35:36.699446 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.699596 kubelet[3207]: E0515 12:35:36.699462 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.699802 containerd[1571]: time="2025-05-15T12:35:36.699745429Z" level=info msg="connecting to shim fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d" address="unix:///run/containerd/s/86e77c290c457dd2cd2b0e52e6c479b36a5e3c3dbf0d56818eb0a4875502ff7a" namespace=k8s.io protocol=ttrpc version=3 May 15 12:35:36.725511 systemd[1]: Started cri-containerd-fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d.scope - libcontainer container fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d. 
May 15 12:35:36.774629 kubelet[3207]: E0515 12:35:36.774606 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.775436 kubelet[3207]: W0515 12:35:36.775254 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.775436 kubelet[3207]: E0515 12:35:36.775278 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.776433 kubelet[3207]: E0515 12:35:36.776266 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.776433 kubelet[3207]: W0515 12:35:36.776275 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.776433 kubelet[3207]: E0515 12:35:36.776291 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.776663 kubelet[3207]: E0515 12:35:36.776551 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.776663 kubelet[3207]: W0515 12:35:36.776560 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.776792 kubelet[3207]: E0515 12:35:36.776733 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.777378 kubelet[3207]: E0515 12:35:36.777217 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.777378 kubelet[3207]: W0515 12:35:36.777227 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.779226 kubelet[3207]: E0515 12:35:36.778964 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.779226 kubelet[3207]: W0515 12:35:36.778995 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.779226 kubelet[3207]: E0515 12:35:36.779093 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.779226 kubelet[3207]: W0515 12:35:36.779099 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.779226 kubelet[3207]: E0515 12:35:36.779172 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.779226 kubelet[3207]: W0515 12:35:36.779177 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.779226 kubelet[3207]: E0515 12:35:36.779184 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.779640 kubelet[3207]: E0515 12:35:36.779510 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.779640 kubelet[3207]: W0515 12:35:36.779518 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.779640 kubelet[3207]: E0515 12:35:36.779525 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.780070 kubelet[3207]: E0515 12:35:36.780057 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.780416 kubelet[3207]: E0515 12:35:36.780232 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.780566 kubelet[3207]: E0515 12:35:36.780241 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.780699 kubelet[3207]: E0515 12:35:36.780292 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.780699 kubelet[3207]: W0515 12:35:36.780623 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.780699 kubelet[3207]: E0515 12:35:36.780632 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.781010 kubelet[3207]: E0515 12:35:36.781001 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.781140 kubelet[3207]: W0515 12:35:36.781076 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.781254 kubelet[3207]: E0515 12:35:36.781196 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.781517 kubelet[3207]: E0515 12:35:36.781471 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.781573 kubelet[3207]: W0515 12:35:36.781564 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.781637 kubelet[3207]: E0515 12:35:36.781628 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.782377 kubelet[3207]: E0515 12:35:36.781927 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.782480 kubelet[3207]: W0515 12:35:36.782434 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.782480 kubelet[3207]: E0515 12:35:36.782451 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.782675 kubelet[3207]: E0515 12:35:36.782656 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.782675 kubelet[3207]: W0515 12:35:36.782665 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.782857 kubelet[3207]: E0515 12:35:36.782824 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.782927 kubelet[3207]: E0515 12:35:36.782849 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.782979 kubelet[3207]: W0515 12:35:36.782971 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.783116 kubelet[3207]: E0515 12:35:36.783100 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.783186 kubelet[3207]: E0515 12:35:36.783179 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.783263 kubelet[3207]: W0515 12:35:36.783224 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.784120 kubelet[3207]: E0515 12:35:36.783413 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.784221 kubelet[3207]: E0515 12:35:36.784210 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.784359 kubelet[3207]: W0515 12:35:36.784291 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.784429 kubelet[3207]: E0515 12:35:36.784418 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.784582 kubelet[3207]: E0515 12:35:36.784564 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.784582 kubelet[3207]: W0515 12:35:36.784572 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.784879 kubelet[3207]: E0515 12:35:36.784701 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.784944 kubelet[3207]: E0515 12:35:36.784935 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.784990 kubelet[3207]: W0515 12:35:36.784982 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.785112 kubelet[3207]: E0515 12:35:36.785104 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.785160 kubelet[3207]: W0515 12:35:36.785153 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.785352 kubelet[3207]: E0515 12:35:36.785277 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.785352 kubelet[3207]: E0515 12:35:36.785299 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.785438 kubelet[3207]: E0515 12:35:36.785430 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.785530 kubelet[3207]: W0515 12:35:36.785475 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.785530 kubelet[3207]: E0515 12:35:36.785487 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.785681 kubelet[3207]: E0515 12:35:36.785638 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.785681 kubelet[3207]: W0515 12:35:36.785646 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.785681 kubelet[3207]: E0515 12:35:36.785656 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.785861 kubelet[3207]: E0515 12:35:36.785843 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.785861 kubelet[3207]: W0515 12:35:36.785851 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.786230 kubelet[3207]: E0515 12:35:36.785931 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.786871 kubelet[3207]: E0515 12:35:36.786854 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.786871 kubelet[3207]: W0515 12:35:36.786869 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.787244 kubelet[3207]: E0515 12:35:36.787187 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.787994 kubelet[3207]: E0515 12:35:36.787968 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.787994 kubelet[3207]: W0515 12:35:36.787979 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.788144 kubelet[3207]: E0515 12:35:36.788077 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.788691 kubelet[3207]: E0515 12:35:36.788671 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.788691 kubelet[3207]: W0515 12:35:36.788684 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.788691 kubelet[3207]: E0515 12:35:36.788693 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:36.797158 containerd[1571]: time="2025-05-15T12:35:36.796965768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-75c8575ff-tc4r7,Uid:6c8535cc-4665-4d1e-a5d8-657a84679149,Namespace:calico-system,Attempt:0,} returns sandbox id \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\"" May 15 12:35:36.800387 containerd[1571]: time="2025-05-15T12:35:36.800026366Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 15 12:35:36.800537 kubelet[3207]: E0515 12:35:36.800509 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:36.800665 kubelet[3207]: W0515 12:35:36.800626 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:36.800665 kubelet[3207]: E0515 12:35:36.800643 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:36.812618 containerd[1571]: time="2025-05-15T12:35:36.812439913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-prt2t,Uid:fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1,Namespace:calico-system,Attempt:0,}" May 15 12:35:36.833984 containerd[1571]: time="2025-05-15T12:35:36.833923466Z" level=info msg="connecting to shim 707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7" address="unix:///run/containerd/s/a32b1c3db32c6a823bd692a94942c573e83841159de206dcaa2a1f6aaa796632" namespace=k8s.io protocol=ttrpc version=3 May 15 12:35:36.853571 systemd[1]: Started cri-containerd-707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7.scope - libcontainer container 707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7. 
May 15 12:35:36.883623 containerd[1571]: time="2025-05-15T12:35:36.883581408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-prt2t,Uid:fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1,Namespace:calico-system,Attempt:0,} returns sandbox id \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\"" May 15 12:35:38.123240 kubelet[3207]: E0515 12:35:38.123150 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:40.122904 kubelet[3207]: E0515 12:35:40.122840 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:42.122864 kubelet[3207]: E0515 12:35:42.122808 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:44.122751 kubelet[3207]: E0515 12:35:44.122673 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:46.123189 kubelet[3207]: E0515 12:35:46.123127 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: 
container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:48.123393 kubelet[3207]: E0515 12:35:48.123278 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:49.128522 containerd[1571]: time="2025-05-15T12:35:49.128478213Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:49.129583 containerd[1571]: time="2025-05-15T12:35:49.129546379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 15 12:35:49.130705 containerd[1571]: time="2025-05-15T12:35:49.130667234Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:49.132387 containerd[1571]: time="2025-05-15T12:35:49.132370843Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:49.132795 containerd[1571]: time="2025-05-15T12:35:49.132696785Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 
12.331823237s" May 15 12:35:49.132795 containerd[1571]: time="2025-05-15T12:35:49.132720250Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 15 12:35:49.134936 containerd[1571]: time="2025-05-15T12:35:49.134909040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 15 12:35:49.147136 containerd[1571]: time="2025-05-15T12:35:49.147095878Z" level=info msg="CreateContainer within sandbox \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 15 12:35:49.158762 containerd[1571]: time="2025-05-15T12:35:49.157969137Z" level=info msg="Container 23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369: CDI devices from CRI Config.CDIDevices: []" May 15 12:35:49.167613 containerd[1571]: time="2025-05-15T12:35:49.167582492Z" level=info msg="CreateContainer within sandbox \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\"" May 15 12:35:49.168291 containerd[1571]: time="2025-05-15T12:35:49.168140139Z" level=info msg="StartContainer for \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\"" May 15 12:35:49.169070 containerd[1571]: time="2025-05-15T12:35:49.169038807Z" level=info msg="connecting to shim 23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369" address="unix:///run/containerd/s/86e77c290c457dd2cd2b0e52e6c479b36a5e3c3dbf0d56818eb0a4875502ff7a" protocol=ttrpc version=3 May 15 12:35:49.186447 systemd[1]: Started cri-containerd-23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369.scope - libcontainer container 23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369. 
May 15 12:35:49.233134 containerd[1571]: time="2025-05-15T12:35:49.233036285Z" level=info msg="StartContainer for \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" returns successfully" May 15 12:35:49.252190 kubelet[3207]: E0515 12:35:49.252162 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.252190 kubelet[3207]: W0515 12:35:49.252181 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.252190 kubelet[3207]: E0515 12:35:49.252196 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.252681 kubelet[3207]: E0515 12:35:49.252528 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.252681 kubelet[3207]: W0515 12:35:49.252538 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.252681 kubelet[3207]: E0515 12:35:49.252548 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.253486 kubelet[3207]: E0515 12:35:49.253464 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.253486 kubelet[3207]: W0515 12:35:49.253481 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.253486 kubelet[3207]: E0515 12:35:49.253490 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.253687 kubelet[3207]: E0515 12:35:49.253658 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.253687 kubelet[3207]: W0515 12:35:49.253667 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.253687 kubelet[3207]: E0515 12:35:49.253675 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.253928 kubelet[3207]: E0515 12:35:49.253913 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.253928 kubelet[3207]: W0515 12:35:49.253926 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.254018 kubelet[3207]: E0515 12:35:49.253934 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.254342 kubelet[3207]: E0515 12:35:49.254260 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.254342 kubelet[3207]: W0515 12:35:49.254269 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.254342 kubelet[3207]: E0515 12:35:49.254277 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.255196 kubelet[3207]: E0515 12:35:49.254967 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.255196 kubelet[3207]: W0515 12:35:49.254979 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.255196 kubelet[3207]: E0515 12:35:49.254987 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.255707 kubelet[3207]: E0515 12:35:49.255676 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.255707 kubelet[3207]: W0515 12:35:49.255686 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.255707 kubelet[3207]: E0515 12:35:49.255694 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.256256 kubelet[3207]: E0515 12:35:49.256240 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.256256 kubelet[3207]: W0515 12:35:49.256252 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.256367 kubelet[3207]: E0515 12:35:49.256260 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.256638 kubelet[3207]: I0515 12:35:49.256592 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-75c8575ff-tc4r7" podStartSLOduration=0.922459508 podStartE2EDuration="13.25657985s" podCreationTimestamp="2025-05-15 12:35:36 +0000 UTC" firstStartedPulling="2025-05-15 12:35:36.799255609 +0000 UTC m=+23.759054433" lastFinishedPulling="2025-05-15 12:35:49.133375952 +0000 UTC m=+36.093174775" observedRunningTime="2025-05-15 12:35:49.2533742 +0000 UTC m=+36.213173044" watchObservedRunningTime="2025-05-15 12:35:49.25657985 +0000 UTC m=+36.216378674" May 15 12:35:49.257187 kubelet[3207]: E0515 12:35:49.257170 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.257187 kubelet[3207]: W0515 12:35:49.257182 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.257251 kubelet[3207]: E0515 12:35:49.257190 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.257449 kubelet[3207]: E0515 12:35:49.257408 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.257449 kubelet[3207]: W0515 12:35:49.257416 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.257449 kubelet[3207]: E0515 12:35:49.257425 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.258129 kubelet[3207]: E0515 12:35:49.257702 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.258129 kubelet[3207]: W0515 12:35:49.257715 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.258129 kubelet[3207]: E0515 12:35:49.257722 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.258129 kubelet[3207]: E0515 12:35:49.257854 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.258129 kubelet[3207]: W0515 12:35:49.257860 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.258129 kubelet[3207]: E0515 12:35:49.257867 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.258129 kubelet[3207]: E0515 12:35:49.257975 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.258129 kubelet[3207]: W0515 12:35:49.258000 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.258129 kubelet[3207]: E0515 12:35:49.258007 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.258129 kubelet[3207]: E0515 12:35:49.258122 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.259134 kubelet[3207]: W0515 12:35:49.258128 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.259134 kubelet[3207]: E0515 12:35:49.258136 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.266799 kubelet[3207]: E0515 12:35:49.266782 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.266799 kubelet[3207]: W0515 12:35:49.266795 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.266910 kubelet[3207]: E0515 12:35:49.266805 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.267072 kubelet[3207]: E0515 12:35:49.267047 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.267072 kubelet[3207]: W0515 12:35:49.267061 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.267072 kubelet[3207]: E0515 12:35:49.267072 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.267242 kubelet[3207]: E0515 12:35:49.267223 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.267242 kubelet[3207]: W0515 12:35:49.267235 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.267293 kubelet[3207]: E0515 12:35:49.267250 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.267477 kubelet[3207]: E0515 12:35:49.267458 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.267477 kubelet[3207]: W0515 12:35:49.267469 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.267534 kubelet[3207]: E0515 12:35:49.267486 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.268323 kubelet[3207]: E0515 12:35:49.267853 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.268323 kubelet[3207]: W0515 12:35:49.267862 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.268323 kubelet[3207]: E0515 12:35:49.267883 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.268323 kubelet[3207]: E0515 12:35:49.268023 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.268323 kubelet[3207]: W0515 12:35:49.268030 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.268323 kubelet[3207]: E0515 12:35:49.268048 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.268323 kubelet[3207]: E0515 12:35:49.268214 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.268323 kubelet[3207]: W0515 12:35:49.268221 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.268323 kubelet[3207]: E0515 12:35:49.268323 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.268583 kubelet[3207]: E0515 12:35:49.268568 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.268583 kubelet[3207]: W0515 12:35:49.268580 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.268679 kubelet[3207]: E0515 12:35:49.268664 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.268679 kubelet[3207]: W0515 12:35:49.268676 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.268778 kubelet[3207]: E0515 12:35:49.268744 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.268778 kubelet[3207]: E0515 12:35:49.268767 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.268937 kubelet[3207]: E0515 12:35:49.268915 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.268937 kubelet[3207]: W0515 12:35:49.268927 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.269243 kubelet[3207]: E0515 12:35:49.269221 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.269374 kubelet[3207]: E0515 12:35:49.269357 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.269374 kubelet[3207]: W0515 12:35:49.269368 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.269423 kubelet[3207]: E0515 12:35:49.269391 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.270090 kubelet[3207]: E0515 12:35:49.270069 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.270090 kubelet[3207]: W0515 12:35:49.270082 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.270147 kubelet[3207]: E0515 12:35:49.270093 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.270248 kubelet[3207]: E0515 12:35:49.270232 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.270248 kubelet[3207]: W0515 12:35:49.270242 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.270505 kubelet[3207]: E0515 12:35:49.270478 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.270537 kubelet[3207]: E0515 12:35:49.270521 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.270537 kubelet[3207]: W0515 12:35:49.270526 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.270537 kubelet[3207]: E0515 12:35:49.270533 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.270863 kubelet[3207]: E0515 12:35:49.270810 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.270863 kubelet[3207]: W0515 12:35:49.270849 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.270944 kubelet[3207]: E0515 12:35:49.270923 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.271076 kubelet[3207]: E0515 12:35:49.271056 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.271488 kubelet[3207]: W0515 12:35:49.271086 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.271488 kubelet[3207]: E0515 12:35:49.271094 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:49.271488 kubelet[3207]: E0515 12:35:49.271237 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.271488 kubelet[3207]: W0515 12:35:49.271246 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.271488 kubelet[3207]: E0515 12:35:49.271259 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:49.271488 kubelet[3207]: E0515 12:35:49.271413 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:49.271488 kubelet[3207]: W0515 12:35:49.271420 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:49.271488 kubelet[3207]: E0515 12:35:49.271428 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.122837 kubelet[3207]: E0515 12:35:50.122682 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:50.242964 kubelet[3207]: I0515 12:35:50.242896 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:35:50.265528 kubelet[3207]: E0515 12:35:50.265484 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.265528 kubelet[3207]: W0515 12:35:50.265515 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.265868 kubelet[3207]: E0515 12:35:50.265561 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.265868 kubelet[3207]: E0515 12:35:50.265799 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.265868 kubelet[3207]: W0515 12:35:50.265813 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.265868 kubelet[3207]: E0515 12:35:50.265827 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.266021 kubelet[3207]: E0515 12:35:50.265987 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.266021 kubelet[3207]: W0515 12:35:50.266010 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.266097 kubelet[3207]: E0515 12:35:50.266022 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.266291 kubelet[3207]: E0515 12:35:50.266248 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.266291 kubelet[3207]: W0515 12:35:50.266270 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.266291 kubelet[3207]: E0515 12:35:50.266293 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.266567 kubelet[3207]: E0515 12:35:50.266494 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.266567 kubelet[3207]: W0515 12:35:50.266503 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.266567 kubelet[3207]: E0515 12:35:50.266511 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.266677 kubelet[3207]: E0515 12:35:50.266603 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.266677 kubelet[3207]: W0515 12:35:50.266611 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.266677 kubelet[3207]: E0515 12:35:50.266618 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.266951 kubelet[3207]: E0515 12:35:50.266707 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.266951 kubelet[3207]: W0515 12:35:50.266714 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.266951 kubelet[3207]: E0515 12:35:50.266723 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.266951 kubelet[3207]: E0515 12:35:50.266813 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.266951 kubelet[3207]: W0515 12:35:50.266819 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.266951 kubelet[3207]: E0515 12:35:50.266826 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.266951 kubelet[3207]: E0515 12:35:50.266941 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.266951 kubelet[3207]: W0515 12:35:50.266948 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.266951 kubelet[3207]: E0515 12:35:50.266955 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.267488 kubelet[3207]: E0515 12:35:50.267056 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.267488 kubelet[3207]: W0515 12:35:50.267063 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.267488 kubelet[3207]: E0515 12:35:50.267070 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.267488 kubelet[3207]: E0515 12:35:50.267202 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.267488 kubelet[3207]: W0515 12:35:50.267209 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.267488 kubelet[3207]: E0515 12:35:50.267216 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.267488 kubelet[3207]: E0515 12:35:50.267348 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.267488 kubelet[3207]: W0515 12:35:50.267356 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.267488 kubelet[3207]: E0515 12:35:50.267363 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.267488 kubelet[3207]: E0515 12:35:50.267481 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.267947 kubelet[3207]: W0515 12:35:50.267488 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.267947 kubelet[3207]: E0515 12:35:50.267495 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.267947 kubelet[3207]: E0515 12:35:50.267601 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.267947 kubelet[3207]: W0515 12:35:50.267608 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.267947 kubelet[3207]: E0515 12:35:50.267614 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.267947 kubelet[3207]: E0515 12:35:50.267717 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.267947 kubelet[3207]: W0515 12:35:50.267723 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.267947 kubelet[3207]: E0515 12:35:50.267730 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.274194 kubelet[3207]: E0515 12:35:50.274162 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.274194 kubelet[3207]: W0515 12:35:50.274189 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.274279 kubelet[3207]: E0515 12:35:50.274204 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.274520 kubelet[3207]: E0515 12:35:50.274489 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.274520 kubelet[3207]: W0515 12:35:50.274510 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.274592 kubelet[3207]: E0515 12:35:50.274532 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.274751 kubelet[3207]: E0515 12:35:50.274724 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.274751 kubelet[3207]: W0515 12:35:50.274742 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.274815 kubelet[3207]: E0515 12:35:50.274763 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.275026 kubelet[3207]: E0515 12:35:50.275001 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.275026 kubelet[3207]: W0515 12:35:50.275020 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.275091 kubelet[3207]: E0515 12:35:50.275041 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.275274 kubelet[3207]: E0515 12:35:50.275249 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.275274 kubelet[3207]: W0515 12:35:50.275268 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.275368 kubelet[3207]: E0515 12:35:50.275296 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.275618 kubelet[3207]: E0515 12:35:50.275592 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.275618 kubelet[3207]: W0515 12:35:50.275612 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.275687 kubelet[3207]: E0515 12:35:50.275632 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.275864 kubelet[3207]: E0515 12:35:50.275839 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.275864 kubelet[3207]: W0515 12:35:50.275857 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.276004 kubelet[3207]: E0515 12:35:50.275974 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.276152 kubelet[3207]: E0515 12:35:50.276120 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.276152 kubelet[3207]: W0515 12:35:50.276138 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.276318 kubelet[3207]: E0515 12:35:50.276279 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.276444 kubelet[3207]: E0515 12:35:50.276416 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.276444 kubelet[3207]: W0515 12:35:50.276436 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.276509 kubelet[3207]: E0515 12:35:50.276456 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.276776 kubelet[3207]: E0515 12:35:50.276745 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.276776 kubelet[3207]: W0515 12:35:50.276769 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.276838 kubelet[3207]: E0515 12:35:50.276805 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.277064 kubelet[3207]: E0515 12:35:50.277036 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.277064 kubelet[3207]: W0515 12:35:50.277057 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.277120 kubelet[3207]: E0515 12:35:50.277070 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.277493 kubelet[3207]: E0515 12:35:50.277464 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.277538 kubelet[3207]: W0515 12:35:50.277516 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.277567 kubelet[3207]: E0515 12:35:50.277538 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.277909 kubelet[3207]: E0515 12:35:50.277882 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.277909 kubelet[3207]: W0515 12:35:50.277902 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.277967 kubelet[3207]: E0515 12:35:50.277923 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.278170 kubelet[3207]: E0515 12:35:50.278146 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.278170 kubelet[3207]: W0515 12:35:50.278164 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.278230 kubelet[3207]: E0515 12:35:50.278198 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.278509 kubelet[3207]: E0515 12:35:50.278483 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.278509 kubelet[3207]: W0515 12:35:50.278503 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.278577 kubelet[3207]: E0515 12:35:50.278516 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.278730 kubelet[3207]: E0515 12:35:50.278701 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.278730 kubelet[3207]: W0515 12:35:50.278718 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.278730 kubelet[3207]: E0515 12:35:50.278730 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:50.278965 kubelet[3207]: E0515 12:35:50.278940 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.279011 kubelet[3207]: W0515 12:35:50.278966 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.279011 kubelet[3207]: E0515 12:35:50.278979 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 12:35:50.279455 kubelet[3207]: E0515 12:35:50.279429 3207 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 12:35:50.279455 kubelet[3207]: W0515 12:35:50.279448 3207 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 12:35:50.279518 kubelet[3207]: E0515 12:35:50.279461 3207 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 12:35:52.122543 kubelet[3207]: E0515 12:35:52.122471 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:54.123593 kubelet[3207]: E0515 12:35:54.123543 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:54.503371 containerd[1571]: time="2025-05-15T12:35:54.503267790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:54.504313 containerd[1571]: time="2025-05-15T12:35:54.504197004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 15 12:35:54.505196 containerd[1571]: time="2025-05-15T12:35:54.505170813Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:54.506851 containerd[1571]: time="2025-05-15T12:35:54.506815902Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:35:54.507214 containerd[1571]: time="2025-05-15T12:35:54.507096379Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id 
\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 5.37210305s" May 15 12:35:54.507214 containerd[1571]: time="2025-05-15T12:35:54.507123189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 15 12:35:54.510313 containerd[1571]: time="2025-05-15T12:35:54.510262165Z" level=info msg="CreateContainer within sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 15 12:35:54.517964 containerd[1571]: time="2025-05-15T12:35:54.516174076Z" level=info msg="Container 618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483: CDI devices from CRI Config.CDIDevices: []" May 15 12:35:54.539227 containerd[1571]: time="2025-05-15T12:35:54.539190620Z" level=info msg="CreateContainer within sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\"" May 15 12:35:54.540061 containerd[1571]: time="2025-05-15T12:35:54.539863973Z" level=info msg="StartContainer for \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\"" May 15 12:35:54.574684 containerd[1571]: time="2025-05-15T12:35:54.574646443Z" level=info msg="connecting to shim 618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483" address="unix:///run/containerd/s/a32b1c3db32c6a823bd692a94942c573e83841159de206dcaa2a1f6aaa796632" protocol=ttrpc version=3 May 15 12:35:54.597493 systemd[1]: Started cri-containerd-618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483.scope 
- libcontainer container 618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483. May 15 12:35:54.634753 containerd[1571]: time="2025-05-15T12:35:54.634705890Z" level=info msg="StartContainer for \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" returns successfully" May 15 12:35:54.647754 systemd[1]: cri-containerd-618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483.scope: Deactivated successfully. May 15 12:35:54.666152 containerd[1571]: time="2025-05-15T12:35:54.665196409Z" level=info msg="TaskExit event in podsandbox handler container_id:\"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" id:\"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" pid:3837 exited_at:{seconds:1747312554 nanos:648954077}" May 15 12:35:54.691534 containerd[1571]: time="2025-05-15T12:35:54.691485397Z" level=info msg="received exit event container_id:\"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" id:\"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" pid:3837 exited_at:{seconds:1747312554 nanos:648954077}" May 15 12:35:54.712815 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483-rootfs.mount: Deactivated successfully. 
May 15 12:35:54.989166 kubelet[3207]: I0515 12:35:54.988920 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:35:55.258113 containerd[1571]: time="2025-05-15T12:35:55.257935734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 15 12:35:56.122286 kubelet[3207]: E0515 12:35:56.122245 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:35:58.123132 kubelet[3207]: E0515 12:35:58.123068 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:36:00.122895 kubelet[3207]: E0515 12:36:00.122821 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:36:02.123237 kubelet[3207]: E0515 12:36:02.123194 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2gvxv" podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:36:02.678997 containerd[1571]: time="2025-05-15T12:36:02.678939408Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:02.679990 containerd[1571]: time="2025-05-15T12:36:02.679821585Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 15 12:36:02.680937 containerd[1571]: time="2025-05-15T12:36:02.680909558Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:02.682667 containerd[1571]: time="2025-05-15T12:36:02.682638315Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:02.683371 containerd[1571]: time="2025-05-15T12:36:02.683323761Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 7.425342021s" May 15 12:36:02.683736 containerd[1571]: time="2025-05-15T12:36:02.683439870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 15 12:36:02.686088 containerd[1571]: time="2025-05-15T12:36:02.686053757Z" level=info msg="CreateContainer within sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 15 12:36:02.697472 containerd[1571]: time="2025-05-15T12:36:02.697050247Z" level=info msg="Container 8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:02.711215 containerd[1571]: 
time="2025-05-15T12:36:02.711173698Z" level=info msg="CreateContainer within sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\"" May 15 12:36:02.712091 containerd[1571]: time="2025-05-15T12:36:02.711828457Z" level=info msg="StartContainer for \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\"" May 15 12:36:02.714623 containerd[1571]: time="2025-05-15T12:36:02.714585715Z" level=info msg="connecting to shim 8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011" address="unix:///run/containerd/s/a32b1c3db32c6a823bd692a94942c573e83841159de206dcaa2a1f6aaa796632" protocol=ttrpc version=3 May 15 12:36:02.735499 systemd[1]: Started cri-containerd-8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011.scope - libcontainer container 8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011. May 15 12:36:02.772294 containerd[1571]: time="2025-05-15T12:36:02.772215030Z" level=info msg="StartContainer for \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" returns successfully" May 15 12:36:03.109078 systemd[1]: cri-containerd-8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011.scope: Deactivated successfully. May 15 12:36:03.109343 systemd[1]: cri-containerd-8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011.scope: Consumed 353ms CPU time, 145M memory peak, 12K read from disk, 154M written to disk. 
May 15 12:36:03.128156 containerd[1571]: time="2025-05-15T12:36:03.128015687Z" level=info msg="received exit event container_id:\"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" id:\"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" pid:3896 exited_at:{seconds:1747312563 nanos:110416611}" May 15 12:36:03.143544 containerd[1571]: time="2025-05-15T12:36:03.143474937Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" id:\"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" pid:3896 exited_at:{seconds:1747312563 nanos:110416611}" May 15 12:36:03.166021 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011-rootfs.mount: Deactivated successfully. May 15 12:36:03.171136 kubelet[3207]: I0515 12:36:03.170678 3207 kubelet_node_status.go:497] "Fast updating node status as it just became ready" May 15 12:36:03.213263 kubelet[3207]: I0515 12:36:03.213222 3207 topology_manager.go:215] "Topology Admit Handler" podUID="06a220b5-1770-44be-8b97-a7f6c8e515a9" podNamespace="kube-system" podName="coredns-7db6d8ff4d-s6gwc" May 15 12:36:03.226920 kubelet[3207]: I0515 12:36:03.226886 3207 topology_manager.go:215] "Topology Admit Handler" podUID="6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287" podNamespace="kube-system" podName="coredns-7db6d8ff4d-krh5f" May 15 12:36:03.227183 kubelet[3207]: I0515 12:36:03.227025 3207 topology_manager.go:215] "Topology Admit Handler" podUID="0cea196a-c35a-4ed1-aba8-1434c056f4d9" podNamespace="calico-system" podName="calico-kube-controllers-8b9db6c54-8mmnz" May 15 12:36:03.231025 systemd[1]: Created slice kubepods-burstable-pod06a220b5_1770_44be_8b97_a7f6c8e515a9.slice - libcontainer container kubepods-burstable-pod06a220b5_1770_44be_8b97_a7f6c8e515a9.slice. 
May 15 12:36:03.236835 kubelet[3207]: I0515 12:36:03.236790 3207 topology_manager.go:215] "Topology Admit Handler" podUID="f536eb8d-06b8-43f3-9327-01d9d6a4d759" podNamespace="calico-apiserver" podName="calico-apiserver-9bffb6db8-8dkvq" May 15 12:36:03.238047 kubelet[3207]: I0515 12:36:03.238027 3207 topology_manager.go:215] "Topology Admit Handler" podUID="f7cfcf4a-92ec-4f4c-9f12-dafd70717cab" podNamespace="calico-apiserver" podName="calico-apiserver-9bffb6db8-psvtq" May 15 12:36:03.239443 kubelet[3207]: I0515 12:36:03.239424 3207 topology_manager.go:215] "Topology Admit Handler" podUID="495efd61-94a1-4706-8775-ab4f278dd3b6" podNamespace="calico-apiserver" podName="calico-apiserver-6cf5655c7b-ch7hd" May 15 12:36:03.242563 systemd[1]: Created slice kubepods-besteffort-pod0cea196a_c35a_4ed1_aba8_1434c056f4d9.slice - libcontainer container kubepods-besteffort-pod0cea196a_c35a_4ed1_aba8_1434c056f4d9.slice. May 15 12:36:03.253322 systemd[1]: Created slice kubepods-burstable-pod6f5a1d6d_abb2_4c77_8b9b_cd49f30f6287.slice - libcontainer container kubepods-burstable-pod6f5a1d6d_abb2_4c77_8b9b_cd49f30f6287.slice. May 15 12:36:03.259022 systemd[1]: Created slice kubepods-besteffort-podf7cfcf4a_92ec_4f4c_9f12_dafd70717cab.slice - libcontainer container kubepods-besteffort-podf7cfcf4a_92ec_4f4c_9f12_dafd70717cab.slice. May 15 12:36:03.265078 systemd[1]: Created slice kubepods-besteffort-pod495efd61_94a1_4706_8775_ab4f278dd3b6.slice - libcontainer container kubepods-besteffort-pod495efd61_94a1_4706_8775_ab4f278dd3b6.slice. May 15 12:36:03.269314 systemd[1]: Created slice kubepods-besteffort-podf536eb8d_06b8_43f3_9327_01d9d6a4d759.slice - libcontainer container kubepods-besteffort-podf536eb8d_06b8_43f3_9327_01d9d6a4d759.slice. 
May 15 12:36:03.282101 containerd[1571]: time="2025-05-15T12:36:03.282067882Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 15 12:36:03.365209 kubelet[3207]: I0515 12:36:03.365088 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-calico-apiserver-certs\") pod \"calico-apiserver-9bffb6db8-psvtq\" (UID: \"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab\") " pod="calico-apiserver/calico-apiserver-9bffb6db8-psvtq" May 15 12:36:03.366248 kubelet[3207]: I0515 12:36:03.365265 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r98kp\" (UniqueName: \"kubernetes.io/projected/06a220b5-1770-44be-8b97-a7f6c8e515a9-kube-api-access-r98kp\") pod \"coredns-7db6d8ff4d-s6gwc\" (UID: \"06a220b5-1770-44be-8b97-a7f6c8e515a9\") " pod="kube-system/coredns-7db6d8ff4d-s6gwc" May 15 12:36:03.366248 kubelet[3207]: I0515 12:36:03.365315 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mtg\" (UniqueName: \"kubernetes.io/projected/6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287-kube-api-access-l4mtg\") pod \"coredns-7db6d8ff4d-krh5f\" (UID: \"6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287\") " pod="kube-system/coredns-7db6d8ff4d-krh5f" May 15 12:36:03.366248 kubelet[3207]: I0515 12:36:03.365354 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppqg2\" (UniqueName: \"kubernetes.io/projected/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-kube-api-access-ppqg2\") pod \"calico-apiserver-9bffb6db8-psvtq\" (UID: \"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab\") " pod="calico-apiserver/calico-apiserver-9bffb6db8-psvtq" May 15 12:36:03.366248 kubelet[3207]: I0515 12:36:03.365393 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/495efd61-94a1-4706-8775-ab4f278dd3b6-calico-apiserver-certs\") pod \"calico-apiserver-6cf5655c7b-ch7hd\" (UID: \"495efd61-94a1-4706-8775-ab4f278dd3b6\") " pod="calico-apiserver/calico-apiserver-6cf5655c7b-ch7hd" May 15 12:36:03.366248 kubelet[3207]: I0515 12:36:03.365409 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf7nc\" (UniqueName: \"kubernetes.io/projected/0cea196a-c35a-4ed1-aba8-1434c056f4d9-kube-api-access-wf7nc\") pod \"calico-kube-controllers-8b9db6c54-8mmnz\" (UID: \"0cea196a-c35a-4ed1-aba8-1434c056f4d9\") " pod="calico-system/calico-kube-controllers-8b9db6c54-8mmnz" May 15 12:36:03.366409 kubelet[3207]: I0515 12:36:03.365424 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fcdp\" (UniqueName: \"kubernetes.io/projected/495efd61-94a1-4706-8775-ab4f278dd3b6-kube-api-access-8fcdp\") pod \"calico-apiserver-6cf5655c7b-ch7hd\" (UID: \"495efd61-94a1-4706-8775-ab4f278dd3b6\") " pod="calico-apiserver/calico-apiserver-6cf5655c7b-ch7hd" May 15 12:36:03.366409 kubelet[3207]: I0515 12:36:03.365439 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06a220b5-1770-44be-8b97-a7f6c8e515a9-config-volume\") pod \"coredns-7db6d8ff4d-s6gwc\" (UID: \"06a220b5-1770-44be-8b97-a7f6c8e515a9\") " pod="kube-system/coredns-7db6d8ff4d-s6gwc" May 15 12:36:03.366409 kubelet[3207]: I0515 12:36:03.365454 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cea196a-c35a-4ed1-aba8-1434c056f4d9-tigera-ca-bundle\") pod \"calico-kube-controllers-8b9db6c54-8mmnz\" (UID: \"0cea196a-c35a-4ed1-aba8-1434c056f4d9\") " pod="calico-system/calico-kube-controllers-8b9db6c54-8mmnz" May 15 
12:36:03.366409 kubelet[3207]: I0515 12:36:03.365470 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f536eb8d-06b8-43f3-9327-01d9d6a4d759-calico-apiserver-certs\") pod \"calico-apiserver-9bffb6db8-8dkvq\" (UID: \"f536eb8d-06b8-43f3-9327-01d9d6a4d759\") " pod="calico-apiserver/calico-apiserver-9bffb6db8-8dkvq" May 15 12:36:03.366409 kubelet[3207]: I0515 12:36:03.365482 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287-config-volume\") pod \"coredns-7db6d8ff4d-krh5f\" (UID: \"6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287\") " pod="kube-system/coredns-7db6d8ff4d-krh5f" May 15 12:36:03.366504 kubelet[3207]: I0515 12:36:03.365498 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5htrp\" (UniqueName: \"kubernetes.io/projected/f536eb8d-06b8-43f3-9327-01d9d6a4d759-kube-api-access-5htrp\") pod \"calico-apiserver-9bffb6db8-8dkvq\" (UID: \"f536eb8d-06b8-43f3-9327-01d9d6a4d759\") " pod="calico-apiserver/calico-apiserver-9bffb6db8-8dkvq" May 15 12:36:03.547315 containerd[1571]: time="2025-05-15T12:36:03.547240878Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s6gwc,Uid:06a220b5-1770-44be-8b97-a7f6c8e515a9,Namespace:kube-system,Attempt:0,}" May 15 12:36:03.549263 containerd[1571]: time="2025-05-15T12:36:03.549204655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8b9db6c54-8mmnz,Uid:0cea196a-c35a-4ed1-aba8-1434c056f4d9,Namespace:calico-system,Attempt:0,}" May 15 12:36:03.563749 containerd[1571]: time="2025-05-15T12:36:03.563351449Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-psvtq,Uid:f7cfcf4a-92ec-4f4c-9f12-dafd70717cab,Namespace:calico-apiserver,Attempt:0,}" May 15 12:36:03.564610 containerd[1571]: time="2025-05-15T12:36:03.564179264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krh5f,Uid:6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287,Namespace:kube-system,Attempt:0,}" May 15 12:36:03.579345 containerd[1571]: time="2025-05-15T12:36:03.579173800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-8dkvq,Uid:f536eb8d-06b8-43f3-9327-01d9d6a4d759,Namespace:calico-apiserver,Attempt:0,}" May 15 12:36:03.581548 containerd[1571]: time="2025-05-15T12:36:03.581360266Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cf5655c7b-ch7hd,Uid:495efd61-94a1-4706-8775-ab4f278dd3b6,Namespace:calico-apiserver,Attempt:0,}" May 15 12:36:03.792515 containerd[1571]: time="2025-05-15T12:36:03.792465494Z" level=error msg="Failed to destroy network for sandbox \"2eab7cc818330fb02d0ff823e8e1b69d041f5ae5faeef1d0b1dd1ff2c400c6ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.795989 systemd[1]: run-netns-cni\x2dc98c17f5\x2d9081\x2d505a\x2d199f\x2d178e829888ce.mount: Deactivated successfully. May 15 12:36:03.798510 containerd[1571]: time="2025-05-15T12:36:03.796861639Z" level=error msg="Failed to destroy network for sandbox \"5b606b94c710a39f7e2ca3db4cd41fec7948dbf0f3d340a813ef7679b00d65da\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.800011 systemd[1]: run-netns-cni\x2d7424ec1d\x2d86f8\x2d6ff0\x2d7339\x2df903a70c8b7e.mount: Deactivated successfully. 
May 15 12:36:03.800557 containerd[1571]: time="2025-05-15T12:36:03.800104168Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krh5f,Uid:6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eab7cc818330fb02d0ff823e8e1b69d041f5ae5faeef1d0b1dd1ff2c400c6ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.802173 kubelet[3207]: E0515 12:36:03.801486 3207 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eab7cc818330fb02d0ff823e8e1b69d041f5ae5faeef1d0b1dd1ff2c400c6ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.802173 kubelet[3207]: E0515 12:36:03.801838 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eab7cc818330fb02d0ff823e8e1b69d041f5ae5faeef1d0b1dd1ff2c400c6ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-krh5f" May 15 12:36:03.802173 kubelet[3207]: E0515 12:36:03.801857 3207 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2eab7cc818330fb02d0ff823e8e1b69d041f5ae5faeef1d0b1dd1ff2c400c6ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-krh5f" May 15 12:36:03.802272 kubelet[3207]: E0515 12:36:03.801899 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-krh5f_kube-system(6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-krh5f_kube-system(6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2eab7cc818330fb02d0ff823e8e1b69d041f5ae5faeef1d0b1dd1ff2c400c6ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-krh5f" podUID="6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287" May 15 12:36:03.804221 containerd[1571]: time="2025-05-15T12:36:03.804166215Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cf5655c7b-ch7hd,Uid:495efd61-94a1-4706-8775-ab4f278dd3b6,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b606b94c710a39f7e2ca3db4cd41fec7948dbf0f3d340a813ef7679b00d65da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.805322 kubelet[3207]: E0515 12:36:03.804833 3207 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b606b94c710a39f7e2ca3db4cd41fec7948dbf0f3d340a813ef7679b00d65da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.805322 kubelet[3207]: E0515 12:36:03.804909 3207 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b606b94c710a39f7e2ca3db4cd41fec7948dbf0f3d340a813ef7679b00d65da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cf5655c7b-ch7hd" May 15 12:36:03.805322 kubelet[3207]: E0515 12:36:03.804934 3207 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b606b94c710a39f7e2ca3db4cd41fec7948dbf0f3d340a813ef7679b00d65da\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6cf5655c7b-ch7hd" May 15 12:36:03.805854 kubelet[3207]: E0515 12:36:03.805024 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6cf5655c7b-ch7hd_calico-apiserver(495efd61-94a1-4706-8775-ab4f278dd3b6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6cf5655c7b-ch7hd_calico-apiserver(495efd61-94a1-4706-8775-ab4f278dd3b6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b606b94c710a39f7e2ca3db4cd41fec7948dbf0f3d340a813ef7679b00d65da\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6cf5655c7b-ch7hd" podUID="495efd61-94a1-4706-8775-ab4f278dd3b6" May 15 12:36:03.819010 containerd[1571]: time="2025-05-15T12:36:03.818892809Z" level=error msg="Failed to destroy network for sandbox \"fcd9983cd99b82620446679d8c885a9fabd1225b13d148398b3f4ceffbba98ff\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.820356 containerd[1571]: time="2025-05-15T12:36:03.820302786Z" level=error msg="Failed to destroy network for sandbox \"414fe4156397d88d65e580e8c519d2a188e8c5c5850c50f0f0357086a5b93672\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.822504 containerd[1571]: time="2025-05-15T12:36:03.822459757Z" level=error msg="Failed to destroy network for sandbox \"b84df4719363a54308d1caa0a98b045fa2ce814302f40ab413d58d965e0f50bc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.822953 containerd[1571]: time="2025-05-15T12:36:03.822921794Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-8dkvq,Uid:f536eb8d-06b8-43f3-9327-01d9d6a4d759,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcd9983cd99b82620446679d8c885a9fabd1225b13d148398b3f4ceffbba98ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.823199 systemd[1]: run-netns-cni\x2dc32e0373\x2d3713\x2d47ed\x2d1642\x2dabcd63fd41c9.mount: Deactivated successfully. May 15 12:36:03.823301 systemd[1]: run-netns-cni\x2d2f17ad2d\x2ded28\x2dc607\x2d2d57\x2df1df124f4528.mount: Deactivated successfully. 
May 15 12:36:03.824195 kubelet[3207]: E0515 12:36:03.824115 3207 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcd9983cd99b82620446679d8c885a9fabd1225b13d148398b3f4ceffbba98ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.824340 containerd[1571]: time="2025-05-15T12:36:03.824232846Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s6gwc,Uid:06a220b5-1770-44be-8b97-a7f6c8e515a9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"414fe4156397d88d65e580e8c519d2a188e8c5c5850c50f0f0357086a5b93672\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.826470 kubelet[3207]: E0515 12:36:03.824411 3207 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414fe4156397d88d65e580e8c519d2a188e8c5c5850c50f0f0357086a5b93672\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.826470 kubelet[3207]: E0515 12:36:03.824443 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414fe4156397d88d65e580e8c519d2a188e8c5c5850c50f0f0357086a5b93672\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s6gwc" May 15 12:36:03.826470 
kubelet[3207]: E0515 12:36:03.824466 3207 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"414fe4156397d88d65e580e8c519d2a188e8c5c5850c50f0f0357086a5b93672\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-s6gwc" May 15 12:36:03.828680 kubelet[3207]: E0515 12:36:03.824518 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-s6gwc_kube-system(06a220b5-1770-44be-8b97-a7f6c8e515a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-s6gwc_kube-system(06a220b5-1770-44be-8b97-a7f6c8e515a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"414fe4156397d88d65e580e8c519d2a188e8c5c5850c50f0f0357086a5b93672\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-s6gwc" podUID="06a220b5-1770-44be-8b97-a7f6c8e515a9" May 15 12:36:03.828837 containerd[1571]: time="2025-05-15T12:36:03.826957301Z" level=error msg="Failed to destroy network for sandbox \"189dee09196733cf8e72849d85a7318c02f14d923b132c8c3e614334116b27ed\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.829536 kubelet[3207]: E0515 12:36:03.828372 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcd9983cd99b82620446679d8c885a9fabd1225b13d148398b3f4ceffbba98ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bffb6db8-8dkvq" May 15 12:36:03.829629 kubelet[3207]: E0515 12:36:03.829540 3207 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fcd9983cd99b82620446679d8c885a9fabd1225b13d148398b3f4ceffbba98ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bffb6db8-8dkvq" May 15 12:36:03.829629 kubelet[3207]: E0515 12:36:03.829595 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9bffb6db8-8dkvq_calico-apiserver(f536eb8d-06b8-43f3-9327-01d9d6a4d759)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9bffb6db8-8dkvq_calico-apiserver(f536eb8d-06b8-43f3-9327-01d9d6a4d759)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fcd9983cd99b82620446679d8c885a9fabd1225b13d148398b3f4ceffbba98ff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9bffb6db8-8dkvq" podUID="f536eb8d-06b8-43f3-9327-01d9d6a4d759" May 15 12:36:03.829966 containerd[1571]: time="2025-05-15T12:36:03.829889568Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8b9db6c54-8mmnz,Uid:0cea196a-c35a-4ed1-aba8-1434c056f4d9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b84df4719363a54308d1caa0a98b045fa2ce814302f40ab413d58d965e0f50bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.830767 kubelet[3207]: E0515 12:36:03.830571 3207 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b84df4719363a54308d1caa0a98b045fa2ce814302f40ab413d58d965e0f50bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.830767 kubelet[3207]: E0515 12:36:03.830604 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b84df4719363a54308d1caa0a98b045fa2ce814302f40ab413d58d965e0f50bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8b9db6c54-8mmnz" May 15 12:36:03.830767 kubelet[3207]: E0515 12:36:03.830618 3207 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b84df4719363a54308d1caa0a98b045fa2ce814302f40ab413d58d965e0f50bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8b9db6c54-8mmnz" May 15 12:36:03.830860 kubelet[3207]: E0515 12:36:03.830646 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8b9db6c54-8mmnz_calico-system(0cea196a-c35a-4ed1-aba8-1434c056f4d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8b9db6c54-8mmnz_calico-system(0cea196a-c35a-4ed1-aba8-1434c056f4d9)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"b84df4719363a54308d1caa0a98b045fa2ce814302f40ab413d58d965e0f50bc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8b9db6c54-8mmnz" podUID="0cea196a-c35a-4ed1-aba8-1434c056f4d9" May 15 12:36:03.831508 containerd[1571]: time="2025-05-15T12:36:03.831465877Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-psvtq,Uid:f7cfcf4a-92ec-4f4c-9f12-dafd70717cab,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"189dee09196733cf8e72849d85a7318c02f14d923b132c8c3e614334116b27ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.831608 systemd[1]: run-netns-cni\x2dc24a04d5\x2dc616\x2d29b9\x2d445f\x2d5016b249b263.mount: Deactivated successfully. May 15 12:36:03.831684 systemd[1]: run-netns-cni\x2d391a1caa\x2d964a\x2d4ec8\x2d6c4d\x2d21fb9d3ff117.mount: Deactivated successfully. 
May 15 12:36:03.832075 kubelet[3207]: E0515 12:36:03.831961 3207 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"189dee09196733cf8e72849d85a7318c02f14d923b132c8c3e614334116b27ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:03.832313 kubelet[3207]: E0515 12:36:03.832205 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"189dee09196733cf8e72849d85a7318c02f14d923b132c8c3e614334116b27ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bffb6db8-psvtq" May 15 12:36:03.832547 kubelet[3207]: E0515 12:36:03.832393 3207 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"189dee09196733cf8e72849d85a7318c02f14d923b132c8c3e614334116b27ed\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bffb6db8-psvtq" May 15 12:36:03.832547 kubelet[3207]: E0515 12:36:03.832614 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9bffb6db8-psvtq_calico-apiserver(f7cfcf4a-92ec-4f4c-9f12-dafd70717cab)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9bffb6db8-psvtq_calico-apiserver(f7cfcf4a-92ec-4f4c-9f12-dafd70717cab)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"189dee09196733cf8e72849d85a7318c02f14d923b132c8c3e614334116b27ed\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9bffb6db8-psvtq" podUID="f7cfcf4a-92ec-4f4c-9f12-dafd70717cab" May 15 12:36:04.127808 systemd[1]: Created slice kubepods-besteffort-pod530e456b_2944_4384_8744_d360e07d8aae.slice - libcontainer container kubepods-besteffort-pod530e456b_2944_4384_8744_d360e07d8aae.slice. May 15 12:36:04.130406 containerd[1571]: time="2025-05-15T12:36:04.130370751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2gvxv,Uid:530e456b-2944-4384-8744-d360e07d8aae,Namespace:calico-system,Attempt:0,}" May 15 12:36:04.171068 containerd[1571]: time="2025-05-15T12:36:04.171003200Z" level=error msg="Failed to destroy network for sandbox \"2915eb1487bd421e1967fafa91e7d7d253d75f15d6cf0cac0edc3e52b02be249\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:04.172147 containerd[1571]: time="2025-05-15T12:36:04.172108454Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2gvxv,Uid:530e456b-2944-4384-8744-d360e07d8aae,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2915eb1487bd421e1967fafa91e7d7d253d75f15d6cf0cac0edc3e52b02be249\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:04.172437 kubelet[3207]: E0515 12:36:04.172389 3207 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2915eb1487bd421e1967fafa91e7d7d253d75f15d6cf0cac0edc3e52b02be249\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 12:36:04.172659 kubelet[3207]: E0515 12:36:04.172447 3207 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2915eb1487bd421e1967fafa91e7d7d253d75f15d6cf0cac0edc3e52b02be249\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2gvxv" May 15 12:36:04.172659 kubelet[3207]: E0515 12:36:04.172465 3207 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2915eb1487bd421e1967fafa91e7d7d253d75f15d6cf0cac0edc3e52b02be249\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2gvxv" May 15 12:36:04.172659 kubelet[3207]: E0515 12:36:04.172500 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2gvxv_calico-system(530e456b-2944-4384-8744-d360e07d8aae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2gvxv_calico-system(530e456b-2944-4384-8744-d360e07d8aae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2915eb1487bd421e1967fafa91e7d7d253d75f15d6cf0cac0edc3e52b02be249\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2gvxv" 
podUID="530e456b-2944-4384-8744-d360e07d8aae" May 15 12:36:10.986093 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1319892662.mount: Deactivated successfully. May 15 12:36:11.019696 containerd[1571]: time="2025-05-15T12:36:11.019379391Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:11.021228 containerd[1571]: time="2025-05-15T12:36:11.020919221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 15 12:36:11.028604 containerd[1571]: time="2025-05-15T12:36:11.028578824Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:11.031535 containerd[1571]: time="2025-05-15T12:36:11.031459583Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:11.032201 containerd[1571]: time="2025-05-15T12:36:11.032179384Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 7.750080664s" May 15 12:36:11.032741 containerd[1571]: time="2025-05-15T12:36:11.032300613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 15 12:36:11.061441 containerd[1571]: time="2025-05-15T12:36:11.061384862Z" level=info msg="CreateContainer within sandbox 
\"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 15 12:36:11.084346 containerd[1571]: time="2025-05-15T12:36:11.082696127Z" level=info msg="Container 5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:11.087491 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4251811641.mount: Deactivated successfully. May 15 12:36:11.146625 containerd[1571]: time="2025-05-15T12:36:11.146567986Z" level=info msg="CreateContainer within sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\"" May 15 12:36:11.148223 containerd[1571]: time="2025-05-15T12:36:11.147362900Z" level=info msg="StartContainer for \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\"" May 15 12:36:11.149299 containerd[1571]: time="2025-05-15T12:36:11.149245965Z" level=info msg="connecting to shim 5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5" address="unix:///run/containerd/s/a32b1c3db32c6a823bd692a94942c573e83841159de206dcaa2a1f6aaa796632" protocol=ttrpc version=3 May 15 12:36:11.216453 systemd[1]: Started cri-containerd-5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5.scope - libcontainer container 5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5. May 15 12:36:11.276954 containerd[1571]: time="2025-05-15T12:36:11.276828532Z" level=info msg="StartContainer for \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" returns successfully" May 15 12:36:11.552892 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 15 12:36:11.553472 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 15 12:36:11.571754 containerd[1571]: time="2025-05-15T12:36:11.571694098Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"6f6bba469f74d30024ace0b732a05a95daca7968ed104077697fee7147c98c5f\" pid:4208 exit_status:1 exited_at:{seconds:1747312571 nanos:560849595}" May 15 12:36:12.359701 containerd[1571]: time="2025-05-15T12:36:12.359659351Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"5114b8ea3867c4ba51b742940021db249949bab0abc8b819503f381fdcc94475\" pid:4263 exit_status:1 exited_at:{seconds:1747312572 nanos:359314854}" May 15 12:36:13.255251 systemd-networkd[1471]: vxlan.calico: Link UP May 15 12:36:13.255275 systemd-networkd[1471]: vxlan.calico: Gained carrier May 15 12:36:13.375132 containerd[1571]: time="2025-05-15T12:36:13.375099751Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"757acc5da830930985015b77f0dca96be5358db887d850672063fd42094375a3\" pid:4450 exit_status:1 exited_at:{seconds:1747312573 nanos:374833892}" May 15 12:36:14.019378 containerd[1571]: time="2025-05-15T12:36:14.019341238Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"7e68bfc3a330aa64f4923ce322f17d34620ddc6a147820c47582d456e4410f98\" pid:4505 exit_status:1 exited_at:{seconds:1747312574 nanos:19057134}" May 15 12:36:14.123271 containerd[1571]: time="2025-05-15T12:36:14.123208711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-8dkvq,Uid:f536eb8d-06b8-43f3-9327-01d9d6a4d759,Namespace:calico-apiserver,Attempt:0,}" May 15 12:36:14.418724 systemd-networkd[1471]: cali07bd55d4895: Link UP May 15 12:36:14.419360 systemd-networkd[1471]: cali07bd55d4895: Gained carrier May 15 12:36:14.433788 
kubelet[3207]: I0515 12:36:14.432450 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-prt2t" podStartSLOduration=4.278846784 podStartE2EDuration="38.432432775s" podCreationTimestamp="2025-05-15 12:35:36 +0000 UTC" firstStartedPulling="2025-05-15 12:35:36.885609728 +0000 UTC m=+23.845408551" lastFinishedPulling="2025-05-15 12:36:11.039195718 +0000 UTC m=+57.998994542" observedRunningTime="2025-05-15 12:36:11.322157033 +0000 UTC m=+58.281955887" watchObservedRunningTime="2025-05-15 12:36:14.432432775 +0000 UTC m=+61.392231599" May 15 12:36:14.436208 containerd[1571]: 2025-05-15 12:36:14.188 [INFO][4518] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0 calico-apiserver-9bffb6db8- calico-apiserver f536eb8d-06b8-43f3-9327-01d9d6a4d759 746 0 2025-05-15 12:35:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9bffb6db8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 calico-apiserver-9bffb6db8-8dkvq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali07bd55d4895 [] []}} ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-" May 15 12:36:14.436208 containerd[1571]: 2025-05-15 12:36:14.189 [INFO][4518] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" 
WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:14.436208 containerd[1571]: 2025-05-15 12:36:14.357 [INFO][4529] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.377 [INFO][4529] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030fe80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334-0-0-a-dce95649a9", "pod":"calico-apiserver-9bffb6db8-8dkvq", "timestamp":"2025-05-15 12:36:14.357172455 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.377 [INFO][4529] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.377 [INFO][4529] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.377 [INFO][4529] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.379 [INFO][4529] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.386 [INFO][4529] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.391 [INFO][4529] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.393 [INFO][4529] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437162 containerd[1571]: 2025-05-15 12:36:14.396 [INFO][4529] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437806 containerd[1571]: 2025-05-15 12:36:14.396 [INFO][4529] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437806 containerd[1571]: 2025-05-15 12:36:14.398 [INFO][4529] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 May 15 12:36:14.437806 containerd[1571]: 2025-05-15 12:36:14.403 [INFO][4529] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437806 containerd[1571]: 2025-05-15 12:36:14.410 [INFO][4529] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.88.193/26] block=192.168.88.192/26 handle="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437806 containerd[1571]: 2025-05-15 12:36:14.410 [INFO][4529] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.193/26] handle="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:14.437806 containerd[1571]: 2025-05-15 12:36:14.410 [INFO][4529] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:14.437806 containerd[1571]: 2025-05-15 12:36:14.410 [INFO][4529] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.193/26] IPv6=[] ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:14.438762 containerd[1571]: 2025-05-15 12:36:14.412 [INFO][4518] cni-plugin/k8s.go 386: Populated endpoint ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0", GenerateName:"calico-apiserver-9bffb6db8-", Namespace:"calico-apiserver", SelfLink:"", UID:"f536eb8d-06b8-43f3-9327-01d9d6a4d759", ResourceVersion:"746", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"9bffb6db8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"calico-apiserver-9bffb6db8-8dkvq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali07bd55d4895", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:14.438825 containerd[1571]: 2025-05-15 12:36:14.413 [INFO][4518] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.193/32] ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:14.438825 containerd[1571]: 2025-05-15 12:36:14.413 [INFO][4518] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07bd55d4895 ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:14.438825 containerd[1571]: 2025-05-15 12:36:14.420 [INFO][4518] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" 
WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:14.438917 containerd[1571]: 2025-05-15 12:36:14.420 [INFO][4518] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0", GenerateName:"calico-apiserver-9bffb6db8-", Namespace:"calico-apiserver", SelfLink:"", UID:"f536eb8d-06b8-43f3-9327-01d9d6a4d759", ResourceVersion:"746", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bffb6db8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8", Pod:"calico-apiserver-9bffb6db8-8dkvq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali07bd55d4895", MAC:"3e:5e:29:fb:d1:1e", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:14.439151 containerd[1571]: 2025-05-15 12:36:14.431 [INFO][4518] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-8dkvq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:14.476577 containerd[1571]: time="2025-05-15T12:36:14.476516818Z" level=info msg="connecting to shim add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" address="unix:///run/containerd/s/ebdbb70bcca08bed7d39e3342d8fb68844f4ddb48fff95fdefc1f8c870cce150" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:14.519465 systemd[1]: Started cri-containerd-add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8.scope - libcontainer container add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8. May 15 12:36:14.571289 containerd[1571]: time="2025-05-15T12:36:14.571207812Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-8dkvq,Uid:f536eb8d-06b8-43f3-9327-01d9d6a4d759,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\"" May 15 12:36:14.572984 containerd[1571]: time="2025-05-15T12:36:14.572953078Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 12:36:15.124123 containerd[1571]: time="2025-05-15T12:36:15.124042231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cf5655c7b-ch7hd,Uid:495efd61-94a1-4706-8775-ab4f278dd3b6,Namespace:calico-apiserver,Attempt:0,}" May 15 12:36:15.226751 systemd-networkd[1471]: calib69612931f1: Link UP May 15 12:36:15.227468 systemd-networkd[1471]: calib69612931f1: Gained carrier May 15 12:36:15.243431 containerd[1571]: 2025-05-15 12:36:15.159 [INFO][4597] cni-plugin/plugin.go 340: 
Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0 calico-apiserver-6cf5655c7b- calico-apiserver 495efd61-94a1-4706-8775-ab4f278dd3b6 751 0 2025-05-15 12:35:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6cf5655c7b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 calico-apiserver-6cf5655c7b-ch7hd eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib69612931f1 [] []}} ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-" May 15 12:36:15.243431 containerd[1571]: 2025-05-15 12:36:15.159 [INFO][4597] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" May 15 12:36:15.243431 containerd[1571]: 2025-05-15 12:36:15.189 [INFO][4610] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" HandleID="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.196 [INFO][4610] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" 
HandleID="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000334440), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334-0-0-a-dce95649a9", "pod":"calico-apiserver-6cf5655c7b-ch7hd", "timestamp":"2025-05-15 12:36:15.189342559 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.197 [INFO][4610] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.197 [INFO][4610] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.197 [INFO][4610] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.199 [INFO][4610] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.202 [INFO][4610] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.206 [INFO][4610] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.208 [INFO][4610] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243617 containerd[1571]: 2025-05-15 12:36:15.210 [INFO][4610] ipam/ipam.go 232: Affinity 
is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243779 containerd[1571]: 2025-05-15 12:36:15.210 [INFO][4610] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243779 containerd[1571]: 2025-05-15 12:36:15.211 [INFO][4610] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7 May 15 12:36:15.243779 containerd[1571]: 2025-05-15 12:36:15.215 [INFO][4610] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243779 containerd[1571]: 2025-05-15 12:36:15.221 [INFO][4610] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.194/26] block=192.168.88.192/26 handle="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243779 containerd[1571]: 2025-05-15 12:36:15.221 [INFO][4610] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.194/26] handle="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:15.243779 containerd[1571]: 2025-05-15 12:36:15.221 [INFO][4610] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 12:36:15.243779 containerd[1571]: 2025-05-15 12:36:15.221 [INFO][4610] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.194/26] IPv6=[] ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" HandleID="k8s-pod-network.0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" May 15 12:36:15.243889 containerd[1571]: 2025-05-15 12:36:15.223 [INFO][4597] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0", GenerateName:"calico-apiserver-6cf5655c7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"495efd61-94a1-4706-8775-ab4f278dd3b6", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cf5655c7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"calico-apiserver-6cf5655c7b-ch7hd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.88.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib69612931f1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:15.244458 containerd[1571]: 2025-05-15 12:36:15.224 [INFO][4597] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.194/32] ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" May 15 12:36:15.244458 containerd[1571]: 2025-05-15 12:36:15.224 [INFO][4597] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib69612931f1 ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" May 15 12:36:15.244458 containerd[1571]: 2025-05-15 12:36:15.225 [INFO][4597] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" May 15 12:36:15.244527 containerd[1571]: 2025-05-15 12:36:15.225 [INFO][4597] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, 
ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0", GenerateName:"calico-apiserver-6cf5655c7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"495efd61-94a1-4706-8775-ab4f278dd3b6", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cf5655c7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7", Pod:"calico-apiserver-6cf5655c7b-ch7hd", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib69612931f1", MAC:"06:a5:65:40:86:db", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:15.244576 containerd[1571]: 2025-05-15 12:36:15.238 [INFO][4597] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-ch7hd" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--ch7hd-eth0" May 15 12:36:15.266274 containerd[1571]: time="2025-05-15T12:36:15.265776111Z" level=info msg="connecting to shim 
0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7" address="unix:///run/containerd/s/5059bf1800531beb0cecae590c558029208146bfbc8317399251103563719781" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:15.269795 systemd-networkd[1471]: vxlan.calico: Gained IPv6LL May 15 12:36:15.291510 systemd[1]: Started cri-containerd-0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7.scope - libcontainer container 0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7. May 15 12:36:15.335136 containerd[1571]: time="2025-05-15T12:36:15.335068374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cf5655c7b-ch7hd,Uid:495efd61-94a1-4706-8775-ab4f278dd3b6,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7\"" May 15 12:36:15.652610 systemd-networkd[1471]: cali07bd55d4895: Gained IPv6LL May 15 12:36:16.123719 containerd[1571]: time="2025-05-15T12:36:16.123536361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2gvxv,Uid:530e456b-2944-4384-8744-d360e07d8aae,Namespace:calico-system,Attempt:0,}" May 15 12:36:16.221083 systemd-networkd[1471]: cali2b1d175571d: Link UP May 15 12:36:16.221702 systemd-networkd[1471]: cali2b1d175571d: Gained carrier May 15 12:36:16.235924 containerd[1571]: 2025-05-15 12:36:16.157 [INFO][4677] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0 csi-node-driver- calico-system 530e456b-2944-4384-8744-d360e07d8aae 605 0 2025-05-15 12:35:36 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b7b4b9d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 
csi-node-driver-2gvxv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali2b1d175571d [] []}} ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-" May 15 12:36:16.235924 containerd[1571]: 2025-05-15 12:36:16.157 [INFO][4677] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" May 15 12:36:16.235924 containerd[1571]: 2025-05-15 12:36:16.184 [INFO][4689] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" HandleID="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Workload="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.193 [INFO][4689] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" HandleID="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Workload="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003350f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334-0-0-a-dce95649a9", "pod":"csi-node-driver-2gvxv", "timestamp":"2025-05-15 12:36:16.184181278 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:16.236137 containerd[1571]: 2025-05-15 
12:36:16.193 [INFO][4689] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.193 [INFO][4689] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.193 [INFO][4689] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.195 [INFO][4689] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.199 [INFO][4689] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.202 [INFO][4689] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.204 [INFO][4689] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.236137 containerd[1571]: 2025-05-15 12:36:16.206 [INFO][4689] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.237861 containerd[1571]: 2025-05-15 12:36:16.206 [INFO][4689] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.237861 containerd[1571]: 2025-05-15 12:36:16.207 [INFO][4689] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83 May 15 12:36:16.237861 containerd[1571]: 2025-05-15 12:36:16.212 [INFO][4689] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 
handle="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.237861 containerd[1571]: 2025-05-15 12:36:16.216 [INFO][4689] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.195/26] block=192.168.88.192/26 handle="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.237861 containerd[1571]: 2025-05-15 12:36:16.216 [INFO][4689] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.195/26] handle="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:16.237861 containerd[1571]: 2025-05-15 12:36:16.216 [INFO][4689] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:16.237861 containerd[1571]: 2025-05-15 12:36:16.216 [INFO][4689] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.195/26] IPv6=[] ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" HandleID="k8s-pod-network.fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Workload="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" May 15 12:36:16.238396 containerd[1571]: 2025-05-15 12:36:16.218 [INFO][4677] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"530e456b-2944-4384-8744-d360e07d8aae", ResourceVersion:"605", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"csi-node-driver-2gvxv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2b1d175571d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:16.238453 containerd[1571]: 2025-05-15 12:36:16.218 [INFO][4677] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.195/32] ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" May 15 12:36:16.238453 containerd[1571]: 2025-05-15 12:36:16.218 [INFO][4677] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b1d175571d ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" May 15 12:36:16.238453 containerd[1571]: 2025-05-15 12:36:16.221 [INFO][4677] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" May 15 12:36:16.238505 containerd[1571]: 2025-05-15 12:36:16.222 [INFO][4677] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"530e456b-2944-4384-8744-d360e07d8aae", ResourceVersion:"605", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83", Pod:"csi-node-driver-2gvxv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2b1d175571d", MAC:"ae:ff:32:b1:d4:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:16.239377 containerd[1571]: 2025-05-15 12:36:16.232 [INFO][4677] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" Namespace="calico-system" Pod="csi-node-driver-2gvxv" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-csi--node--driver--2gvxv-eth0" May 15 12:36:16.263083 containerd[1571]: time="2025-05-15T12:36:16.262455164Z" level=info msg="connecting to shim fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83" address="unix:///run/containerd/s/234b5e5ec00e1b6e6ba7b26814c525a9e6271e5ba24bac9acc45a7194dbe8b8a" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:16.289613 systemd[1]: Started cri-containerd-fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83.scope - libcontainer container fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83. 
May 15 12:36:16.347585 containerd[1571]: time="2025-05-15T12:36:16.347548585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2gvxv,Uid:530e456b-2944-4384-8744-d360e07d8aae,Namespace:calico-system,Attempt:0,} returns sandbox id \"fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83\"" May 15 12:36:16.356547 systemd-networkd[1471]: calib69612931f1: Gained IPv6LL May 15 12:36:17.123773 containerd[1571]: time="2025-05-15T12:36:17.123711701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-psvtq,Uid:f7cfcf4a-92ec-4f4c-9f12-dafd70717cab,Namespace:calico-apiserver,Attempt:0,}" May 15 12:36:17.124465 containerd[1571]: time="2025-05-15T12:36:17.124131539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s6gwc,Uid:06a220b5-1770-44be-8b97-a7f6c8e515a9,Namespace:kube-system,Attempt:0,}" May 15 12:36:17.247630 systemd-networkd[1471]: cali8ec66ff2179: Link UP May 15 12:36:17.249834 systemd-networkd[1471]: cali8ec66ff2179: Gained carrier May 15 12:36:17.261533 containerd[1571]: 2025-05-15 12:36:17.174 [INFO][4752] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0 calico-apiserver-9bffb6db8- calico-apiserver f7cfcf4a-92ec-4f4c-9f12-dafd70717cab 747 0 2025-05-15 12:35:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9bffb6db8 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 calico-apiserver-9bffb6db8-psvtq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8ec66ff2179 [] []}} ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" 
Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-" May 15 12:36:17.261533 containerd[1571]: 2025-05-15 12:36:17.175 [INFO][4752] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:17.261533 containerd[1571]: 2025-05-15 12:36:17.204 [INFO][4777] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.214 [INFO][4777] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002842a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334-0-0-a-dce95649a9", "pod":"calico-apiserver-9bffb6db8-psvtq", "timestamp":"2025-05-15 12:36:17.204890059 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.214 [INFO][4777] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.215 [INFO][4777] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.215 [INFO][4777] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.216 [INFO][4777] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.220 [INFO][4777] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.224 [INFO][4777] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.227 [INFO][4777] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.262446 containerd[1571]: 2025-05-15 12:36:17.230 [INFO][4777] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.263514 containerd[1571]: 2025-05-15 12:36:17.230 [INFO][4777] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.263514 containerd[1571]: 2025-05-15 12:36:17.231 [INFO][4777] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 May 15 12:36:17.263514 containerd[1571]: 2025-05-15 12:36:17.235 [INFO][4777] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" 
host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.263514 containerd[1571]: 2025-05-15 12:36:17.240 [INFO][4777] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.196/26] block=192.168.88.192/26 handle="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.263514 containerd[1571]: 2025-05-15 12:36:17.240 [INFO][4777] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.196/26] handle="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.263514 containerd[1571]: 2025-05-15 12:36:17.240 [INFO][4777] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:17.263514 containerd[1571]: 2025-05-15 12:36:17.240 [INFO][4777] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.196/26] IPv6=[] ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:17.263629 containerd[1571]: 2025-05-15 12:36:17.243 [INFO][4752] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0", GenerateName:"calico-apiserver-9bffb6db8-", Namespace:"calico-apiserver", SelfLink:"", UID:"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab", ResourceVersion:"747", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, 
DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bffb6db8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"calico-apiserver-9bffb6db8-psvtq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8ec66ff2179", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:17.263679 containerd[1571]: 2025-05-15 12:36:17.243 [INFO][4752] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.196/32] ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:17.263679 containerd[1571]: 2025-05-15 12:36:17.243 [INFO][4752] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8ec66ff2179 ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:17.263679 containerd[1571]: 2025-05-15 12:36:17.250 [INFO][4752] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:17.263772 containerd[1571]: 2025-05-15 12:36:17.250 [INFO][4752] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0", GenerateName:"calico-apiserver-9bffb6db8-", Namespace:"calico-apiserver", SelfLink:"", UID:"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab", ResourceVersion:"747", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bffb6db8", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664", Pod:"calico-apiserver-9bffb6db8-psvtq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8ec66ff2179", MAC:"3e:cb:08:74:35:ff", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:17.263821 containerd[1571]: 2025-05-15 12:36:17.260 [INFO][4752] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Namespace="calico-apiserver" Pod="calico-apiserver-9bffb6db8-psvtq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:17.306421 systemd-networkd[1471]: cali83fb3c70dde: Link UP May 15 12:36:17.308023 systemd-networkd[1471]: cali83fb3c70dde: Gained carrier May 15 12:36:17.310521 containerd[1571]: time="2025-05-15T12:36:17.310455026Z" level=info msg="connecting to shim 9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" address="unix:///run/containerd/s/66a8cd14ac458f7422b7da83515afeceeb880bee838e2d09fd76b4b68efd7312" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:17.325912 containerd[1571]: 2025-05-15 12:36:17.174 [INFO][4759] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0 coredns-7db6d8ff4d- kube-system 06a220b5-1770-44be-8b97-a7f6c8e515a9 744 0 2025-05-15 12:35:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 coredns-7db6d8ff4d-s6gwc eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali83fb3c70dde [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" 
WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-" May 15 12:36:17.325912 containerd[1571]: 2025-05-15 12:36:17.174 [INFO][4759] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" May 15 12:36:17.325912 containerd[1571]: 2025-05-15 12:36:17.206 [INFO][4776] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" HandleID="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Workload="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.215 [INFO][4776] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" HandleID="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Workload="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051bf0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334-0-0-a-dce95649a9", "pod":"coredns-7db6d8ff4d-s6gwc", "timestamp":"2025-05-15 12:36:17.205989813 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.215 [INFO][4776] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.240 [INFO][4776] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.240 [INFO][4776] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.243 [INFO][4776] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.249 [INFO][4776] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.257 [INFO][4776] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.264 [INFO][4776] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.326417 containerd[1571]: 2025-05-15 12:36:17.268 [INFO][4776] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.327050 containerd[1571]: 2025-05-15 12:36:17.268 [INFO][4776] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.327050 containerd[1571]: 2025-05-15 12:36:17.270 [INFO][4776] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba May 15 12:36:17.327050 containerd[1571]: 2025-05-15 12:36:17.280 [INFO][4776] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.327050 containerd[1571]: 2025-05-15 12:36:17.290 [INFO][4776] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.88.197/26] block=192.168.88.192/26 handle="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.327050 containerd[1571]: 2025-05-15 12:36:17.290 [INFO][4776] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.197/26] handle="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:17.327050 containerd[1571]: 2025-05-15 12:36:17.290 [INFO][4776] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:17.327050 containerd[1571]: 2025-05-15 12:36:17.290 [INFO][4776] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.197/26] IPv6=[] ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" HandleID="k8s-pod-network.45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Workload="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" May 15 12:36:17.327527 containerd[1571]: 2025-05-15 12:36:17.297 [INFO][4759] cni-plugin/k8s.go 386: Populated endpoint ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"06a220b5-1770-44be-8b97-a7f6c8e515a9", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"coredns-7db6d8ff4d-s6gwc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali83fb3c70dde", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:17.327527 containerd[1571]: 2025-05-15 12:36:17.297 [INFO][4759] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.197/32] ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" May 15 12:36:17.327527 containerd[1571]: 2025-05-15 12:36:17.297 [INFO][4759] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali83fb3c70dde ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" May 15 12:36:17.327527 containerd[1571]: 2025-05-15 12:36:17.304 [INFO][4759] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" May 15 12:36:17.327527 containerd[1571]: 2025-05-15 12:36:17.304 [INFO][4759] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"06a220b5-1770-44be-8b97-a7f6c8e515a9", ResourceVersion:"744", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba", Pod:"coredns-7db6d8ff4d-s6gwc", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali83fb3c70dde", MAC:"6a:50:dd:00:62:81", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:17.327527 containerd[1571]: 2025-05-15 12:36:17.322 [INFO][4759] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" Namespace="kube-system" Pod="coredns-7db6d8ff4d-s6gwc" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--s6gwc-eth0" May 15 12:36:17.347630 systemd[1]: Started cri-containerd-9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664.scope - libcontainer container 9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664. May 15 12:36:17.365958 containerd[1571]: time="2025-05-15T12:36:17.365888206Z" level=info msg="connecting to shim 45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba" address="unix:///run/containerd/s/a20bd5ee72f6343b1ee71c0763cee56d4ff628444d704238daa3f3c3156cf454" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:17.389442 systemd[1]: Started cri-containerd-45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba.scope - libcontainer container 45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba. 
May 15 12:36:17.424974 containerd[1571]: time="2025-05-15T12:36:17.424945791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bffb6db8-psvtq,Uid:f7cfcf4a-92ec-4f4c-9f12-dafd70717cab,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\"" May 15 12:36:17.437606 containerd[1571]: time="2025-05-15T12:36:17.437539897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-s6gwc,Uid:06a220b5-1770-44be-8b97-a7f6c8e515a9,Namespace:kube-system,Attempt:0,} returns sandbox id \"45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba\"" May 15 12:36:17.441310 containerd[1571]: time="2025-05-15T12:36:17.441189350Z" level=info msg="CreateContainer within sandbox \"45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 12:36:17.456061 containerd[1571]: time="2025-05-15T12:36:17.456022690Z" level=info msg="Container 8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:17.460556 containerd[1571]: time="2025-05-15T12:36:17.460518532Z" level=info msg="CreateContainer within sandbox \"45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390\"" May 15 12:36:17.461161 containerd[1571]: time="2025-05-15T12:36:17.461135961Z" level=info msg="StartContainer for \"8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390\"" May 15 12:36:17.462621 containerd[1571]: time="2025-05-15T12:36:17.462589931Z" level=info msg="connecting to shim 8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390" address="unix:///run/containerd/s/a20bd5ee72f6343b1ee71c0763cee56d4ff628444d704238daa3f3c3156cf454" protocol=ttrpc version=3 May 15 12:36:17.479474 systemd[1]: 
Started cri-containerd-8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390.scope - libcontainer container 8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390. May 15 12:36:17.505829 containerd[1571]: time="2025-05-15T12:36:17.505798919Z" level=info msg="StartContainer for \"8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390\" returns successfully" May 15 12:36:18.084623 systemd-networkd[1471]: cali2b1d175571d: Gained IPv6LL May 15 12:36:18.123983 containerd[1571]: time="2025-05-15T12:36:18.123952880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8b9db6c54-8mmnz,Uid:0cea196a-c35a-4ed1-aba8-1434c056f4d9,Namespace:calico-system,Attempt:0,}" May 15 12:36:18.124630 containerd[1571]: time="2025-05-15T12:36:18.124046776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krh5f,Uid:6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287,Namespace:kube-system,Attempt:0,}" May 15 12:36:18.267168 systemd-networkd[1471]: cali2e35093a7b1: Link UP May 15 12:36:18.269071 systemd-networkd[1471]: cali2e35093a7b1: Gained carrier May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.182 [INFO][4939] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0 calico-kube-controllers-8b9db6c54- calico-system 0cea196a-c35a-4ed1-aba8-1434c056f4d9 750 0 2025-05-15 12:35:36 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8b9db6c54 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 calico-kube-controllers-8b9db6c54-8mmnz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2e35093a7b1 [] []}} 
ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.183 [INFO][4939] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.215 [INFO][4967] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.223 [INFO][4967] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002907e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334-0-0-a-dce95649a9", "pod":"calico-kube-controllers-8b9db6c54-8mmnz", "timestamp":"2025-05-15 12:36:18.215055131 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:18.301033 containerd[1571]: 2025-05-15 
12:36:18.223 [INFO][4967] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.223 [INFO][4967] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.223 [INFO][4967] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.225 [INFO][4967] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.230 [INFO][4967] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.234 [INFO][4967] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.236 [INFO][4967] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.239 [INFO][4967] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.239 [INFO][4967] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.241 [INFO][4967] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.245 [INFO][4967] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 
handle="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.255 [INFO][4967] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.198/26] block=192.168.88.192/26 handle="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.256 [INFO][4967] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.198/26] handle="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.256 [INFO][4967] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:18.301033 containerd[1571]: 2025-05-15 12:36:18.256 [INFO][4967] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.198/26] IPv6=[] ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:36:18.306987 containerd[1571]: 2025-05-15 12:36:18.261 [INFO][4939] cni-plugin/k8s.go 386: Populated endpoint ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0", GenerateName:"calico-kube-controllers-8b9db6c54-", Namespace:"calico-system", SelfLink:"", UID:"0cea196a-c35a-4ed1-aba8-1434c056f4d9", 
ResourceVersion:"750", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8b9db6c54", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"calico-kube-controllers-8b9db6c54-8mmnz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2e35093a7b1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:18.306987 containerd[1571]: 2025-05-15 12:36:18.261 [INFO][4939] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.198/32] ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:36:18.306987 containerd[1571]: 2025-05-15 12:36:18.262 [INFO][4939] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2e35093a7b1 ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:36:18.306987 containerd[1571]: 
2025-05-15 12:36:18.270 [INFO][4939] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:36:18.306987 containerd[1571]: 2025-05-15 12:36:18.270 [INFO][4939] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0", GenerateName:"calico-kube-controllers-8b9db6c54-", Namespace:"calico-system", SelfLink:"", UID:"0cea196a-c35a-4ed1-aba8-1434c056f4d9", ResourceVersion:"750", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8b9db6c54", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006", Pod:"calico-kube-controllers-8b9db6c54-8mmnz", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2e35093a7b1", MAC:"86:da:43:b1:fe:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:18.306987 containerd[1571]: 2025-05-15 12:36:18.290 [INFO][4939] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Namespace="calico-system" Pod="calico-kube-controllers-8b9db6c54-8mmnz" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:36:18.301272 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3818520320.mount: Deactivated successfully. May 15 12:36:18.338553 systemd-networkd[1471]: cali66e4b893d86: Link UP May 15 12:36:18.341782 systemd-networkd[1471]: cali66e4b893d86: Gained carrier May 15 12:36:18.384107 containerd[1571]: time="2025-05-15T12:36:18.384059357Z" level=info msg="connecting to shim 565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" address="unix:///run/containerd/s/9993a97c8181b5a4fb6794a0eeb35dc2d1393b8c0285d79783dd02f595c685a1" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.186 [INFO][4941] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0 coredns-7db6d8ff4d- kube-system 6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287 748 0 2025-05-15 12:35:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 coredns-7db6d8ff4d-krh5f eth0 coredns [] [] [kns.kube-system 
ksa.kube-system.coredns] cali66e4b893d86 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.187 [INFO][4941] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.216 [INFO][4972] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" HandleID="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Workload="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.226 [INFO][4972] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" HandleID="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Workload="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b2d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334-0-0-a-dce95649a9", "pod":"coredns-7db6d8ff4d-krh5f", "timestamp":"2025-05-15 12:36:18.21665296 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.226 
[INFO][4972] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.256 [INFO][4972] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.256 [INFO][4972] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.260 [INFO][4972] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.269 [INFO][4972] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.277 [INFO][4972] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.292 [INFO][4972] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.303 [INFO][4972] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.303 [INFO][4972] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.307 [INFO][4972] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510 May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.314 [INFO][4972] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 
handle="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.329 [INFO][4972] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.199/26] block=192.168.88.192/26 handle="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.329 [INFO][4972] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.199/26] handle="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.329 [INFO][4972] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:18.385860 containerd[1571]: 2025-05-15 12:36:18.329 [INFO][4972] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.199/26] IPv6=[] ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" HandleID="k8s-pod-network.5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Workload="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" May 15 12:36:18.387044 containerd[1571]: 2025-05-15 12:36:18.334 [INFO][4941] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287", ResourceVersion:"748", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 28, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"coredns-7db6d8ff4d-krh5f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali66e4b893d86", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:18.387044 containerd[1571]: 2025-05-15 12:36:18.334 [INFO][4941] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.199/32] ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" May 15 12:36:18.387044 containerd[1571]: 2025-05-15 12:36:18.334 [INFO][4941] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali66e4b893d86 ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" 
WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" May 15 12:36:18.387044 containerd[1571]: 2025-05-15 12:36:18.340 [INFO][4941] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" May 15 12:36:18.387044 containerd[1571]: 2025-05-15 12:36:18.341 [INFO][4941] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287", ResourceVersion:"748", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 35, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510", Pod:"coredns-7db6d8ff4d-krh5f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.199/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali66e4b893d86", MAC:"52:b5:bb:d3:cf:45", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:18.387044 containerd[1571]: 2025-05-15 12:36:18.367 [INFO][4941] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krh5f" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-coredns--7db6d8ff4d--krh5f-eth0" May 15 12:36:18.423459 systemd[1]: Started cri-containerd-565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006.scope - libcontainer container 565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006. 
May 15 12:36:18.442046 kubelet[3207]: I0515 12:36:18.440654 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-s6gwc" podStartSLOduration=50.440023784 podStartE2EDuration="50.440023784s" podCreationTimestamp="2025-05-15 12:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:36:18.419262273 +0000 UTC m=+65.379061117" watchObservedRunningTime="2025-05-15 12:36:18.440023784 +0000 UTC m=+65.399822608" May 15 12:36:18.460419 containerd[1571]: time="2025-05-15T12:36:18.460207401Z" level=info msg="connecting to shim 5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510" address="unix:///run/containerd/s/911d7156ab6fcbf9411739b66640cf0c839cc2ad01fe83139c41d963b7e3658b" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:18.501475 systemd[1]: Started cri-containerd-5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510.scope - libcontainer container 5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510. 
May 15 12:36:18.564579 containerd[1571]: time="2025-05-15T12:36:18.564072273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8b9db6c54-8mmnz,Uid:0cea196a-c35a-4ed1-aba8-1434c056f4d9,Namespace:calico-system,Attempt:0,} returns sandbox id \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\"" May 15 12:36:18.577113 containerd[1571]: time="2025-05-15T12:36:18.577045060Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krh5f,Uid:6f5a1d6d-abb2-4c77-8b9b-cd49f30f6287,Namespace:kube-system,Attempt:0,} returns sandbox id \"5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510\"" May 15 12:36:18.582352 containerd[1571]: time="2025-05-15T12:36:18.582302702Z" level=info msg="CreateContainer within sandbox \"5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 12:36:18.592265 containerd[1571]: time="2025-05-15T12:36:18.592153768Z" level=info msg="Container ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:18.598101 containerd[1571]: time="2025-05-15T12:36:18.598061771Z" level=info msg="CreateContainer within sandbox \"5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7\"" May 15 12:36:18.598993 containerd[1571]: time="2025-05-15T12:36:18.598964104Z" level=info msg="StartContainer for \"ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7\"" May 15 12:36:18.610805 containerd[1571]: time="2025-05-15T12:36:18.610761426Z" level=info msg="connecting to shim ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7" address="unix:///run/containerd/s/911d7156ab6fcbf9411739b66640cf0c839cc2ad01fe83139c41d963b7e3658b" protocol=ttrpc version=3 May 15 12:36:18.641457 systemd[1]: 
Started cri-containerd-ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7.scope - libcontainer container ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7. May 15 12:36:18.685480 containerd[1571]: time="2025-05-15T12:36:18.685438777Z" level=info msg="StartContainer for \"ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7\" returns successfully" May 15 12:36:19.108527 systemd-networkd[1471]: cali83fb3c70dde: Gained IPv6LL May 15 12:36:19.172514 systemd-networkd[1471]: cali8ec66ff2179: Gained IPv6LL May 15 12:36:19.205006 containerd[1571]: time="2025-05-15T12:36:19.204845295Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:19.205006 containerd[1571]: time="2025-05-15T12:36:19.204934812Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 15 12:36:19.214045 containerd[1571]: time="2025-05-15T12:36:19.213993470Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:19.215616 containerd[1571]: time="2025-05-15T12:36:19.215500870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:19.218524 containerd[1571]: time="2025-05-15T12:36:19.218486627Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 4.64190714s" May 15 12:36:19.218591 
containerd[1571]: time="2025-05-15T12:36:19.218524648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 12:36:19.220012 containerd[1571]: time="2025-05-15T12:36:19.219965143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 12:36:19.221415 containerd[1571]: time="2025-05-15T12:36:19.220708448Z" level=info msg="CreateContainer within sandbox \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 12:36:19.227382 containerd[1571]: time="2025-05-15T12:36:19.227352243Z" level=info msg="Container b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:19.241043 containerd[1571]: time="2025-05-15T12:36:19.241004255Z" level=info msg="CreateContainer within sandbox \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\"" May 15 12:36:19.241456 containerd[1571]: time="2025-05-15T12:36:19.241434242Z" level=info msg="StartContainer for \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\"" May 15 12:36:19.242179 containerd[1571]: time="2025-05-15T12:36:19.242146930Z" level=info msg="connecting to shim b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6" address="unix:///run/containerd/s/ebdbb70bcca08bed7d39e3342d8fb68844f4ddb48fff95fdefc1f8c870cce150" protocol=ttrpc version=3 May 15 12:36:19.263709 systemd[1]: Started cri-containerd-b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6.scope - libcontainer container b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6. 
May 15 12:36:19.326673 containerd[1571]: time="2025-05-15T12:36:19.326636473Z" level=info msg="StartContainer for \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" returns successfully" May 15 12:36:19.433301 kubelet[3207]: I0515 12:36:19.433236 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9bffb6db8-8dkvq" podStartSLOduration=38.786727791 podStartE2EDuration="43.433218026s" podCreationTimestamp="2025-05-15 12:35:36 +0000 UTC" firstStartedPulling="2025-05-15 12:36:14.572736752 +0000 UTC m=+61.532535576" lastFinishedPulling="2025-05-15 12:36:19.219226987 +0000 UTC m=+66.179025811" observedRunningTime="2025-05-15 12:36:19.43192538 +0000 UTC m=+66.391724204" watchObservedRunningTime="2025-05-15 12:36:19.433218026 +0000 UTC m=+66.393016850" May 15 12:36:19.448025 kubelet[3207]: I0515 12:36:19.447281 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-krh5f" podStartSLOduration=51.447262706 podStartE2EDuration="51.447262706s" podCreationTimestamp="2025-05-15 12:35:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:36:19.446200502 +0000 UTC m=+66.405999326" watchObservedRunningTime="2025-05-15 12:36:19.447262706 +0000 UTC m=+66.407061530" May 15 12:36:19.876611 systemd-networkd[1471]: cali2e35093a7b1: Gained IPv6LL May 15 12:36:20.196617 systemd-networkd[1471]: cali66e4b893d86: Gained IPv6LL May 15 12:36:20.426106 kubelet[3207]: I0515 12:36:20.426013 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:36:21.167480 containerd[1571]: time="2025-05-15T12:36:21.167372599Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:21.168939 containerd[1571]: time="2025-05-15T12:36:21.168878697Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 15 12:36:21.172201 containerd[1571]: time="2025-05-15T12:36:21.172148094Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 1.952148677s" May 15 12:36:21.172201 containerd[1571]: time="2025-05-15T12:36:21.172195604Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 12:36:21.174510 containerd[1571]: time="2025-05-15T12:36:21.174465466Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 15 12:36:21.177117 containerd[1571]: time="2025-05-15T12:36:21.177020633Z" level=info msg="CreateContainer within sandbox \"0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 12:36:21.193778 containerd[1571]: time="2025-05-15T12:36:21.192635781Z" level=info msg="Container 84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:21.204026 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount752799863.mount: Deactivated successfully. 
May 15 12:36:21.207485 containerd[1571]: time="2025-05-15T12:36:21.207419188Z" level=info msg="CreateContainer within sandbox \"0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311\"" May 15 12:36:21.208497 containerd[1571]: time="2025-05-15T12:36:21.208431458Z" level=info msg="StartContainer for \"84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311\"" May 15 12:36:21.210297 containerd[1571]: time="2025-05-15T12:36:21.210223111Z" level=info msg="connecting to shim 84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311" address="unix:///run/containerd/s/5059bf1800531beb0cecae590c558029208146bfbc8317399251103563719781" protocol=ttrpc version=3 May 15 12:36:21.232441 systemd[1]: Started cri-containerd-84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311.scope - libcontainer container 84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311. 
May 15 12:36:21.278287 containerd[1571]: time="2025-05-15T12:36:21.278204348Z" level=info msg="StartContainer for \"84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311\" returns successfully" May 15 12:36:21.447435 kubelet[3207]: I0515 12:36:21.447008 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6cf5655c7b-ch7hd" podStartSLOduration=39.609573135 podStartE2EDuration="45.446988175s" podCreationTimestamp="2025-05-15 12:35:36 +0000 UTC" firstStartedPulling="2025-05-15 12:36:15.336459937 +0000 UTC m=+62.296258761" lastFinishedPulling="2025-05-15 12:36:21.173874947 +0000 UTC m=+68.133673801" observedRunningTime="2025-05-15 12:36:21.446494859 +0000 UTC m=+68.406293712" watchObservedRunningTime="2025-05-15 12:36:21.446988175 +0000 UTC m=+68.406787009" May 15 12:36:22.433248 kubelet[3207]: I0515 12:36:22.433208 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:36:28.756507 containerd[1571]: time="2025-05-15T12:36:28.756438363Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:28.757504 containerd[1571]: time="2025-05-15T12:36:28.757470922Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 15 12:36:28.758385 containerd[1571]: time="2025-05-15T12:36:28.758317511Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:28.760368 containerd[1571]: time="2025-05-15T12:36:28.760272462Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:28.761002 containerd[1571]: time="2025-05-15T12:36:28.760729499Z" 
level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 7.586212717s" May 15 12:36:28.761002 containerd[1571]: time="2025-05-15T12:36:28.760751511Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 15 12:36:28.761910 containerd[1571]: time="2025-05-15T12:36:28.761886481Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 12:36:28.763357 containerd[1571]: time="2025-05-15T12:36:28.762922466Z" level=info msg="CreateContainer within sandbox \"fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 15 12:36:28.794349 containerd[1571]: time="2025-05-15T12:36:28.793756326Z" level=info msg="Container 48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:28.812679 containerd[1571]: time="2025-05-15T12:36:28.812641121Z" level=info msg="CreateContainer within sandbox \"fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3\"" May 15 12:36:28.813476 containerd[1571]: time="2025-05-15T12:36:28.813314585Z" level=info msg="StartContainer for \"48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3\"" May 15 12:36:28.814906 containerd[1571]: time="2025-05-15T12:36:28.814872370Z" level=info msg="connecting to shim 48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3" 
address="unix:///run/containerd/s/234b5e5ec00e1b6e6ba7b26814c525a9e6271e5ba24bac9acc45a7194dbe8b8a" protocol=ttrpc version=3 May 15 12:36:28.842457 systemd[1]: Started cri-containerd-48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3.scope - libcontainer container 48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3. May 15 12:36:28.877813 containerd[1571]: time="2025-05-15T12:36:28.877773374Z" level=info msg="StartContainer for \"48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3\" returns successfully" May 15 12:36:29.551596 kubelet[3207]: I0515 12:36:29.551368 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:36:29.678745 kubelet[3207]: I0515 12:36:29.678700 3207 topology_manager.go:215] "Topology Admit Handler" podUID="a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d" podNamespace="calico-apiserver" podName="calico-apiserver-6cf5655c7b-vmq6z" May 15 12:36:29.709648 systemd[1]: Created slice kubepods-besteffort-poda2ec9d1e_6ef3_474f_ab37_4af478fa1e3d.slice - libcontainer container kubepods-besteffort-poda2ec9d1e_6ef3_474f_ab37_4af478fa1e3d.slice. 
May 15 12:36:29.823888 containerd[1571]: time="2025-05-15T12:36:29.823758747Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:29.825422 containerd[1571]: time="2025-05-15T12:36:29.825393977Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 15 12:36:29.827880 containerd[1571]: time="2025-05-15T12:36:29.827842775Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 1.065925906s" May 15 12:36:29.827963 containerd[1571]: time="2025-05-15T12:36:29.827879984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 12:36:29.829044 containerd[1571]: time="2025-05-15T12:36:29.828991902Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 15 12:36:29.831524 containerd[1571]: time="2025-05-15T12:36:29.831274938Z" level=info msg="CreateContainer within sandbox \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 12:36:29.839356 containerd[1571]: time="2025-05-15T12:36:29.839101792Z" level=info msg="Container c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:29.857189 kubelet[3207]: I0515 12:36:29.857123 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk9kn\" (UniqueName: 
\"kubernetes.io/projected/a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d-kube-api-access-zk9kn\") pod \"calico-apiserver-6cf5655c7b-vmq6z\" (UID: \"a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d\") " pod="calico-apiserver/calico-apiserver-6cf5655c7b-vmq6z" May 15 12:36:29.857548 kubelet[3207]: I0515 12:36:29.857505 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d-calico-apiserver-certs\") pod \"calico-apiserver-6cf5655c7b-vmq6z\" (UID: \"a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d\") " pod="calico-apiserver/calico-apiserver-6cf5655c7b-vmq6z" May 15 12:36:29.858814 containerd[1571]: time="2025-05-15T12:36:29.858774356Z" level=info msg="CreateContainer within sandbox \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\"" May 15 12:36:29.859241 containerd[1571]: time="2025-05-15T12:36:29.859192851Z" level=info msg="StartContainer for \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\"" May 15 12:36:29.860371 containerd[1571]: time="2025-05-15T12:36:29.860315679Z" level=info msg="connecting to shim c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef" address="unix:///run/containerd/s/66a8cd14ac458f7422b7da83515afeceeb880bee838e2d09fd76b4b68efd7312" protocol=ttrpc version=3 May 15 12:36:29.877449 systemd[1]: Started cri-containerd-c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef.scope - libcontainer container c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef. 
May 15 12:36:29.940842 containerd[1571]: time="2025-05-15T12:36:29.940750712Z" level=info msg="StartContainer for \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" returns successfully" May 15 12:36:30.013547 containerd[1571]: time="2025-05-15T12:36:30.013510545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cf5655c7b-vmq6z,Uid:a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d,Namespace:calico-apiserver,Attempt:0,}" May 15 12:36:30.162696 systemd-networkd[1471]: cali110ad41eaef: Link UP May 15 12:36:30.163704 systemd-networkd[1471]: cali110ad41eaef: Gained carrier May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.085 [INFO][5302] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0 calico-apiserver-6cf5655c7b- calico-apiserver a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d 912 0 2025-05-15 12:36:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6cf5655c7b projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 calico-apiserver-6cf5655c7b-vmq6z eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali110ad41eaef [] []}} ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.086 [INFO][5302] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" 
WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.122 [INFO][5315] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" HandleID="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.130 [INFO][5315] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" HandleID="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ba370), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334-0-0-a-dce95649a9", "pod":"calico-apiserver-6cf5655c7b-vmq6z", "timestamp":"2025-05-15 12:36:30.122821197 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.130 [INFO][5315] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.130 [INFO][5315] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.130 [INFO][5315] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9' May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.132 [INFO][5315] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.135 [INFO][5315] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.139 [INFO][5315] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.141 [INFO][5315] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.144 [INFO][5315] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.144 [INFO][5315] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.146 [INFO][5315] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.151 [INFO][5315] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.157 [INFO][5315] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.88.200/26] block=192.168.88.192/26 handle="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.157 [INFO][5315] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.200/26] handle="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" host="ci-4334-0-0-a-dce95649a9" May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.157 [INFO][5315] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:30.183855 containerd[1571]: 2025-05-15 12:36:30.157 [INFO][5315] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.200/26] IPv6=[] ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" HandleID="k8s-pod-network.78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" May 15 12:36:30.184459 containerd[1571]: 2025-05-15 12:36:30.160 [INFO][5302] cni-plugin/k8s.go 386: Populated endpoint ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0", GenerateName:"calico-apiserver-6cf5655c7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 36, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cf5655c7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"calico-apiserver-6cf5655c7b-vmq6z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali110ad41eaef", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:30.184459 containerd[1571]: 2025-05-15 12:36:30.160 [INFO][5302] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.200/32] ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" May 15 12:36:30.184459 containerd[1571]: 2025-05-15 12:36:30.160 [INFO][5302] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali110ad41eaef ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" May 15 12:36:30.184459 containerd[1571]: 2025-05-15 12:36:30.163 [INFO][5302] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" 
WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" May 15 12:36:30.184459 containerd[1571]: 2025-05-15 12:36:30.164 [INFO][5302] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0", GenerateName:"calico-apiserver-6cf5655c7b-", Namespace:"calico-apiserver", SelfLink:"", UID:"a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 36, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6cf5655c7b", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc", Pod:"calico-apiserver-6cf5655c7b-vmq6z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali110ad41eaef", MAC:"1e:ff:69:e5:aa:8e", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:30.184459 containerd[1571]: 2025-05-15 12:36:30.176 [INFO][5302] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" Namespace="calico-apiserver" Pod="calico-apiserver-6cf5655c7b-vmq6z" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--6cf5655c7b--vmq6z-eth0" May 15 12:36:30.211123 containerd[1571]: time="2025-05-15T12:36:30.211085478Z" level=info msg="connecting to shim 78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc" address="unix:///run/containerd/s/34cf5ed4ac4d6a826bdd0c30f16c870c78b2cbe8daa14aa9c34b8b63ecad9885" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:30.231540 systemd[1]: Started cri-containerd-78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc.scope - libcontainer container 78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc. 
May 15 12:36:30.280088 containerd[1571]: time="2025-05-15T12:36:30.280049433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6cf5655c7b-vmq6z,Uid:a2ec9d1e-6ef3-474f-ab37-4af478fa1e3d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc\"" May 15 12:36:30.285043 containerd[1571]: time="2025-05-15T12:36:30.285016368Z" level=info msg="CreateContainer within sandbox \"78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 12:36:30.291483 containerd[1571]: time="2025-05-15T12:36:30.291428947Z" level=info msg="Container 517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:30.298020 containerd[1571]: time="2025-05-15T12:36:30.297937186Z" level=info msg="CreateContainer within sandbox \"78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3\"" May 15 12:36:30.299523 containerd[1571]: time="2025-05-15T12:36:30.299498347Z" level=info msg="StartContainer for \"517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3\"" May 15 12:36:30.300200 containerd[1571]: time="2025-05-15T12:36:30.300176670Z" level=info msg="connecting to shim 517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3" address="unix:///run/containerd/s/34cf5ed4ac4d6a826bdd0c30f16c870c78b2cbe8daa14aa9c34b8b63ecad9885" protocol=ttrpc version=3 May 15 12:36:30.322460 systemd[1]: Started cri-containerd-517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3.scope - libcontainer container 517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3. 
May 15 12:36:30.373997 containerd[1571]: time="2025-05-15T12:36:30.373920819Z" level=info msg="StartContainer for \"517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3\" returns successfully" May 15 12:36:30.509892 kubelet[3207]: I0515 12:36:30.508310 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9bffb6db8-psvtq" podStartSLOduration=42.106163663 podStartE2EDuration="54.508291802s" podCreationTimestamp="2025-05-15 12:35:36 +0000 UTC" firstStartedPulling="2025-05-15 12:36:17.426619263 +0000 UTC m=+64.386418087" lastFinishedPulling="2025-05-15 12:36:29.828747382 +0000 UTC m=+76.788546226" observedRunningTime="2025-05-15 12:36:30.505882468 +0000 UTC m=+77.465681303" watchObservedRunningTime="2025-05-15 12:36:30.508291802 +0000 UTC m=+77.468090636" May 15 12:36:30.527460 kubelet[3207]: I0515 12:36:30.527188 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6cf5655c7b-vmq6z" podStartSLOduration=1.527175645 podStartE2EDuration="1.527175645s" podCreationTimestamp="2025-05-15 12:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:36:30.526873909 +0000 UTC m=+77.486672733" watchObservedRunningTime="2025-05-15 12:36:30.527175645 +0000 UTC m=+77.486974469" May 15 12:36:30.553120 containerd[1571]: time="2025-05-15T12:36:30.553052457Z" level=info msg="StopContainer for \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" with timeout 30 (s)" May 15 12:36:30.562078 containerd[1571]: time="2025-05-15T12:36:30.562036954Z" level=info msg="Stop container \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" with signal terminated" May 15 12:36:30.587241 systemd[1]: cri-containerd-c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef.scope: Deactivated successfully. 
May 15 12:36:30.602345 containerd[1571]: time="2025-05-15T12:36:30.601781062Z" level=info msg="received exit event container_id:\"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" id:\"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" pid:5281 exit_status:1 exited_at:{seconds:1747312590 nanos:590432145}" May 15 12:36:30.602919 containerd[1571]: time="2025-05-15T12:36:30.602471488Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" id:\"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" pid:5281 exit_status:1 exited_at:{seconds:1747312590 nanos:590432145}" May 15 12:36:30.663350 containerd[1571]: time="2025-05-15T12:36:30.663101727Z" level=info msg="StopContainer for \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" returns successfully" May 15 12:36:30.676174 containerd[1571]: time="2025-05-15T12:36:30.676065475Z" level=info msg="StopPodSandbox for \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\"" May 15 12:36:30.680916 containerd[1571]: time="2025-05-15T12:36:30.680883721Z" level=info msg="Container to stop \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:36:30.689581 systemd[1]: cri-containerd-9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664.scope: Deactivated successfully. 
May 15 12:36:30.695805 containerd[1571]: time="2025-05-15T12:36:30.695762656Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" id:\"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" pid:4849 exit_status:137 exited_at:{seconds:1747312590 nanos:694371754}" May 15 12:36:30.733350 containerd[1571]: time="2025-05-15T12:36:30.733289289Z" level=info msg="shim disconnected" id=9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 namespace=k8s.io May 15 12:36:30.733615 containerd[1571]: time="2025-05-15T12:36:30.733559928Z" level=warning msg="cleaning up after shim disconnected" id=9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 namespace=k8s.io May 15 12:36:30.736462 containerd[1571]: time="2025-05-15T12:36:30.733574095Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 15 12:36:30.848982 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef-rootfs.mount: Deactivated successfully. May 15 12:36:30.849275 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664-rootfs.mount: Deactivated successfully. May 15 12:36:30.859020 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664-shm.mount: Deactivated successfully. 
May 15 12:36:30.878729 containerd[1571]: time="2025-05-15T12:36:30.878682906Z" level=info msg="received exit event sandbox_id:\"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" exit_status:137 exited_at:{seconds:1747312590 nanos:694371754}" May 15 12:36:30.925621 systemd-networkd[1471]: cali8ec66ff2179: Link DOWN May 15 12:36:30.925630 systemd-networkd[1471]: cali8ec66ff2179: Lost carrier May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.924 [INFO][5488] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.924 [INFO][5488] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" iface="eth0" netns="/var/run/netns/cni-dd2c9088-272e-a127-16e1-c28ed4ce663b" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.924 [INFO][5488] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" iface="eth0" netns="/var/run/netns/cni-dd2c9088-272e-a127-16e1-c28ed4ce663b" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.932 [INFO][5488] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" after=7.851632ms iface="eth0" netns="/var/run/netns/cni-dd2c9088-272e-a127-16e1-c28ed4ce663b" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.932 [INFO][5488] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.932 [INFO][5488] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.982 [INFO][5496] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.982 [INFO][5496] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:30.982 [INFO][5496] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:31.017 [INFO][5496] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:31.017 [INFO][5496] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:31.019 [INFO][5496] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:31.023082 containerd[1571]: 2025-05-15 12:36:31.021 [INFO][5488] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:36:31.026536 systemd[1]: run-netns-cni\x2ddd2c9088\x2d272e\x2da127\x2d16e1\x2dc28ed4ce663b.mount: Deactivated successfully. 
May 15 12:36:31.036098 containerd[1571]: time="2025-05-15T12:36:31.036062885Z" level=info msg="TearDown network for sandbox \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" successfully" May 15 12:36:31.036098 containerd[1571]: time="2025-05-15T12:36:31.036096949Z" level=info msg="StopPodSandbox for \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" returns successfully" May 15 12:36:31.189366 kubelet[3207]: I0515 12:36:31.189221 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppqg2\" (UniqueName: \"kubernetes.io/projected/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-kube-api-access-ppqg2\") pod \"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab\" (UID: \"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab\") " May 15 12:36:31.189366 kubelet[3207]: I0515 12:36:31.189271 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-calico-apiserver-certs\") pod \"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab\" (UID: \"f7cfcf4a-92ec-4f4c-9f12-dafd70717cab\") " May 15 12:36:31.199138 systemd[1]: var-lib-kubelet-pods-f7cfcf4a\x2d92ec\x2d4f4c\x2d9f12\x2ddafd70717cab-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. May 15 12:36:31.203759 systemd[1]: var-lib-kubelet-pods-f7cfcf4a\x2d92ec\x2d4f4c\x2d9f12\x2ddafd70717cab-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dppqg2.mount: Deactivated successfully. May 15 12:36:31.203999 kubelet[3207]: I0515 12:36:31.201402 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-kube-api-access-ppqg2" (OuterVolumeSpecName: "kube-api-access-ppqg2") pod "f7cfcf4a-92ec-4f4c-9f12-dafd70717cab" (UID: "f7cfcf4a-92ec-4f4c-9f12-dafd70717cab"). InnerVolumeSpecName "kube-api-access-ppqg2". PluginName "kubernetes.io/projected", VolumeGidValue ""
May 15 12:36:31.204060 kubelet[3207]: I0515 12:36:31.202694 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "f7cfcf4a-92ec-4f4c-9f12-dafd70717cab" (UID: "f7cfcf4a-92ec-4f4c-9f12-dafd70717cab"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 15 12:36:31.291535 kubelet[3207]: I0515 12:36:31.291435 3207 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-ppqg2\" (UniqueName: \"kubernetes.io/projected/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-kube-api-access-ppqg2\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:31.291535 kubelet[3207]: I0515 12:36:31.291506 3207 reconciler_common.go:289] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab-calico-apiserver-certs\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:31.502971 kubelet[3207]: I0515 12:36:31.502689 3207 scope.go:117] "RemoveContainer" containerID="c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef" May 15 12:36:31.505341 systemd[1]: Removed slice kubepods-besteffort-podf7cfcf4a_92ec_4f4c_9f12_dafd70717cab.slice - libcontainer container kubepods-besteffort-podf7cfcf4a_92ec_4f4c_9f12_dafd70717cab.slice.
May 15 12:36:31.505866 containerd[1571]: time="2025-05-15T12:36:31.505844053Z" level=info msg="RemoveContainer for \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\"" May 15 12:36:31.522236 containerd[1571]: time="2025-05-15T12:36:31.522169683Z" level=info msg="RemoveContainer for \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" returns successfully" May 15 12:36:31.522620 kubelet[3207]: I0515 12:36:31.522600 3207 scope.go:117] "RemoveContainer" containerID="c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef" May 15 12:36:31.523024 containerd[1571]: time="2025-05-15T12:36:31.522969455Z" level=error msg="ContainerStatus for \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\": not found" May 15 12:36:31.531717 kubelet[3207]: E0515 12:36:31.531656 3207 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\": not found" containerID="c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef" May 15 12:36:31.540035 kubelet[3207]: I0515 12:36:31.531715 3207 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef"} err="failed to get container status \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\": rpc error: code = NotFound desc = an error occurred when try to find container \"c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef\": not found" May 15 12:36:31.592360 kubelet[3207]: I0515 12:36:31.592310 3207 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 12:36:31.593444 containerd[1571]: time="2025-05-15T12:36:31.593342744Z" level=info msg="StopContainer for \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" with timeout 30 (s)"
May 15 12:36:31.594341 containerd[1571]: time="2025-05-15T12:36:31.594301714Z" level=info msg="Stop container \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" with signal terminated" May 15 12:36:31.615134 systemd[1]: cri-containerd-b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6.scope: Deactivated successfully. May 15 12:36:31.618865 containerd[1571]: time="2025-05-15T12:36:31.618819304Z" level=info msg="received exit event container_id:\"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" id:\"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" pid:5161 exit_status:1 exited_at:{seconds:1747312591 nanos:618270634}" May 15 12:36:31.619687 containerd[1571]: time="2025-05-15T12:36:31.619635587Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" id:\"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" pid:5161 exit_status:1 exited_at:{seconds:1747312591 nanos:618270634}" May 15 12:36:31.645015 containerd[1571]: time="2025-05-15T12:36:31.644975200Z" level=info msg="StopContainer for \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" returns successfully" May 15 12:36:31.645546 containerd[1571]: time="2025-05-15T12:36:31.645504313Z" level=info msg="StopPodSandbox for \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\"" May 15 12:36:31.645607 containerd[1571]: time="2025-05-15T12:36:31.645560028Z" level=info msg="Container to stop \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:36:31.652923 systemd[1]: cri-containerd-add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8.scope: Deactivated successfully.
May 15 12:36:31.654319 containerd[1571]: time="2025-05-15T12:36:31.654286943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" id:\"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" pid:4582 exit_status:137 exited_at:{seconds:1747312591 nanos:653921536}" May 15 12:36:31.677048 containerd[1571]: time="2025-05-15T12:36:31.676518931Z" level=info msg="shim disconnected" id=add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 namespace=k8s.io May 15 12:36:31.677048 containerd[1571]: time="2025-05-15T12:36:31.677038486Z" level=warning msg="cleaning up after shim disconnected" id=add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 namespace=k8s.io May 15 12:36:31.677726 containerd[1571]: time="2025-05-15T12:36:31.677050258Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 15 12:36:31.691019 containerd[1571]: time="2025-05-15T12:36:31.690757574Z" level=info msg="received exit event sandbox_id:\"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" exit_status:137 exited_at:{seconds:1747312591 nanos:653921536}" May 15 12:36:31.742248 systemd-networkd[1471]: cali07bd55d4895: Link DOWN May 15 12:36:31.742951 systemd-networkd[1471]: cali07bd55d4895: Lost carrier May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.740 [INFO][5587] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.740 [INFO][5587] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" iface="eth0" netns="/var/run/netns/cni-bb2ec328-8161-25f8-a9e3-693206e2d423" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.741 [INFO][5587] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" iface="eth0" netns="/var/run/netns/cni-bb2ec328-8161-25f8-a9e3-693206e2d423"
May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.748 [INFO][5587] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" after=7.536249ms iface="eth0" netns="/var/run/netns/cni-bb2ec328-8161-25f8-a9e3-693206e2d423" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.748 [INFO][5587] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.748 [INFO][5587] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.775 [INFO][5596] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.775 [INFO][5596] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.775 [INFO][5596] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.806 [INFO][5596] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.806 [INFO][5596] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.807 [INFO][5596] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:36:31.810883 containerd[1571]: 2025-05-15 12:36:31.809 [INFO][5587] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:36:31.811649 containerd[1571]: time="2025-05-15T12:36:31.811607860Z" level=info msg="TearDown network for sandbox \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" successfully" May 15 12:36:31.811649 containerd[1571]: time="2025-05-15T12:36:31.811641493Z" level=info msg="StopPodSandbox for \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" returns successfully" May 15 12:36:31.840553 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6-rootfs.mount: Deactivated successfully. May 15 12:36:31.840795 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8-rootfs.mount: Deactivated successfully. 
May 15 12:36:31.840954 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8-shm.mount: Deactivated successfully. May 15 12:36:31.841081 systemd[1]: run-netns-cni\x2dbb2ec328\x2d8161\x2d25f8\x2da9e3\x2d693206e2d423.mount: Deactivated successfully. May 15 12:36:31.895400 kubelet[3207]: I0515 12:36:31.895319 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f536eb8d-06b8-43f3-9327-01d9d6a4d759-calico-apiserver-certs\") pod \"f536eb8d-06b8-43f3-9327-01d9d6a4d759\" (UID: \"f536eb8d-06b8-43f3-9327-01d9d6a4d759\") " May 15 12:36:31.895570 kubelet[3207]: I0515 12:36:31.895418 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5htrp\" (UniqueName: \"kubernetes.io/projected/f536eb8d-06b8-43f3-9327-01d9d6a4d759-kube-api-access-5htrp\") pod \"f536eb8d-06b8-43f3-9327-01d9d6a4d759\" (UID: \"f536eb8d-06b8-43f3-9327-01d9d6a4d759\") " May 15 12:36:31.905068 kubelet[3207]: I0515 12:36:31.905021 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f536eb8d-06b8-43f3-9327-01d9d6a4d759-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "f536eb8d-06b8-43f3-9327-01d9d6a4d759" (UID: "f536eb8d-06b8-43f3-9327-01d9d6a4d759"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 15 12:36:31.905163 systemd[1]: var-lib-kubelet-pods-f536eb8d\x2d06b8\x2d43f3\x2d9327\x2d01d9d6a4d759-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
May 15 12:36:31.909145 kubelet[3207]: I0515 12:36:31.909096 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f536eb8d-06b8-43f3-9327-01d9d6a4d759-kube-api-access-5htrp" (OuterVolumeSpecName: "kube-api-access-5htrp") pod "f536eb8d-06b8-43f3-9327-01d9d6a4d759" (UID: "f536eb8d-06b8-43f3-9327-01d9d6a4d759"). InnerVolumeSpecName "kube-api-access-5htrp". PluginName "kubernetes.io/projected", VolumeGidValue "" May 15 12:36:31.909610 systemd[1]: var-lib-kubelet-pods-f536eb8d\x2d06b8\x2d43f3\x2d9327\x2d01d9d6a4d759-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d5htrp.mount: Deactivated successfully. May 15 12:36:31.995773 kubelet[3207]: I0515 12:36:31.995709 3207 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-5htrp\" (UniqueName: \"kubernetes.io/projected/f536eb8d-06b8-43f3-9327-01d9d6a4d759-kube-api-access-5htrp\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:31.995773 kubelet[3207]: I0515 12:36:31.995757 3207 reconciler_common.go:289] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f536eb8d-06b8-43f3-9327-01d9d6a4d759-calico-apiserver-certs\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:32.164675 systemd-networkd[1471]: cali110ad41eaef: Gained IPv6LL May 15 12:36:32.503677 kubelet[3207]: I0515 12:36:32.503547 3207 scope.go:117] "RemoveContainer" containerID="b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6" May 15 12:36:32.513395 containerd[1571]: time="2025-05-15T12:36:32.512783226Z" level=info msg="RemoveContainer for \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\"" May 15 12:36:32.517723 systemd[1]: Removed slice kubepods-besteffort-podf536eb8d_06b8_43f3_9327_01d9d6a4d759.slice - libcontainer container kubepods-besteffort-podf536eb8d_06b8_43f3_9327_01d9d6a4d759.slice. 
May 15 12:36:32.520562 containerd[1571]: time="2025-05-15T12:36:32.519987903Z" level=info msg="RemoveContainer for \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" returns successfully" May 15 12:36:32.521858 kubelet[3207]: I0515 12:36:32.521798 3207 scope.go:117] "RemoveContainer" containerID="b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6" May 15 12:36:32.522661 containerd[1571]: time="2025-05-15T12:36:32.522566373Z" level=error msg="ContainerStatus for \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\": not found" May 15 12:36:32.523051 kubelet[3207]: E0515 12:36:32.522759 3207 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\": not found" containerID="b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6" May 15 12:36:32.523051 kubelet[3207]: I0515 12:36:32.522791 3207 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6"} err="failed to get container status \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\": rpc error: code = NotFound desc = an error occurred when try to find container \"b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6\": not found" May 15 12:36:33.127626 kubelet[3207]: I0515 12:36:33.127192 3207 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f536eb8d-06b8-43f3-9327-01d9d6a4d759" path="/var/lib/kubelet/pods/f536eb8d-06b8-43f3-9327-01d9d6a4d759/volumes" May 15 12:36:33.128297 kubelet[3207]: I0515 12:36:33.128270 3207 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7cfcf4a-92ec-4f4c-9f12-dafd70717cab" path="/var/lib/kubelet/pods/f7cfcf4a-92ec-4f4c-9f12-dafd70717cab/volumes"
May 15 12:36:36.460554 containerd[1571]: time="2025-05-15T12:36:36.460501017Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:36.461601 containerd[1571]: time="2025-05-15T12:36:36.461513899Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 15 12:36:36.462396 containerd[1571]: time="2025-05-15T12:36:36.462372001Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:36.464477 containerd[1571]: time="2025-05-15T12:36:36.463979348Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:36.464477 containerd[1571]: time="2025-05-15T12:36:36.464380311Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 6.635361208s" May 15 12:36:36.464477 containerd[1571]: time="2025-05-15T12:36:36.464403944Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 15 12:36:36.465792 containerd[1571]: time="2025-05-15T12:36:36.465765952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\""
May 15 12:36:36.480357 containerd[1571]: time="2025-05-15T12:36:36.480032987Z" level=info msg="CreateContainer within sandbox \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 15 12:36:36.498490 containerd[1571]: time="2025-05-15T12:36:36.498446566Z" level=info msg="Container dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:36.503062 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3090095474.mount: Deactivated successfully. May 15 12:36:36.514995 containerd[1571]: time="2025-05-15T12:36:36.514952334Z" level=info msg="CreateContainer within sandbox \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\"" May 15 12:36:36.516226 containerd[1571]: time="2025-05-15T12:36:36.515769277Z" level=info msg="StartContainer for \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\"" May 15 12:36:36.520120 containerd[1571]: time="2025-05-15T12:36:36.520052938Z" level=info msg="connecting to shim dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818" address="unix:///run/containerd/s/9993a97c8181b5a4fb6794a0eeb35dc2d1393b8c0285d79783dd02f595c685a1" protocol=ttrpc version=3 May 15 12:36:36.555505 systemd[1]: Started cri-containerd-dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818.scope - libcontainer container dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818. 
May 15 12:36:36.608771 containerd[1571]: time="2025-05-15T12:36:36.608711515Z" level=info msg="StartContainer for \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" returns successfully" May 15 12:36:37.348934 containerd[1571]: time="2025-05-15T12:36:37.348767172Z" level=info msg="StopContainer for \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" with timeout 300 (s)" May 15 12:36:37.351012 containerd[1571]: time="2025-05-15T12:36:37.350040873Z" level=info msg="Stop container \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" with signal terminated" May 15 12:36:37.580139 kubelet[3207]: I0515 12:36:37.579187 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8b9db6c54-8mmnz" podStartSLOduration=43.680697712 podStartE2EDuration="1m1.579170623s" podCreationTimestamp="2025-05-15 12:35:36 +0000 UTC" firstStartedPulling="2025-05-15 12:36:18.566736415 +0000 UTC m=+65.526535240" lastFinishedPulling="2025-05-15 12:36:36.465209327 +0000 UTC m=+83.425008151" observedRunningTime="2025-05-15 12:36:37.578818471 +0000 UTC m=+84.538617296" watchObservedRunningTime="2025-05-15 12:36:37.579170623 +0000 UTC m=+84.538969457" May 15 12:36:37.636795 containerd[1571]: time="2025-05-15T12:36:37.636657816Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" id:\"ffaf227020e8ffc0eb3bad4cf8fb9a846b0b51e8fe73fff3d069b7f574676c83\" pid:5717 exited_at:{seconds:1747312597 nanos:635071750}" May 15 12:36:37.739187 containerd[1571]: time="2025-05-15T12:36:37.739149141Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"bfc61d389e43806ef5c4dc95c775b05abc8f99c7aea9ccef3e82797532654bab\" pid:5693 exited_at:{seconds:1747312597 nanos:738830172}" May 15 12:36:37.770456 containerd[1571]: time="2025-05-15T12:36:37.770314329Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"9a730e653df6ceb5d12c6f9d8318b3d32269fd428f13a0b2f9f48aabf998057e\" pid:5707 exited_at:{seconds:1747312597 nanos:769822516}"
May 15 12:36:37.774351 containerd[1571]: time="2025-05-15T12:36:37.774283500Z" level=info msg="StopContainer for \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" with timeout 5 (s)" May 15 12:36:37.775233 containerd[1571]: time="2025-05-15T12:36:37.774555802Z" level=info msg="Stop container \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" with signal terminated" May 15 12:36:37.792624 systemd[1]: cri-containerd-5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5.scope: Deactivated successfully. May 15 12:36:37.792854 systemd[1]: cri-containerd-5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5.scope: Consumed 1.548s CPU time, 218.1M memory peak, 73.8M read from disk, 692K written to disk. May 15 12:36:37.795776 containerd[1571]: time="2025-05-15T12:36:37.794627303Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" pid:4175 exited_at:{seconds:1747312597 nanos:794315908}" May 15 12:36:37.795776 containerd[1571]: time="2025-05-15T12:36:37.794688929Z" level=info msg="received exit event container_id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" id:\"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" pid:4175 exited_at:{seconds:1747312597 nanos:794315908}" May 15 12:36:37.813879 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5-rootfs.mount: Deactivated successfully.
May 15 12:36:37.852345 containerd[1571]: time="2025-05-15T12:36:37.852305065Z" level=info msg="StopContainer for \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" returns successfully" May 15 12:36:37.852919 containerd[1571]: time="2025-05-15T12:36:37.852840781Z" level=info msg="StopPodSandbox for \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\"" May 15 12:36:37.852919 containerd[1571]: time="2025-05-15T12:36:37.852889623Z" level=info msg="Container to stop \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:36:37.852919 containerd[1571]: time="2025-05-15T12:36:37.852898790Z" level=info msg="Container to stop \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:36:37.852919 containerd[1571]: time="2025-05-15T12:36:37.852905503Z" level=info msg="Container to stop \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 15 12:36:37.858606 systemd[1]: cri-containerd-707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7.scope: Deactivated successfully. 
May 15 12:36:37.865028 containerd[1571]: time="2025-05-15T12:36:37.864945506Z" level=info msg="TaskExit event in podsandbox handler container_id:\"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" id:\"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" pid:3695 exit_status:137 exited_at:{seconds:1747312597 nanos:864555544}" May 15 12:36:37.889455 containerd[1571]: time="2025-05-15T12:36:37.888870242Z" level=info msg="shim disconnected" id=707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7 namespace=k8s.io May 15 12:36:37.889455 containerd[1571]: time="2025-05-15T12:36:37.888946284Z" level=warning msg="cleaning up after shim disconnected" id=707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7 namespace=k8s.io May 15 12:36:37.889455 containerd[1571]: time="2025-05-15T12:36:37.888962054Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 15 12:36:37.891219 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7-rootfs.mount: Deactivated successfully. May 15 12:36:37.904646 containerd[1571]: time="2025-05-15T12:36:37.904614018Z" level=info msg="received exit event sandbox_id:\"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" exit_status:137 exited_at:{seconds:1747312597 nanos:864555544}" May 15 12:36:37.906404 containerd[1571]: time="2025-05-15T12:36:37.906366729Z" level=info msg="TearDown network for sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" successfully" May 15 12:36:37.906650 containerd[1571]: time="2025-05-15T12:36:37.906552979Z" level=info msg="StopPodSandbox for \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" returns successfully" May 15 12:36:37.908076 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7-shm.mount: Deactivated successfully. 
May 15 12:36:37.944255 kubelet[3207]: I0515 12:36:37.944182 3207 topology_manager.go:215] "Topology Admit Handler" podUID="e5170c42-2e72-43e8-8b06-2ca828709936" podNamespace="calico-system" podName="calico-node-gnw2x" May 15 12:36:37.950750 kubelet[3207]: E0515 12:36:37.950345 3207 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" containerName="flexvol-driver" May 15 12:36:37.950750 kubelet[3207]: E0515 12:36:37.950385 3207 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" containerName="install-cni" May 15 12:36:37.950750 kubelet[3207]: E0515 12:36:37.950395 3207 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="f7cfcf4a-92ec-4f4c-9f12-dafd70717cab" containerName="calico-apiserver" May 15 12:36:37.950750 kubelet[3207]: E0515 12:36:37.950405 3207 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" containerName="calico-node" May 15 12:36:37.950750 kubelet[3207]: E0515 12:36:37.950410 3207 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="f536eb8d-06b8-43f3-9327-01d9d6a4d759" containerName="calico-apiserver" May 15 12:36:37.961015 kubelet[3207]: I0515 12:36:37.954852 3207 memory_manager.go:354] "RemoveStaleState removing state" podUID="f536eb8d-06b8-43f3-9327-01d9d6a4d759" containerName="calico-apiserver" May 15 12:36:37.961015 kubelet[3207]: I0515 12:36:37.960995 3207 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cfcf4a-92ec-4f4c-9f12-dafd70717cab" containerName="calico-apiserver" May 15 12:36:37.961015 kubelet[3207]: I0515 12:36:37.961007 3207 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" containerName="calico-node" May 15 12:36:37.978748 systemd[1]: Created slice kubepods-besteffort-pode5170c42_2e72_43e8_8b06_2ca828709936.slice - libcontainer container kubepods-besteffort-pode5170c42_2e72_43e8_8b06_2ca828709936.slice.
May 15 12:36:38.032176 kubelet[3207]: I0515 12:36:38.032083 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8whb\" (UniqueName: \"kubernetes.io/projected/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-kube-api-access-s8whb\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.032176 kubelet[3207]: I0515 12:36:38.032170 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-policysync\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.032468 kubelet[3207]: I0515 12:36:38.032454 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-node-certs\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.032619 kubelet[3207]: I0515 12:36:38.032602 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-lib-modules\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.032654 kubelet[3207]: I0515 12:36:38.032622 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-xtables-lock\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.032654 kubelet[3207]: I0515 12:36:38.032636 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-bin-dir\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") "
May 15 12:36:38.032654 kubelet[3207]: I0515 12:36:38.032647 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-flexvol-driver-host\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.033269 kubelet[3207]: I0515 12:36:38.032658 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-lib-calico\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.033269 kubelet[3207]: I0515 12:36:38.032674 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-tigera-ca-bundle\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.033269 kubelet[3207]: I0515 12:36:38.032684 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-run-calico\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.033269 kubelet[3207]: I0515 12:36:38.032696 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-log-dir\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") " May 15 12:36:38.033269 kubelet[3207]: I0515 12:36:38.032706 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-net-dir\") pod \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\" (UID: \"fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1\") "
May 15 12:36:38.033269 kubelet[3207]: I0515 12:36:38.032734 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-policysync\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.033460 kubelet[3207]: I0515 12:36:38.032754 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-flexvol-driver-host\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.033460 kubelet[3207]: I0515 12:36:38.032768 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w42kv\" (UniqueName: \"kubernetes.io/projected/e5170c42-2e72-43e8-8b06-2ca828709936-kube-api-access-w42kv\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.033460 kubelet[3207]: I0515 12:36:38.032892 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-policysync" (OuterVolumeSpecName: "policysync") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue ""
May 15 12:36:38.033460 kubelet[3207]: I0515 12:36:38.032994 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e5170c42-2e72-43e8-8b06-2ca828709936-node-certs\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.033460 kubelet[3207]: I0515 12:36:38.033011 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-cni-bin-dir\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034296 kubelet[3207]: I0515 12:36:38.033062 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-cni-net-dir\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034296 kubelet[3207]: I0515 12:36:38.033083 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-cni-log-dir\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034296 kubelet[3207]: I0515 12:36:38.033096 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-xtables-lock\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034296 kubelet[3207]:
I0515 12:36:38.033110 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-lib-modules\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034296 kubelet[3207]: I0515 12:36:38.033123 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-var-lib-calico\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034455 kubelet[3207]: I0515 12:36:38.033136 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5170c42-2e72-43e8-8b06-2ca828709936-tigera-ca-bundle\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034455 kubelet[3207]: I0515 12:36:38.033157 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e5170c42-2e72-43e8-8b06-2ca828709936-var-run-calico\") pod \"calico-node-gnw2x\" (UID: \"e5170c42-2e72-43e8-8b06-2ca828709936\") " pod="calico-system/calico-node-gnw2x" May 15 12:36:38.034455 kubelet[3207]: I0515 12:36:38.033172 3207 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-policysync\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.034455 kubelet[3207]: I0515 12:36:38.033079 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-lib-modules" 
(OuterVolumeSpecName: "lib-modules") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.034455 kubelet[3207]: I0515 12:36:38.033256 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.035090 kubelet[3207]: I0515 12:36:38.033274 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.035090 kubelet[3207]: I0515 12:36:38.033285 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.035090 kubelet[3207]: I0515 12:36:38.033732 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "xtables-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.035090 kubelet[3207]: I0515 12:36:38.033756 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.035090 kubelet[3207]: I0515 12:36:38.033779 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.035247 kubelet[3207]: I0515 12:36:38.033794 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 15 12:36:38.037711 systemd[1]: var-lib-kubelet-pods-fa5bd1ea\x2db5f7\x2d4bb5\x2db6d2\x2ddb45304772d1-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. May 15 12:36:38.040469 kubelet[3207]: I0515 12:36:38.040101 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-node-certs" (OuterVolumeSpecName: "node-certs") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "node-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" May 15 12:36:38.043105 systemd[1]: var-lib-kubelet-pods-fa5bd1ea\x2db5f7\x2d4bb5\x2db6d2\x2ddb45304772d1-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ds8whb.mount: Deactivated successfully. May 15 12:36:38.044658 kubelet[3207]: I0515 12:36:38.044612 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-kube-api-access-s8whb" (OuterVolumeSpecName: "kube-api-access-s8whb") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "kube-api-access-s8whb". PluginName "kubernetes.io/projected", VolumeGidValue "" May 15 12:36:38.048581 kubelet[3207]: I0515 12:36:38.048555 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" (UID: "fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1"). InnerVolumeSpecName "tigera-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133750 3207 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-s8whb\" (UniqueName: \"kubernetes.io/projected/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-kube-api-access-s8whb\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133779 3207 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-node-certs\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133788 3207 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-xtables-lock\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133798 3207 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-tigera-ca-bundle\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133806 3207 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-net-dir\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133813 3207 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-lib-modules\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133820 3207 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-bin-dir\") on node 
\"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.133971 kubelet[3207]: I0515 12:36:38.133826 3207 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-flexvol-driver-host\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.135622 kubelet[3207]: I0515 12:36:38.133832 3207 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-lib-calico\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.135622 kubelet[3207]: I0515 12:36:38.133841 3207 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-cni-log-dir\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.135622 kubelet[3207]: I0515 12:36:38.133847 3207 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1-var-run-calico\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\"" May 15 12:36:38.283379 containerd[1571]: time="2025-05-15T12:36:38.283235056Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gnw2x,Uid:e5170c42-2e72-43e8-8b06-2ca828709936,Namespace:calico-system,Attempt:0,}" May 15 12:36:38.303852 containerd[1571]: time="2025-05-15T12:36:38.303724031Z" level=info msg="connecting to shim 27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0" address="unix:///run/containerd/s/7ad2e154d44428339a378ea83ffda90016a5ee4efe007547df15b0cd5da18b4a" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:38.325484 systemd[1]: Started cri-containerd-27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0.scope - libcontainer container 27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0. 
May 15 12:36:38.353085 containerd[1571]: time="2025-05-15T12:36:38.353032941Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gnw2x,Uid:e5170c42-2e72-43e8-8b06-2ca828709936,Namespace:calico-system,Attempt:0,} returns sandbox id \"27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0\"" May 15 12:36:38.356486 containerd[1571]: time="2025-05-15T12:36:38.356438384Z" level=info msg="CreateContainer within sandbox \"27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 15 12:36:38.364689 containerd[1571]: time="2025-05-15T12:36:38.364551986Z" level=info msg="Container efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:38.370945 containerd[1571]: time="2025-05-15T12:36:38.370825474Z" level=info msg="CreateContainer within sandbox \"27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47\"" May 15 12:36:38.371552 containerd[1571]: time="2025-05-15T12:36:38.371517263Z" level=info msg="StartContainer for \"efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47\"" May 15 12:36:38.372785 containerd[1571]: time="2025-05-15T12:36:38.372758212Z" level=info msg="connecting to shim efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47" address="unix:///run/containerd/s/7ad2e154d44428339a378ea83ffda90016a5ee4efe007547df15b0cd5da18b4a" protocol=ttrpc version=3 May 15 12:36:38.391459 systemd[1]: Started cri-containerd-efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47.scope - libcontainer container efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47. 
May 15 12:36:38.429423 containerd[1571]: time="2025-05-15T12:36:38.429372106Z" level=info msg="StartContainer for \"efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47\" returns successfully" May 15 12:36:38.469691 systemd[1]: cri-containerd-efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47.scope: Deactivated successfully. May 15 12:36:38.470201 systemd[1]: cri-containerd-efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47.scope: Consumed 31ms CPU time, 17.9M memory peak, 9.8M read from disk, 6.3M written to disk. May 15 12:36:38.472162 containerd[1571]: time="2025-05-15T12:36:38.472114970Z" level=info msg="received exit event container_id:\"efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47\" id:\"efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47\" pid:5864 exited_at:{seconds:1747312598 nanos:471683871}" May 15 12:36:38.472308 containerd[1571]: time="2025-05-15T12:36:38.472184731Z" level=info msg="TaskExit event in podsandbox handler container_id:\"efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47\" id:\"efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47\" pid:5864 exited_at:{seconds:1747312598 nanos:471683871}" May 15 12:36:38.547073 kubelet[3207]: I0515 12:36:38.546973 3207 scope.go:117] "RemoveContainer" containerID="5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5" May 15 12:36:38.563067 systemd[1]: Removed slice kubepods-besteffort-podfa5bd1ea_b5f7_4bb5_b6d2_db45304772d1.slice - libcontainer container kubepods-besteffort-podfa5bd1ea_b5f7_4bb5_b6d2_db45304772d1.slice. May 15 12:36:38.563210 systemd[1]: kubepods-besteffort-podfa5bd1ea_b5f7_4bb5_b6d2_db45304772d1.slice: Consumed 1.937s CPU time, 233.5M memory peak, 73.8M read from disk, 161.1M written to disk. 
May 15 12:36:38.573985 containerd[1571]: time="2025-05-15T12:36:38.573314077Z" level=info msg="RemoveContainer for \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\"" May 15 12:36:38.577870 containerd[1571]: time="2025-05-15T12:36:38.577817161Z" level=info msg="CreateContainer within sandbox \"27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 15 12:36:38.586115 containerd[1571]: time="2025-05-15T12:36:38.585077690Z" level=info msg="RemoveContainer for \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" returns successfully" May 15 12:36:38.586549 kubelet[3207]: I0515 12:36:38.586456 3207 scope.go:117] "RemoveContainer" containerID="8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011" May 15 12:36:38.589730 containerd[1571]: time="2025-05-15T12:36:38.589632342Z" level=info msg="Container 751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:38.591343 containerd[1571]: time="2025-05-15T12:36:38.590987886Z" level=info msg="RemoveContainer for \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\"" May 15 12:36:38.599943 containerd[1571]: time="2025-05-15T12:36:38.599903514Z" level=info msg="RemoveContainer for \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" returns successfully" May 15 12:36:38.600406 kubelet[3207]: I0515 12:36:38.600108 3207 scope.go:117] "RemoveContainer" containerID="618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483" May 15 12:36:38.603978 containerd[1571]: time="2025-05-15T12:36:38.603868097Z" level=info msg="CreateContainer within sandbox \"27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8\"" May 15 12:36:38.604607 containerd[1571]: 
time="2025-05-15T12:36:38.604535179Z" level=info msg="StartContainer for \"751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8\"" May 15 12:36:38.605588 containerd[1571]: time="2025-05-15T12:36:38.605413208Z" level=info msg="RemoveContainer for \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\"" May 15 12:36:38.609350 containerd[1571]: time="2025-05-15T12:36:38.607625711Z" level=info msg="connecting to shim 751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8" address="unix:///run/containerd/s/7ad2e154d44428339a378ea83ffda90016a5ee4efe007547df15b0cd5da18b4a" protocol=ttrpc version=3 May 15 12:36:38.620030 containerd[1571]: time="2025-05-15T12:36:38.619956130Z" level=info msg="RemoveContainer for \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" returns successfully" May 15 12:36:38.620809 kubelet[3207]: I0515 12:36:38.620793 3207 scope.go:117] "RemoveContainer" containerID="5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5" May 15 12:36:38.621701 containerd[1571]: time="2025-05-15T12:36:38.621613632Z" level=error msg="ContainerStatus for \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\": not found" May 15 12:36:38.622131 kubelet[3207]: E0515 12:36:38.622101 3207 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\": not found" containerID="5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5" May 15 12:36:38.622501 kubelet[3207]: I0515 12:36:38.622136 3207 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5"} err="failed 
to get container status \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\": rpc error: code = NotFound desc = an error occurred when try to find container \"5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5\": not found" May 15 12:36:38.622501 kubelet[3207]: I0515 12:36:38.622159 3207 scope.go:117] "RemoveContainer" containerID="8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011" May 15 12:36:38.622565 containerd[1571]: time="2025-05-15T12:36:38.622389378Z" level=error msg="ContainerStatus for \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\": not found" May 15 12:36:38.622809 kubelet[3207]: E0515 12:36:38.622785 3207 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\": not found" containerID="8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011" May 15 12:36:38.622841 kubelet[3207]: I0515 12:36:38.622809 3207 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011"} err="failed to get container status \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\": rpc error: code = NotFound desc = an error occurred when try to find container \"8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011\": not found" May 15 12:36:38.622841 kubelet[3207]: I0515 12:36:38.622824 3207 scope.go:117] "RemoveContainer" containerID="618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483" May 15 12:36:38.623112 containerd[1571]: time="2025-05-15T12:36:38.622991839Z" level=error msg="ContainerStatus for 
\"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\": not found" May 15 12:36:38.623163 kubelet[3207]: E0515 12:36:38.623107 3207 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\": not found" containerID="618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483" May 15 12:36:38.623163 kubelet[3207]: I0515 12:36:38.623133 3207 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483"} err="failed to get container status \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\": rpc error: code = NotFound desc = an error occurred when try to find container \"618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483\": not found" May 15 12:36:38.638487 systemd[1]: Started cri-containerd-751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8.scope - libcontainer container 751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8. May 15 12:36:38.676081 containerd[1571]: time="2025-05-15T12:36:38.676024180Z" level=info msg="StartContainer for \"751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8\" returns successfully" May 15 12:36:38.819451 systemd[1]: var-lib-kubelet-pods-fa5bd1ea\x2db5f7\x2d4bb5\x2db6d2\x2ddb45304772d1-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. 
May 15 12:36:39.126209 kubelet[3207]: I0515 12:36:39.126070 3207 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1" path="/var/lib/kubelet/pods/fa5bd1ea-b5f7-4bb5-b6d2-db45304772d1/volumes" May 15 12:36:39.557672 containerd[1571]: time="2025-05-15T12:36:39.557528360Z" level=info msg="StopContainer for \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" with timeout 30 (s)" May 15 12:36:39.558323 containerd[1571]: time="2025-05-15T12:36:39.558264863Z" level=info msg="Stop container \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" with signal terminated" May 15 12:36:39.582495 systemd[1]: cri-containerd-23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369.scope: Deactivated successfully. May 15 12:36:39.582731 systemd[1]: cri-containerd-23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369.scope: Consumed 215ms CPU time, 37.3M memory peak, 26.1M read from disk. May 15 12:36:39.583676 containerd[1571]: time="2025-05-15T12:36:39.582813630Z" level=info msg="received exit event container_id:\"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" id:\"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" pid:3729 exit_status:1 exited_at:{seconds:1747312599 nanos:582317428}" May 15 12:36:39.583984 containerd[1571]: time="2025-05-15T12:36:39.583964590Z" level=info msg="TaskExit event in podsandbox handler container_id:\"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" id:\"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" pid:3729 exit_status:1 exited_at:{seconds:1747312599 nanos:582317428}" May 15 12:36:39.601025 systemd[1]: cri-containerd-dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818.scope: Deactivated successfully. 
May 15 12:36:39.606283 containerd[1571]: time="2025-05-15T12:36:39.606117499Z" level=info msg="received exit event container_id:\"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" id:\"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" pid:5636 exit_status:2 exited_at:{seconds:1747312599 nanos:605916382}" May 15 12:36:39.607985 containerd[1571]: time="2025-05-15T12:36:39.607940632Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" id:\"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" pid:5636 exit_status:2 exited_at:{seconds:1747312599 nanos:605916382}" May 15 12:36:39.633796 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369-rootfs.mount: Deactivated successfully. May 15 12:36:39.650213 containerd[1571]: time="2025-05-15T12:36:39.648884517Z" level=info msg="StopContainer for \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" returns successfully" May 15 12:36:39.653360 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818-rootfs.mount: Deactivated successfully. 
May 15 12:36:39.654286 containerd[1571]: time="2025-05-15T12:36:39.653764558Z" level=info msg="StopPodSandbox for \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\""
May 15 12:36:39.654286 containerd[1571]: time="2025-05-15T12:36:39.653829081Z" level=info msg="Container to stop \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
May 15 12:36:39.662734 containerd[1571]: time="2025-05-15T12:36:39.662688042Z" level=info msg="StopContainer for \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" returns successfully"
May 15 12:36:39.664226 containerd[1571]: time="2025-05-15T12:36:39.664171697Z" level=info msg="StopPodSandbox for \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\""
May 15 12:36:39.665892 containerd[1571]: time="2025-05-15T12:36:39.664388494Z" level=info msg="Container to stop \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
May 15 12:36:39.664805 systemd[1]: cri-containerd-fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d.scope: Deactivated successfully.
May 15 12:36:39.668061 containerd[1571]: time="2025-05-15T12:36:39.667922368Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" id:\"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" pid:3624 exit_status:137 exited_at:{seconds:1747312599 nanos:667554017}"
May 15 12:36:39.675879 systemd[1]: cri-containerd-565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006.scope: Deactivated successfully.
May 15 12:36:39.676442 systemd[1]: cri-containerd-565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006.scope: Consumed 28ms CPU time, 5.4M memory peak, 1.5M read from disk.
May 15 12:36:39.705922 containerd[1571]: time="2025-05-15T12:36:39.705843409Z" level=info msg="TaskExit event in podsandbox handler container_id:\"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" id:\"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" pid:5037 exit_status:137 exited_at:{seconds:1747312599 nanos:680626418}"
May 15 12:36:39.706607 containerd[1571]: time="2025-05-15T12:36:39.706413148Z" level=info msg="shim disconnected" id=fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d namespace=k8s.io
May 15 12:36:39.706946 containerd[1571]: time="2025-05-15T12:36:39.706868764Z" level=warning msg="cleaning up after shim disconnected" id=fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d namespace=k8s.io
May 15 12:36:39.707017 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d-rootfs.mount: Deactivated successfully.
May 15 12:36:39.707835 containerd[1571]: time="2025-05-15T12:36:39.707242075Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 15 12:36:39.707835 containerd[1571]: time="2025-05-15T12:36:39.707285928Z" level=info msg="TearDown network for sandbox \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" successfully"
May 15 12:36:39.707835 containerd[1571]: time="2025-05-15T12:36:39.707309792Z" level=info msg="StopPodSandbox for \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" returns successfully"
May 15 12:36:39.708283 containerd[1571]: time="2025-05-15T12:36:39.706456340Z" level=info msg="received exit event sandbox_id:\"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" exit_status:137 exited_at:{seconds:1747312599 nanos:667554017}"
May 15 12:36:39.715049 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d-shm.mount: Deactivated successfully.
May 15 12:36:39.726811 systemd[1]: cri-containerd-751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8.scope: Deactivated successfully.
May 15 12:36:39.727013 systemd[1]: cri-containerd-751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8.scope: Consumed 595ms CPU time, 267M memory peak, 273.6M read from disk.
May 15 12:36:39.729195 containerd[1571]: time="2025-05-15T12:36:39.729146648Z" level=info msg="shim disconnected" id=565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 namespace=k8s.io
May 15 12:36:39.729298 containerd[1571]: time="2025-05-15T12:36:39.729285217Z" level=warning msg="cleaning up after shim disconnected" id=565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 namespace=k8s.io
May 15 12:36:39.729819 containerd[1571]: time="2025-05-15T12:36:39.729377200Z" level=info msg="cleaning up dead shim" namespace=k8s.io
May 15 12:36:39.732423 containerd[1571]: time="2025-05-15T12:36:39.730446978Z" level=info msg="received exit event sandbox_id:\"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" exit_status:137 exited_at:{seconds:1747312599 nanos:680626418}"
May 15 12:36:39.738741 containerd[1571]: time="2025-05-15T12:36:39.737899509Z" level=info msg="received exit event container_id:\"751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8\" id:\"751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8\" pid:5911 exited_at:{seconds:1747312599 nanos:737473950}"
May 15 12:36:39.738741 containerd[1571]: time="2025-05-15T12:36:39.738134762Z" level=info msg="TaskExit event in podsandbox handler container_id:\"751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8\" id:\"751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8\" pid:5911 exited_at:{seconds:1747312599 nanos:737473950}"
May 15 12:36:39.813601 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8-rootfs.mount: Deactivated successfully.
May 15 12:36:39.813806 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006-rootfs.mount: Deactivated successfully.
May 15 12:36:39.813982 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006-shm.mount: Deactivated successfully.
May 15 12:36:39.820311 systemd-networkd[1471]: cali2e35093a7b1: Link DOWN
May 15 12:36:39.820320 systemd-networkd[1471]: cali2e35093a7b1: Lost carrier
May 15 12:36:39.845762 kubelet[3207]: I0515 12:36:39.845624 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6c8535cc-4665-4d1e-a5d8-657a84679149-typha-certs\") pod \"6c8535cc-4665-4d1e-a5d8-657a84679149\" (UID: \"6c8535cc-4665-4d1e-a5d8-657a84679149\") "
May 15 12:36:39.845762 kubelet[3207]: I0515 12:36:39.845656 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8535cc-4665-4d1e-a5d8-657a84679149-tigera-ca-bundle\") pod \"6c8535cc-4665-4d1e-a5d8-657a84679149\" (UID: \"6c8535cc-4665-4d1e-a5d8-657a84679149\") "
May 15 12:36:39.845762 kubelet[3207]: I0515 12:36:39.845680 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6g7q\" (UniqueName: \"kubernetes.io/projected/6c8535cc-4665-4d1e-a5d8-657a84679149-kube-api-access-k6g7q\") pod \"6c8535cc-4665-4d1e-a5d8-657a84679149\" (UID: \"6c8535cc-4665-4d1e-a5d8-657a84679149\") "
May 15 12:36:39.853813 kubelet[3207]: I0515 12:36:39.853674 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c8535cc-4665-4d1e-a5d8-657a84679149-kube-api-access-k6g7q" (OuterVolumeSpecName: "kube-api-access-k6g7q") pod "6c8535cc-4665-4d1e-a5d8-657a84679149" (UID: "6c8535cc-4665-4d1e-a5d8-657a84679149"). InnerVolumeSpecName "kube-api-access-k6g7q". PluginName "kubernetes.io/projected", VolumeGidValue ""
May 15 12:36:39.854387 systemd[1]: var-lib-kubelet-pods-6c8535cc\x2d4665\x2d4d1e\x2da5d8\x2d657a84679149-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dk6g7q.mount: Deactivated successfully.
May 15 12:36:39.858174 systemd[1]: var-lib-kubelet-pods-6c8535cc\x2d4665\x2d4d1e\x2da5d8\x2d657a84679149-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully.
May 15 12:36:39.861910 systemd[1]: var-lib-kubelet-pods-6c8535cc\x2d4665\x2d4d1e\x2da5d8\x2d657a84679149-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully.
May 15 12:36:39.862281 kubelet[3207]: I0515 12:36:39.862123 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c8535cc-4665-4d1e-a5d8-657a84679149-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "6c8535cc-4665-4d1e-a5d8-657a84679149" (UID: "6c8535cc-4665-4d1e-a5d8-657a84679149"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
May 15 12:36:39.865582 kubelet[3207]: I0515 12:36:39.865554 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c8535cc-4665-4d1e-a5d8-657a84679149-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "6c8535cc-4665-4d1e-a5d8-657a84679149" (UID: "6c8535cc-4665-4d1e-a5d8-657a84679149"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.817 [INFO][6037] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.818 [INFO][6037] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" iface="eth0" netns="/var/run/netns/cni-7eed8ec4-8ceb-5fea-309e-cc4e4ef87f2b"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.818 [INFO][6037] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" iface="eth0" netns="/var/run/netns/cni-7eed8ec4-8ceb-5fea-309e-cc4e4ef87f2b"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.826 [INFO][6037] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" after=8.152425ms iface="eth0" netns="/var/run/netns/cni-7eed8ec4-8ceb-5fea-309e-cc4e4ef87f2b"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.826 [INFO][6037] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.826 [INFO][6037] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.845 [INFO][6071] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.846 [INFO][6071] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.846 [INFO][6071] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.886 [INFO][6071] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.886 [INFO][6071] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0"
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.889 [INFO][6071] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:36:39.893460 containerd[1571]: 2025-05-15 12:36:39.891 [INFO][6037] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006"
May 15 12:36:39.895606 containerd[1571]: time="2025-05-15T12:36:39.895454845Z" level=info msg="TearDown network for sandbox \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" successfully"
May 15 12:36:39.895606 containerd[1571]: time="2025-05-15T12:36:39.895501613Z" level=info msg="StopPodSandbox for \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" returns successfully"
May 15 12:36:39.896437 systemd[1]: run-netns-cni\x2d7eed8ec4\x2d8ceb\x2d5fea\x2d309e\x2dcc4e4ef87f2b.mount: Deactivated successfully.
May 15 12:36:39.946495 kubelet[3207]: I0515 12:36:39.946447 3207 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-k6g7q\" (UniqueName: \"kubernetes.io/projected/6c8535cc-4665-4d1e-a5d8-657a84679149-kube-api-access-k6g7q\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\""
May 15 12:36:39.946495 kubelet[3207]: I0515 12:36:39.946487 3207 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6c8535cc-4665-4d1e-a5d8-657a84679149-typha-certs\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\""
May 15 12:36:39.946675 kubelet[3207]: I0515 12:36:39.946498 3207 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c8535cc-4665-4d1e-a5d8-657a84679149-tigera-ca-bundle\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\""
May 15 12:36:40.046763 kubelet[3207]: I0515 12:36:40.046713 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf7nc\" (UniqueName: \"kubernetes.io/projected/0cea196a-c35a-4ed1-aba8-1434c056f4d9-kube-api-access-wf7nc\") pod \"0cea196a-c35a-4ed1-aba8-1434c056f4d9\" (UID: \"0cea196a-c35a-4ed1-aba8-1434c056f4d9\") "
May 15 12:36:40.046763 kubelet[3207]: I0515 12:36:40.046771 3207 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cea196a-c35a-4ed1-aba8-1434c056f4d9-tigera-ca-bundle\") pod \"0cea196a-c35a-4ed1-aba8-1434c056f4d9\" (UID: \"0cea196a-c35a-4ed1-aba8-1434c056f4d9\") "
May 15 12:36:40.050376 kubelet[3207]: I0515 12:36:40.050312 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cea196a-c35a-4ed1-aba8-1434c056f4d9-kube-api-access-wf7nc" (OuterVolumeSpecName: "kube-api-access-wf7nc") pod "0cea196a-c35a-4ed1-aba8-1434c056f4d9" (UID: "0cea196a-c35a-4ed1-aba8-1434c056f4d9"). InnerVolumeSpecName "kube-api-access-wf7nc". PluginName "kubernetes.io/projected", VolumeGidValue ""
May 15 12:36:40.051027 kubelet[3207]: I0515 12:36:40.051005 3207 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cea196a-c35a-4ed1-aba8-1434c056f4d9-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "0cea196a-c35a-4ed1-aba8-1434c056f4d9" (UID: "0cea196a-c35a-4ed1-aba8-1434c056f4d9"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
May 15 12:36:40.147501 kubelet[3207]: I0515 12:36:40.147455 3207 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cea196a-c35a-4ed1-aba8-1434c056f4d9-tigera-ca-bundle\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\""
May 15 12:36:40.147634 kubelet[3207]: I0515 12:36:40.147503 3207 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-wf7nc\" (UniqueName: \"kubernetes.io/projected/0cea196a-c35a-4ed1-aba8-1434c056f4d9-kube-api-access-wf7nc\") on node \"ci-4334-0-0-a-dce95649a9\" DevicePath \"\""
May 15 12:36:40.566591 kubelet[3207]: I0515 12:36:40.564100 3207 scope.go:117] "RemoveContainer" containerID="23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369"
May 15 12:36:40.578434 systemd[1]: Removed slice kubepods-besteffort-pod6c8535cc_4665_4d1e_a5d8_657a84679149.slice - libcontainer container kubepods-besteffort-pod6c8535cc_4665_4d1e_a5d8_657a84679149.slice.
May 15 12:36:40.578605 systemd[1]: kubepods-besteffort-pod6c8535cc_4665_4d1e_a5d8_657a84679149.slice: Consumed 241ms CPU time, 37.6M memory peak, 26.1M read from disk.
May 15 12:36:40.586103 containerd[1571]: time="2025-05-15T12:36:40.583804380Z" level=info msg="RemoveContainer for \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\""
May 15 12:36:40.599771 containerd[1571]: time="2025-05-15T12:36:40.599725058Z" level=info msg="RemoveContainer for \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" returns successfully"
May 15 12:36:40.601989 containerd[1571]: time="2025-05-15T12:36:40.601937561Z" level=info msg="CreateContainer within sandbox \"27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}"
May 15 12:36:40.604639 kubelet[3207]: I0515 12:36:40.604430 3207 scope.go:117] "RemoveContainer" containerID="23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369"
May 15 12:36:40.605008 containerd[1571]: time="2025-05-15T12:36:40.604973821Z" level=error msg="ContainerStatus for \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\": not found"
May 15 12:36:40.605323 kubelet[3207]: E0515 12:36:40.605119 3207 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\": not found" containerID="23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369"
May 15 12:36:40.605323 kubelet[3207]: I0515 12:36:40.605140 3207 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369"} err="failed to get container status \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\": rpc error: code = NotFound desc = an error occurred when try to find container \"23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369\": not found"
May 15 12:36:40.605323 kubelet[3207]: I0515 12:36:40.605176 3207 scope.go:117] "RemoveContainer" containerID="dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818"
May 15 12:36:40.606812 systemd[1]: Removed slice kubepods-besteffort-pod0cea196a_c35a_4ed1_aba8_1434c056f4d9.slice - libcontainer container kubepods-besteffort-pod0cea196a_c35a_4ed1_aba8_1434c056f4d9.slice.
May 15 12:36:40.607068 systemd[1]: kubepods-besteffort-pod0cea196a_c35a_4ed1_aba8_1434c056f4d9.slice: Consumed 146ms CPU time, 14.1M memory peak, 1.5M read from disk, 20K written to disk.
May 15 12:36:40.612528 containerd[1571]: time="2025-05-15T12:36:40.612306057Z" level=info msg="RemoveContainer for \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\""
May 15 12:36:40.624725 containerd[1571]: time="2025-05-15T12:36:40.624699252Z" level=info msg="RemoveContainer for \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" returns successfully"
May 15 12:36:40.628002 kubelet[3207]: I0515 12:36:40.627937 3207 scope.go:117] "RemoveContainer" containerID="dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818"
May 15 12:36:40.630389 containerd[1571]: time="2025-05-15T12:36:40.630043565Z" level=error msg="ContainerStatus for \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\": not found"
May 15 12:36:40.632350 kubelet[3207]: E0515 12:36:40.632284 3207 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\": not found" containerID="dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818"
May 15 12:36:40.632533 kubelet[3207]: I0515 12:36:40.632490 3207 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818"} err="failed to get container status \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\": rpc error: code = NotFound desc = an error occurred when try to find container \"dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818\": not found"
May 15 12:36:40.644946 containerd[1571]: time="2025-05-15T12:36:40.644878185Z" level=info msg="Container ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718: CDI devices from CRI Config.CDIDevices: []"
May 15 12:36:40.656732 containerd[1571]: time="2025-05-15T12:36:40.656691934Z" level=info msg="CreateContainer within sandbox \"27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\""
May 15 12:36:40.658121 containerd[1571]: time="2025-05-15T12:36:40.658096531Z" level=info msg="StartContainer for \"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\""
May 15 12:36:40.663627 containerd[1571]: time="2025-05-15T12:36:40.663540501Z" level=info msg="connecting to shim ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718" address="unix:///run/containerd/s/7ad2e154d44428339a378ea83ffda90016a5ee4efe007547df15b0cd5da18b4a" protocol=ttrpc version=3
May 15 12:36:40.692611 systemd[1]: Started cri-containerd-ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718.scope - libcontainer container ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718.
May 15 12:36:40.733148 containerd[1571]: time="2025-05-15T12:36:40.733098375Z" level=info msg="StartContainer for \"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" returns successfully"
May 15 12:36:40.815599 systemd[1]: var-lib-kubelet-pods-0cea196a\x2dc35a\x2d4ed1\x2daba8\x2d1434c056f4d9-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully.
May 15 12:36:40.816393 systemd[1]: var-lib-kubelet-pods-0cea196a\x2dc35a\x2d4ed1\x2daba8\x2d1434c056f4d9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dwf7nc.mount: Deactivated successfully.
May 15 12:36:40.940039 kubelet[3207]: I0515 12:36:40.939962 3207 topology_manager.go:215] "Topology Admit Handler" podUID="af86d615-c3b2-4b22-b128-f6697ae70d07" podNamespace="calico-system" podName="calico-typha-588588989-hhb74"
May 15 12:36:40.940039 kubelet[3207]: E0515 12:36:40.940030 3207 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="6c8535cc-4665-4d1e-a5d8-657a84679149" containerName="calico-typha"
May 15 12:36:40.940039 kubelet[3207]: E0515 12:36:40.940040 3207 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="0cea196a-c35a-4ed1-aba8-1434c056f4d9" containerName="calico-kube-controllers"
May 15 12:36:40.941719 kubelet[3207]: I0515 12:36:40.940063 3207 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c8535cc-4665-4d1e-a5d8-657a84679149" containerName="calico-typha"
May 15 12:36:40.941719 kubelet[3207]: I0515 12:36:40.940067 3207 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cea196a-c35a-4ed1-aba8-1434c056f4d9" containerName="calico-kube-controllers"
May 15 12:36:40.954058 systemd[1]: Created slice kubepods-besteffort-podaf86d615_c3b2_4b22_b128_f6697ae70d07.slice - libcontainer container kubepods-besteffort-podaf86d615_c3b2_4b22_b128_f6697ae70d07.slice.
May 15 12:36:41.052235 kubelet[3207]: I0515 12:36:41.052150 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/af86d615-c3b2-4b22-b128-f6697ae70d07-tigera-ca-bundle\") pod \"calico-typha-588588989-hhb74\" (UID: \"af86d615-c3b2-4b22-b128-f6697ae70d07\") " pod="calico-system/calico-typha-588588989-hhb74"
May 15 12:36:41.052235 kubelet[3207]: I0515 12:36:41.052202 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/af86d615-c3b2-4b22-b128-f6697ae70d07-typha-certs\") pod \"calico-typha-588588989-hhb74\" (UID: \"af86d615-c3b2-4b22-b128-f6697ae70d07\") " pod="calico-system/calico-typha-588588989-hhb74"
May 15 12:36:41.052235 kubelet[3207]: I0515 12:36:41.052226 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm4ws\" (UniqueName: \"kubernetes.io/projected/af86d615-c3b2-4b22-b128-f6697ae70d07-kube-api-access-gm4ws\") pod \"calico-typha-588588989-hhb74\" (UID: \"af86d615-c3b2-4b22-b128-f6697ae70d07\") " pod="calico-system/calico-typha-588588989-hhb74"
May 15 12:36:41.125530 kubelet[3207]: I0515 12:36:41.125490 3207 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cea196a-c35a-4ed1-aba8-1434c056f4d9" path="/var/lib/kubelet/pods/0cea196a-c35a-4ed1-aba8-1434c056f4d9/volumes"
May 15 12:36:41.126208 kubelet[3207]: I0515 12:36:41.126167 3207 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c8535cc-4665-4d1e-a5d8-657a84679149" path="/var/lib/kubelet/pods/6c8535cc-4665-4d1e-a5d8-657a84679149/volumes"
May 15 12:36:41.277687 containerd[1571]: time="2025-05-15T12:36:41.277271408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-588588989-hhb74,Uid:af86d615-c3b2-4b22-b128-f6697ae70d07,Namespace:calico-system,Attempt:0,}"
May 15 12:36:41.299353 containerd[1571]: time="2025-05-15T12:36:41.299217788Z" level=info msg="connecting to shim 4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68" address="unix:///run/containerd/s/af6a63bdf978fdd85985f509bbc745acb25f92794f38271145369caa73132228" namespace=k8s.io protocol=ttrpc version=3
May 15 12:36:41.324491 systemd[1]: Started cri-containerd-4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68.scope - libcontainer container 4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68.
May 15 12:36:41.366822 containerd[1571]: time="2025-05-15T12:36:41.366778552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-588588989-hhb74,Uid:af86d615-c3b2-4b22-b128-f6697ae70d07,Namespace:calico-system,Attempt:0,} returns sandbox id \"4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68\""
May 15 12:36:41.372939 containerd[1571]: time="2025-05-15T12:36:41.372918759Z" level=info msg="CreateContainer within sandbox \"4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
May 15 12:36:41.378053 containerd[1571]: time="2025-05-15T12:36:41.378030717Z" level=info msg="Container 9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1: CDI devices from CRI Config.CDIDevices: []"
May 15 12:36:41.383265 containerd[1571]: time="2025-05-15T12:36:41.383228074Z" level=info msg="CreateContainer within sandbox \"4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1\""
May 15 12:36:41.384029 containerd[1571]: time="2025-05-15T12:36:41.383826016Z" level=info msg="StartContainer for \"9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1\""
May 15 12:36:41.385064 containerd[1571]: time="2025-05-15T12:36:41.385009567Z" level=info msg="connecting to shim 9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1" address="unix:///run/containerd/s/af6a63bdf978fdd85985f509bbc745acb25f92794f38271145369caa73132228" protocol=ttrpc version=3
May 15 12:36:41.404448 systemd[1]: Started cri-containerd-9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1.scope - libcontainer container 9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1.
May 15 12:36:41.447204 containerd[1571]: time="2025-05-15T12:36:41.447156759Z" level=info msg="StartContainer for \"9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1\" returns successfully"
May 15 12:36:41.635155 kubelet[3207]: I0515 12:36:41.633792 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-gnw2x" podStartSLOduration=4.63377899 podStartE2EDuration="4.63377899s" podCreationTimestamp="2025-05-15 12:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:36:41.632248417 +0000 UTC m=+88.592047241" watchObservedRunningTime="2025-05-15 12:36:41.63377899 +0000 UTC m=+88.593577814"
May 15 12:36:41.705851 containerd[1571]: time="2025-05-15T12:36:41.705807200Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"41b2d0a73038a7b37e1a72ae825df0b68794291e420f5f15b9c667e37549890c\" pid:6231 exit_status:1 exited_at:{seconds:1747312601 nanos:705525892}"
May 15 12:36:41.937827 kubelet[3207]: I0515 12:36:41.937769 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-588588989-hhb74" podStartSLOduration=4.937750736 podStartE2EDuration="4.937750736s" podCreationTimestamp="2025-05-15 12:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:36:41.654559533 +0000 UTC m=+88.614358367" watchObservedRunningTime="2025-05-15 12:36:41.937750736 +0000 UTC m=+88.897549570"
May 15 12:36:41.938047 kubelet[3207]: I0515 12:36:41.937904 3207 topology_manager.go:215] "Topology Admit Handler" podUID="c6710ce4-bab8-40e8-bf93-a39c0fce1f08" podNamespace="calico-system" podName="calico-kube-controllers-58c86669fc-58trq"
May 15 12:36:41.945671 systemd[1]: Created slice kubepods-besteffort-podc6710ce4_bab8_40e8_bf93_a39c0fce1f08.slice - libcontainer container kubepods-besteffort-podc6710ce4_bab8_40e8_bf93_a39c0fce1f08.slice.
May 15 12:36:42.060339 kubelet[3207]: I0515 12:36:42.060232 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7ss2\" (UniqueName: \"kubernetes.io/projected/c6710ce4-bab8-40e8-bf93-a39c0fce1f08-kube-api-access-d7ss2\") pod \"calico-kube-controllers-58c86669fc-58trq\" (UID: \"c6710ce4-bab8-40e8-bf93-a39c0fce1f08\") " pod="calico-system/calico-kube-controllers-58c86669fc-58trq"
May 15 12:36:42.060339 kubelet[3207]: I0515 12:36:42.060269 3207 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c6710ce4-bab8-40e8-bf93-a39c0fce1f08-tigera-ca-bundle\") pod \"calico-kube-controllers-58c86669fc-58trq\" (UID: \"c6710ce4-bab8-40e8-bf93-a39c0fce1f08\") " pod="calico-system/calico-kube-controllers-58c86669fc-58trq"
May 15 12:36:42.248563 containerd[1571]: time="2025-05-15T12:36:42.248458033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c86669fc-58trq,Uid:c6710ce4-bab8-40e8-bf93-a39c0fce1f08,Namespace:calico-system,Attempt:0,}"
May 15 12:36:42.358604 systemd-networkd[1471]: cali87e0cb6e8fb: Link UP
May 15 12:36:42.359814 systemd-networkd[1471]: cali87e0cb6e8fb: Gained carrier
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.287 [INFO][6335] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0 calico-kube-controllers-58c86669fc- calico-system c6710ce4-bab8-40e8-bf93-a39c0fce1f08 1121 0 2025-05-15 12:36:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:58c86669fc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4334-0-0-a-dce95649a9 calico-kube-controllers-58c86669fc-58trq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali87e0cb6e8fb [] []}} ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.287 [INFO][6335] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.310 [INFO][6347] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" HandleID="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.321 [INFO][6347] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" HandleID="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031c7a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334-0-0-a-dce95649a9", "pod":"calico-kube-controllers-58c86669fc-58trq", "timestamp":"2025-05-15 12:36:42.310642654 +0000 UTC"}, Hostname:"ci-4334-0-0-a-dce95649a9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.321 [INFO][6347] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.321 [INFO][6347] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.321 [INFO][6347] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-dce95649a9'
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.323 [INFO][6347] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.327 [INFO][6347] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.332 [INFO][6347] ipam/ipam.go 489: Trying affinity for 192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.334 [INFO][6347] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.336 [INFO][6347] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.192/26 host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.336 [INFO][6347] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.192/26 handle="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.338 [INFO][6347] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.343 [INFO][6347] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.192/26 handle="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.352 [INFO][6347] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.201/26] block=192.168.88.192/26 handle="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.352 [INFO][6347] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.201/26] handle="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" host="ci-4334-0-0-a-dce95649a9"
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.352 [INFO][6347] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 15 12:36:42.386483 containerd[1571]: 2025-05-15 12:36:42.352 [INFO][6347] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.201/26] IPv6=[] ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" HandleID="k8s-pod-network.47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0"
May 15 12:36:42.386955 containerd[1571]: 2025-05-15 12:36:42.355 [INFO][6335] cni-plugin/k8s.go 386: Populated endpoint ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0", GenerateName:"calico-kube-controllers-58c86669fc-", Namespace:"calico-system", SelfLink:"", UID:"c6710ce4-bab8-40e8-bf93-a39c0fce1f08", ResourceVersion:"1121", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 36, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c86669fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"", Pod:"calico-kube-controllers-58c86669fc-58trq", Endpoint:"eth0",
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.201/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali87e0cb6e8fb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:42.386955 containerd[1571]: 2025-05-15 12:36:42.355 [INFO][6335] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.201/32] ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0" May 15 12:36:42.386955 containerd[1571]: 2025-05-15 12:36:42.355 [INFO][6335] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali87e0cb6e8fb ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0" May 15 12:36:42.386955 containerd[1571]: 2025-05-15 12:36:42.360 [INFO][6335] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0" May 15 12:36:42.386955 containerd[1571]: 2025-05-15 12:36:42.361 [INFO][6335] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0", GenerateName:"calico-kube-controllers-58c86669fc-", Namespace:"calico-system", SelfLink:"", UID:"c6710ce4-bab8-40e8-bf93-a39c0fce1f08", ResourceVersion:"1121", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 12, 36, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"58c86669fc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-dce95649a9", ContainerID:"47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b", Pod:"calico-kube-controllers-58c86669fc-58trq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.201/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali87e0cb6e8fb", MAC:"26:4e:9d:9c:f3:75", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 12:36:42.386955 containerd[1571]: 2025-05-15 12:36:42.380 [INFO][6335] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" Namespace="calico-system" Pod="calico-kube-controllers-58c86669fc-58trq" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--58c86669fc--58trq-eth0" May 15 
12:36:42.417000 containerd[1571]: time="2025-05-15T12:36:42.416954845Z" level=info msg="connecting to shim 47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b" address="unix:///run/containerd/s/6ff405b3753a2f02c47cc1a0e177a7989dcba78289fa9a5c75b0ef15c85553ea" namespace=k8s.io protocol=ttrpc version=3 May 15 12:36:42.462461 systemd[1]: Started cri-containerd-47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b.scope - libcontainer container 47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b. May 15 12:36:42.500466 containerd[1571]: time="2025-05-15T12:36:42.500276115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-58c86669fc-58trq,Uid:c6710ce4-bab8-40e8-bf93-a39c0fce1f08,Namespace:calico-system,Attempt:0,} returns sandbox id \"47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b\"" May 15 12:36:42.508759 containerd[1571]: time="2025-05-15T12:36:42.508728985Z" level=info msg="CreateContainer within sandbox \"47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 15 12:36:42.514343 containerd[1571]: time="2025-05-15T12:36:42.514312647Z" level=info msg="Container 5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:42.520584 containerd[1571]: time="2025-05-15T12:36:42.520557971Z" level=info msg="CreateContainer within sandbox \"47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\"" May 15 12:36:42.520992 containerd[1571]: time="2025-05-15T12:36:42.520968963Z" level=info msg="StartContainer for \"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\"" May 15 12:36:42.521770 containerd[1571]: time="2025-05-15T12:36:42.521743878Z" level=info 
msg="connecting to shim 5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119" address="unix:///run/containerd/s/6ff405b3753a2f02c47cc1a0e177a7989dcba78289fa9a5c75b0ef15c85553ea" protocol=ttrpc version=3 May 15 12:36:42.539453 systemd[1]: Started cri-containerd-5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119.scope - libcontainer container 5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119. May 15 12:36:42.585488 containerd[1571]: time="2025-05-15T12:36:42.585456888Z" level=info msg="StartContainer for \"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" returns successfully" May 15 12:36:42.653852 kubelet[3207]: I0515 12:36:42.653770 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-58c86669fc-58trq" podStartSLOduration=2.653751518 podStartE2EDuration="2.653751518s" podCreationTimestamp="2025-05-15 12:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 12:36:42.651046871 +0000 UTC m=+89.610845715" watchObservedRunningTime="2025-05-15 12:36:42.653751518 +0000 UTC m=+89.613550343" May 15 12:36:42.753425 containerd[1571]: time="2025-05-15T12:36:42.753318798Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"80073261fa473767745ee21c73f347e7385271905599c99e893f9ceb04718032\" pid:6468 exit_status:1 exited_at:{seconds:1747312602 nanos:752995460}" May 15 12:36:42.755678 containerd[1571]: time="2025-05-15T12:36:42.755611261Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"aa7b6f126c94fc537ed583d0e474edb740c62391fb35c92193e6370df66dc00f\" pid:6477 exit_status:1 exited_at:{seconds:1747312602 nanos:755105321}" May 15 12:36:43.429655 systemd-networkd[1471]: cali87e0cb6e8fb: 
Gained IPv6LL May 15 12:36:43.716819 containerd[1571]: time="2025-05-15T12:36:43.716734461Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"9b2fee26260c49de58fa244b355f74e50e5a5112e736ed7136206dc2f1a3476f\" pid:6585 exit_status:1 exited_at:{seconds:1747312603 nanos:716531200}" May 15 12:36:44.671598 containerd[1571]: time="2025-05-15T12:36:44.671559249Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"eec202d4af63201e3f8e01576ee4ba90a3edad9aab0a168b86e8fe09c87fd6f2\" pid:6657 exit_status:1 exited_at:{seconds:1747312604 nanos:671392085}" May 15 12:36:48.128713 containerd[1571]: time="2025-05-15T12:36:48.128663719Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:48.129846 containerd[1571]: time="2025-05-15T12:36:48.129758244Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 15 12:36:48.130854 containerd[1571]: time="2025-05-15T12:36:48.130816010Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:48.132417 containerd[1571]: time="2025-05-15T12:36:48.132374525Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 12:36:48.132998 containerd[1571]: time="2025-05-15T12:36:48.132939245Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 11.667143496s" May 15 12:36:48.133100 containerd[1571]: time="2025-05-15T12:36:48.133078597Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 15 12:36:48.135485 containerd[1571]: time="2025-05-15T12:36:48.135412940Z" level=info msg="CreateContainer within sandbox \"fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 15 12:36:48.146211 containerd[1571]: time="2025-05-15T12:36:48.145078354Z" level=info msg="Container 6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd: CDI devices from CRI Config.CDIDevices: []" May 15 12:36:48.237193 containerd[1571]: time="2025-05-15T12:36:48.237136311Z" level=info msg="CreateContainer within sandbox \"fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd\"" May 15 12:36:48.237935 containerd[1571]: time="2025-05-15T12:36:48.237907931Z" level=info msg="StartContainer for \"6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd\"" May 15 12:36:48.239283 containerd[1571]: time="2025-05-15T12:36:48.239260189Z" level=info msg="connecting to shim 6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd" address="unix:///run/containerd/s/234b5e5ec00e1b6e6ba7b26814c525a9e6271e5ba24bac9acc45a7194dbe8b8a" protocol=ttrpc version=3 May 15 12:36:48.269530 systemd[1]: Started cri-containerd-6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd.scope - libcontainer container 
6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd. May 15 12:36:48.327838 containerd[1571]: time="2025-05-15T12:36:48.327689945Z" level=info msg="StartContainer for \"6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd\" returns successfully" May 15 12:36:48.666011 kubelet[3207]: I0515 12:36:48.665953 3207 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-2gvxv" podStartSLOduration=40.881311293 podStartE2EDuration="1m12.665598662s" podCreationTimestamp="2025-05-15 12:35:36 +0000 UTC" firstStartedPulling="2025-05-15 12:36:16.349598855 +0000 UTC m=+63.309397679" lastFinishedPulling="2025-05-15 12:36:48.133886223 +0000 UTC m=+95.093685048" observedRunningTime="2025-05-15 12:36:48.665446677 +0000 UTC m=+95.625245501" watchObservedRunningTime="2025-05-15 12:36:48.665598662 +0000 UTC m=+95.625397496" May 15 12:36:49.483644 kubelet[3207]: I0515 12:36:49.483592 3207 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 15 12:36:49.483644 kubelet[3207]: I0515 12:36:49.483647 3207 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 15 12:37:08.360564 containerd[1571]: time="2025-05-15T12:37:08.360453283Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"8762091cec34d6737f693c3ec7ff9bb7b8e4a662ee5b1f6c51e0fda538298943\" pid:6744 exited_at:{seconds:1747312628 nanos:360083829}" May 15 12:37:12.296719 containerd[1571]: time="2025-05-15T12:37:12.296579172Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"30cd8eb198a6daae61e3407141273fb3318b3eb056ce5e9684dc3acdc7e0514c\" pid:6769 
exited_at:{seconds:1747312632 nanos:295552514}" May 15 12:37:13.163955 containerd[1571]: time="2025-05-15T12:37:13.163921765Z" level=info msg="StopPodSandbox for \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\"" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.254 [WARNING][6792] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.255 [INFO][6792] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.255 [INFO][6792] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" iface="eth0" netns="" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.255 [INFO][6792] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.255 [INFO][6792] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.298 [INFO][6799] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.298 [INFO][6799] ipam/ipam_plugin.go 353: About to 
acquire host-wide IPAM lock. May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.298 [INFO][6799] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.308 [WARNING][6799] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.308 [INFO][6799] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.310 [INFO][6799] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:37:13.314809 containerd[1571]: 2025-05-15 12:37:13.312 [INFO][6792] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.318598 containerd[1571]: time="2025-05-15T12:37:13.314836389Z" level=info msg="TearDown network for sandbox \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" successfully" May 15 12:37:13.318598 containerd[1571]: time="2025-05-15T12:37:13.314858511Z" level=info msg="StopPodSandbox for \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" returns successfully" May 15 12:37:13.356922 containerd[1571]: time="2025-05-15T12:37:13.356858009Z" level=info msg="RemovePodSandbox for \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\"" May 15 12:37:13.360321 containerd[1571]: time="2025-05-15T12:37:13.360248202Z" level=info msg="Forcibly stopping sandbox \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\"" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.409 [WARNING][6817] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.409 [INFO][6817] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.409 [INFO][6817] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" iface="eth0" netns="" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.409 [INFO][6817] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.409 [INFO][6817] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.444 [INFO][6824] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.445 [INFO][6824] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.445 [INFO][6824] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.451 [WARNING][6824] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.451 [INFO][6824] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" HandleID="k8s-pod-network.565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--kube--controllers--8b9db6c54--8mmnz-eth0" May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.453 [INFO][6824] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:37:13.459257 containerd[1571]: 2025-05-15 12:37:13.455 [INFO][6817] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006" May 15 12:37:13.460598 containerd[1571]: time="2025-05-15T12:37:13.459253717Z" level=info msg="TearDown network for sandbox \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" successfully" May 15 12:37:13.465845 containerd[1571]: time="2025-05-15T12:37:13.465789585Z" level=info msg="Ensure that sandbox 565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 in task-service has been cleanup successfully" May 15 12:37:13.470853 containerd[1571]: time="2025-05-15T12:37:13.470802255Z" level=info msg="RemovePodSandbox \"565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006\" returns successfully" May 15 12:37:13.471539 containerd[1571]: time="2025-05-15T12:37:13.471499093Z" level=info msg="StopPodSandbox for \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\"" May 15 12:37:13.471795 containerd[1571]: time="2025-05-15T12:37:13.471766946Z" level=info msg="TearDown network for sandbox 
\"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" successfully" May 15 12:37:13.471795 containerd[1571]: time="2025-05-15T12:37:13.471787414Z" level=info msg="StopPodSandbox for \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" returns successfully" May 15 12:37:13.472084 containerd[1571]: time="2025-05-15T12:37:13.472058923Z" level=info msg="RemovePodSandbox for \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\"" May 15 12:37:13.472244 containerd[1571]: time="2025-05-15T12:37:13.472100231Z" level=info msg="Forcibly stopping sandbox \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\"" May 15 12:37:13.472244 containerd[1571]: time="2025-05-15T12:37:13.472187385Z" level=info msg="TearDown network for sandbox \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" successfully" May 15 12:37:13.476504 containerd[1571]: time="2025-05-15T12:37:13.476467238Z" level=info msg="Ensure that sandbox fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d in task-service has been cleanup successfully" May 15 12:37:13.480365 containerd[1571]: time="2025-05-15T12:37:13.480340931Z" level=info msg="RemovePodSandbox \"fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d\" returns successfully" May 15 12:37:13.480890 containerd[1571]: time="2025-05-15T12:37:13.480860585Z" level=info msg="StopPodSandbox for \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\"" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.522 [WARNING][6843] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.523 [INFO][6843] cni-plugin/k8s.go 608: Cleaning up netns 
ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.523 [INFO][6843] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" iface="eth0" netns="" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.523 [INFO][6843] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.523 [INFO][6843] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.544 [INFO][6850] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.544 [INFO][6850] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.544 [INFO][6850] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.550 [WARNING][6850] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.551 [INFO][6850] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.552 [INFO][6850] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:37:13.556374 containerd[1571]: 2025-05-15 12:37:13.554 [INFO][6843] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.556374 containerd[1571]: time="2025-05-15T12:37:13.555711589Z" level=info msg="TearDown network for sandbox \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" successfully" May 15 12:37:13.556374 containerd[1571]: time="2025-05-15T12:37:13.555732739Z" level=info msg="StopPodSandbox for \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" returns successfully" May 15 12:37:13.556374 containerd[1571]: time="2025-05-15T12:37:13.556077185Z" level=info msg="RemovePodSandbox for \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\"" May 15 12:37:13.556374 containerd[1571]: time="2025-05-15T12:37:13.556111299Z" level=info msg="Forcibly stopping sandbox \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\"" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.614 [WARNING][6869] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.614 [INFO][6869] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.614 [INFO][6869] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" iface="eth0" netns="" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.614 [INFO][6869] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.614 [INFO][6869] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.639 [INFO][6876] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.639 [INFO][6876] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.639 [INFO][6876] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.646 [WARNING][6876] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.646 [INFO][6876] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" HandleID="k8s-pod-network.9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--psvtq-eth0" May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.647 [INFO][6876] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:37:13.651120 containerd[1571]: 2025-05-15 12:37:13.649 [INFO][6869] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664" May 15 12:37:13.652412 containerd[1571]: time="2025-05-15T12:37:13.651196024Z" level=info msg="TearDown network for sandbox \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" successfully" May 15 12:37:13.653554 containerd[1571]: time="2025-05-15T12:37:13.653387678Z" level=info msg="Ensure that sandbox 9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 in task-service has been cleanup successfully" May 15 12:37:13.656839 containerd[1571]: time="2025-05-15T12:37:13.656705196Z" level=info msg="RemovePodSandbox \"9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664\" returns successfully" May 15 12:37:13.658406 containerd[1571]: time="2025-05-15T12:37:13.658369900Z" level=info msg="StopPodSandbox for \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\"" May 15 12:37:13.658508 containerd[1571]: time="2025-05-15T12:37:13.658485387Z" level=info msg="TearDown network for sandbox 
\"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" successfully" May 15 12:37:13.658508 containerd[1571]: time="2025-05-15T12:37:13.658500816Z" level=info msg="StopPodSandbox for \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" returns successfully" May 15 12:37:13.658976 containerd[1571]: time="2025-05-15T12:37:13.658931515Z" level=info msg="RemovePodSandbox for \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\"" May 15 12:37:13.658976 containerd[1571]: time="2025-05-15T12:37:13.658953135Z" level=info msg="Forcibly stopping sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\"" May 15 12:37:13.659060 containerd[1571]: time="2025-05-15T12:37:13.659002979Z" level=info msg="TearDown network for sandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" successfully" May 15 12:37:13.661539 containerd[1571]: time="2025-05-15T12:37:13.661419254Z" level=info msg="Ensure that sandbox 707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7 in task-service has been cleanup successfully" May 15 12:37:13.670487 containerd[1571]: time="2025-05-15T12:37:13.670453944Z" level=info msg="RemovePodSandbox \"707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7\" returns successfully" May 15 12:37:13.671248 containerd[1571]: time="2025-05-15T12:37:13.670921702Z" level=info msg="StopPodSandbox for \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\"" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.712 [WARNING][6894] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.712 [INFO][6894] cni-plugin/k8s.go 608: Cleaning up netns 
ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.712 [INFO][6894] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" iface="eth0" netns="" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.712 [INFO][6894] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.712 [INFO][6894] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.735 [INFO][6901] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.736 [INFO][6901] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.736 [INFO][6901] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.741 [WARNING][6901] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.741 [INFO][6901] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.743 [INFO][6901] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:37:13.747719 containerd[1571]: 2025-05-15 12:37:13.745 [INFO][6894] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.748222 containerd[1571]: time="2025-05-15T12:37:13.748133286Z" level=info msg="TearDown network for sandbox \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" successfully" May 15 12:37:13.748222 containerd[1571]: time="2025-05-15T12:37:13.748161760Z" level=info msg="StopPodSandbox for \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" returns successfully" May 15 12:37:13.749541 containerd[1571]: time="2025-05-15T12:37:13.749068602Z" level=info msg="RemovePodSandbox for \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\"" May 15 12:37:13.749541 containerd[1571]: time="2025-05-15T12:37:13.749126481Z" level=info msg="Forcibly stopping sandbox \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\"" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.789 [WARNING][6920] cni-plugin/k8s.go 566: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" WorkloadEndpoint="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.790 [INFO][6920] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.790 [INFO][6920] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" iface="eth0" netns="" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.790 [INFO][6920] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.790 [INFO][6920] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.810 [INFO][6927] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.810 [INFO][6927] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.810 [INFO][6927] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.815 [WARNING][6927] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.815 [INFO][6927] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" HandleID="k8s-pod-network.add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" Workload="ci--4334--0--0--a--dce95649a9-k8s-calico--apiserver--9bffb6db8--8dkvq-eth0" May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.817 [INFO][6927] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 12:37:13.820986 containerd[1571]: 2025-05-15 12:37:13.819 [INFO][6920] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8" May 15 12:37:13.822856 containerd[1571]: time="2025-05-15T12:37:13.821236048Z" level=info msg="TearDown network for sandbox \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" successfully" May 15 12:37:13.823400 containerd[1571]: time="2025-05-15T12:37:13.823240070Z" level=info msg="Ensure that sandbox add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 in task-service has been cleanup successfully" May 15 12:37:13.826206 containerd[1571]: time="2025-05-15T12:37:13.826107463Z" level=info msg="RemovePodSandbox \"add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8\" returns successfully" May 15 12:37:23.720080 systemd[1]: Started sshd@8-37.27.185.109:22-219.127.7.87:49854.service - OpenSSH per-connection server daemon (219.127.7.87:49854). 
May 15 12:37:25.605980 sshd[6935]: Invalid user shiva from 219.127.7.87 port 49854 May 15 12:37:25.960934 sshd[6935]: Received disconnect from 219.127.7.87 port 49854:11: Bye Bye [preauth] May 15 12:37:25.960934 sshd[6935]: Disconnected from invalid user shiva 219.127.7.87 port 49854 [preauth] May 15 12:37:25.963353 systemd[1]: sshd@8-37.27.185.109:22-219.127.7.87:49854.service: Deactivated successfully. May 15 12:37:38.356304 containerd[1571]: time="2025-05-15T12:37:38.356257471Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"ce43ee978b97eeadf1915a20510ddf4e0656136ea8cded7ee1bd75a0a7d2d7d7\" pid:6964 exited_at:{seconds:1747312658 nanos:355889050}" May 15 12:37:42.302426 containerd[1571]: time="2025-05-15T12:37:42.296486998Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"b5e16c258a1e6dfd92dd0446025d59bb98965e1aca2d6531dcf91a3f437e04b0\" pid:7006 exited_at:{seconds:1747312662 nanos:296149645}" May 15 12:37:42.303116 containerd[1571]: time="2025-05-15T12:37:42.302448779Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"d3156533886e77151f6cc5b265d827eedb52c62d402ffc5abc7e455e93b8ca21\" pid:6997 exited_at:{seconds:1747312662 nanos:302085447}" May 15 12:38:08.347796 containerd[1571]: time="2025-05-15T12:38:08.347682049Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"76a90e6e6d90d316a52662e8bc40773cd0880dc4a43cf45c37b6f2dc908f0f90\" pid:7041 exited_at:{seconds:1747312688 nanos:347390391}" May 15 12:38:12.290625 containerd[1571]: time="2025-05-15T12:38:12.290438601Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" 
id:\"de59b55621602c6fd2b77ed1d1f08773addb7680ed6469d4b1e84ae80e2f91c8\" pid:7066 exited_at:{seconds:1747312692 nanos:290243825}" May 15 12:38:27.216752 systemd[1]: Started sshd@9-37.27.185.109:22-161.35.7.113:54618.service - OpenSSH per-connection server daemon (161.35.7.113:54618). May 15 12:38:27.945110 sshd[7096]: Received disconnect from 161.35.7.113 port 54618:11: Bye Bye [preauth] May 15 12:38:27.945110 sshd[7096]: Disconnected from authenticating user root 161.35.7.113 port 54618 [preauth] May 15 12:38:27.947450 systemd[1]: sshd@9-37.27.185.109:22-161.35.7.113:54618.service: Deactivated successfully. May 15 12:38:37.190460 systemd[1]: Started sshd@10-37.27.185.109:22-219.127.7.87:36984.service - OpenSSH per-connection server daemon (219.127.7.87:36984). May 15 12:38:38.353105 containerd[1571]: time="2025-05-15T12:38:38.352940479Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"98d9bd05cb5dffc31d51868906976a0c3e41e3f5e019c80e9e206c33f336e00d\" pid:7117 exited_at:{seconds:1747312718 nanos:352402220}" May 15 12:38:38.904907 sshd[7103]: Invalid user zhang from 219.127.7.87 port 36984 May 15 12:38:39.263485 sshd[7103]: Received disconnect from 219.127.7.87 port 36984:11: Bye Bye [preauth] May 15 12:38:39.263485 sshd[7103]: Disconnected from invalid user zhang 219.127.7.87 port 36984 [preauth] May 15 12:38:39.265889 systemd[1]: sshd@10-37.27.185.109:22-219.127.7.87:36984.service: Deactivated successfully. 
May 15 12:38:42.301983 containerd[1571]: time="2025-05-15T12:38:42.301932024Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"fddc418336820ef9fb31b90515e89aa27d42d0b063b31bb0f51c3d6cba0cb869\" pid:7159 exited_at:{seconds:1747312722 nanos:301239113}" May 15 12:38:42.311649 containerd[1571]: time="2025-05-15T12:38:42.311612943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"5719b68c2fa456ee67ab8c1d9dc1049e3b48132109cbf4b51a73208003f53696\" pid:7160 exited_at:{seconds:1747312722 nanos:311443826}" May 15 12:39:08.333537 containerd[1571]: time="2025-05-15T12:39:08.333478575Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"3ba801c3cbeb0ad96ef90736daeb7c3648178f45f272e67456b4c09093b51496\" pid:7193 exited_at:{seconds:1747312748 nanos:333153646}" May 15 12:39:12.289793 containerd[1571]: time="2025-05-15T12:39:12.289734512Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"d2419c7d4452325c87251b0cec99b4b4d4097a4ae3999eda5c65888305c0d7e7\" pid:7218 exited_at:{seconds:1747312752 nanos:289356083}" May 15 12:39:19.033917 systemd[1]: Started sshd@11-37.27.185.109:22-118.179.219.137:55048.service - OpenSSH per-connection server daemon (118.179.219.137:55048). May 15 12:39:20.075082 sshd[7230]: Invalid user alexandre from 118.179.219.137 port 55048 May 15 12:39:20.271831 sshd[7230]: Received disconnect from 118.179.219.137 port 55048:11: Bye Bye [preauth] May 15 12:39:20.271831 sshd[7230]: Disconnected from invalid user alexandre 118.179.219.137 port 55048 [preauth] May 15 12:39:20.273880 systemd[1]: sshd@11-37.27.185.109:22-118.179.219.137:55048.service: Deactivated successfully. 
May 15 12:39:28.160065 systemd[1]: Started sshd@12-37.27.185.109:22-219.127.7.87:48738.service - OpenSSH per-connection server daemon (219.127.7.87:48738). May 15 12:39:29.842156 sshd[7243]: Invalid user hjl from 219.127.7.87 port 48738 May 15 12:39:30.156012 sshd[7243]: Received disconnect from 219.127.7.87 port 48738:11: Bye Bye [preauth] May 15 12:39:30.156012 sshd[7243]: Disconnected from invalid user hjl 219.127.7.87 port 48738 [preauth] May 15 12:39:30.158060 systemd[1]: sshd@12-37.27.185.109:22-219.127.7.87:48738.service: Deactivated successfully. May 15 12:39:38.341851 containerd[1571]: time="2025-05-15T12:39:38.341800031Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"342130da57a6da6da7d664b6bc1a56f741351d16ca8f9e39d0b22e278d5ec1a7\" pid:7263 exited_at:{seconds:1747312778 nanos:341450555}" May 15 12:39:42.296860 containerd[1571]: time="2025-05-15T12:39:42.296807034Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"d992105c6fcbc2e38cf7019bb7673dfdf8ffc5a87a6e151646da2bcd4689fa12\" pid:7300 exited_at:{seconds:1747312782 nanos:295694014}" May 15 12:39:42.297663 containerd[1571]: time="2025-05-15T12:39:42.297135420Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"5df8ac95203a9d0db3c412c5b97c307ad85761e00094c76f7bd092568ca60998\" pid:7301 exited_at:{seconds:1747312782 nanos:296217347}" May 15 12:40:08.336751 containerd[1571]: time="2025-05-15T12:40:08.336604462Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"af1b67e21217494b7b74967732fe6f0b7742a0720dd860b6013140e03dfbdd05\" pid:7349 exited_at:{seconds:1747312808 nanos:336273139}" May 15 12:40:09.249741 containerd[1571]: 
time="2025-05-15T12:40:09.227163556Z" level=warning msg="container event discarded" container=93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1 type=CONTAINER_CREATED_EVENT May 15 12:40:09.284010 containerd[1571]: time="2025-05-15T12:40:09.283949283Z" level=warning msg="container event discarded" container=93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1 type=CONTAINER_STARTED_EVENT May 15 12:40:09.284010 containerd[1571]: time="2025-05-15T12:40:09.283993074Z" level=warning msg="container event discarded" container=917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b type=CONTAINER_CREATED_EVENT May 15 12:40:09.284010 containerd[1571]: time="2025-05-15T12:40:09.284005147Z" level=warning msg="container event discarded" container=917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b type=CONTAINER_STARTED_EVENT May 15 12:40:09.284206 containerd[1571]: time="2025-05-15T12:40:09.284018663Z" level=warning msg="container event discarded" container=5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529 type=CONTAINER_CREATED_EVENT May 15 12:40:09.284206 containerd[1571]: time="2025-05-15T12:40:09.284038760Z" level=warning msg="container event discarded" container=5b91d80da7cbed93c6f129399f80eb62aeede167c161a123310fe28727169529 type=CONTAINER_STARTED_EVENT May 15 12:40:09.284206 containerd[1571]: time="2025-05-15T12:40:09.284046414Z" level=warning msg="container event discarded" container=e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89 type=CONTAINER_CREATED_EVENT May 15 12:40:09.284206 containerd[1571]: time="2025-05-15T12:40:09.284052917Z" level=warning msg="container event discarded" container=42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91 type=CONTAINER_CREATED_EVENT May 15 12:40:09.284206 containerd[1571]: time="2025-05-15T12:40:09.284058447Z" level=warning msg="container event discarded" container=cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f 
type=CONTAINER_CREATED_EVENT May 15 12:40:09.366578 containerd[1571]: time="2025-05-15T12:40:09.366488081Z" level=warning msg="container event discarded" container=cab22f6292af38347de93fe153d8da7c2930a591f769935458cd4c429f3c590f type=CONTAINER_STARTED_EVENT May 15 12:40:09.395131 containerd[1571]: time="2025-05-15T12:40:09.395080053Z" level=warning msg="container event discarded" container=42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91 type=CONTAINER_STARTED_EVENT May 15 12:40:09.395131 containerd[1571]: time="2025-05-15T12:40:09.395119797Z" level=warning msg="container event discarded" container=e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89 type=CONTAINER_STARTED_EVENT May 15 12:40:12.295023 containerd[1571]: time="2025-05-15T12:40:12.294942947Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"9bb9f8d24bbc8a935552cc270d74ba50930e9aedfff91c4615be97cb6b0c1748\" pid:7372 exited_at:{seconds:1747312812 nanos:294455011}" May 15 12:40:17.136710 systemd[1]: Started sshd@13-37.27.185.109:22-219.127.7.87:59984.service - OpenSSH per-connection server daemon (219.127.7.87:59984). May 15 12:40:18.877432 sshd[7385]: Invalid user admin from 219.127.7.87 port 59984 May 15 12:40:19.187544 sshd[7385]: Received disconnect from 219.127.7.87 port 59984:11: Bye Bye [preauth] May 15 12:40:19.187544 sshd[7385]: Disconnected from invalid user admin 219.127.7.87 port 59984 [preauth] May 15 12:40:19.190240 systemd[1]: sshd@13-37.27.185.109:22-219.127.7.87:59984.service: Deactivated successfully. 
May 15 12:40:29.003695 containerd[1571]: time="2025-05-15T12:40:29.003557295Z" level=warning msg="container event discarded" container=0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134 type=CONTAINER_CREATED_EVENT May 15 12:40:29.003695 containerd[1571]: time="2025-05-15T12:40:29.003653275Z" level=warning msg="container event discarded" container=0716c37f90ed94c8e74b1390bb7423b13593d84cbf42b5c0f53b8e2fb7ea8134 type=CONTAINER_STARTED_EVENT May 15 12:40:29.033031 containerd[1571]: time="2025-05-15T12:40:29.032921055Z" level=warning msg="container event discarded" container=dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009 type=CONTAINER_CREATED_EVENT May 15 12:40:29.084300 containerd[1571]: time="2025-05-15T12:40:29.084246553Z" level=warning msg="container event discarded" container=dfaffc62d6317420a389133f9bbb97ec14f878f35708e9c575a358bab4bb6009 type=CONTAINER_STARTED_EVENT May 15 12:40:30.922042 containerd[1571]: time="2025-05-15T12:40:30.921981672Z" level=warning msg="container event discarded" container=e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070 type=CONTAINER_CREATED_EVENT May 15 12:40:30.922042 containerd[1571]: time="2025-05-15T12:40:30.922023881Z" level=warning msg="container event discarded" container=e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070 type=CONTAINER_STARTED_EVENT May 15 12:40:33.297103 containerd[1571]: time="2025-05-15T12:40:33.297029618Z" level=warning msg="container event discarded" container=216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7 type=CONTAINER_CREATED_EVENT May 15 12:40:33.352367 containerd[1571]: time="2025-05-15T12:40:33.352293347Z" level=warning msg="container event discarded" container=216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7 type=CONTAINER_STARTED_EVENT May 15 12:40:36.807763 containerd[1571]: time="2025-05-15T12:40:36.807687119Z" level=warning msg="container event discarded" 
container=fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d type=CONTAINER_CREATED_EVENT May 15 12:40:36.807763 containerd[1571]: time="2025-05-15T12:40:36.807749997Z" level=warning msg="container event discarded" container=fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d type=CONTAINER_STARTED_EVENT May 15 12:40:36.894183 containerd[1571]: time="2025-05-15T12:40:36.894098215Z" level=warning msg="container event discarded" container=707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7 type=CONTAINER_CREATED_EVENT May 15 12:40:36.894183 containerd[1571]: time="2025-05-15T12:40:36.894156725Z" level=warning msg="container event discarded" container=707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7 type=CONTAINER_STARTED_EVENT May 15 12:40:38.329878 containerd[1571]: time="2025-05-15T12:40:38.329837065Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"5a9ef9bd4c83ed03db60e085daf8928068d4c1fc47bbb9f7bae92fea6768c833\" pid:7403 exited_at:{seconds:1747312838 nanos:329510172}" May 15 12:40:42.307726 containerd[1571]: time="2025-05-15T12:40:42.307491915Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"3689e804f09a39dec57e04112b06624df25581f856de50e3061764ee80a1d6ac\" pid:7440 exited_at:{seconds:1747312842 nanos:306422978}" May 15 12:40:42.314469 containerd[1571]: time="2025-05-15T12:40:42.314414137Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"aa76dbc4ebf2b5d96ac2857150b7d2307a4178f70d723715252457f7eee018f6\" pid:7439 exited_at:{seconds:1747312842 nanos:310828156}" May 15 12:40:49.177165 containerd[1571]: time="2025-05-15T12:40:49.177071409Z" level=warning msg="container event discarded" 
container=23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369 type=CONTAINER_CREATED_EVENT May 15 12:40:49.243006 containerd[1571]: time="2025-05-15T12:40:49.242932146Z" level=warning msg="container event discarded" container=23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369 type=CONTAINER_STARTED_EVENT May 15 12:40:54.549402 containerd[1571]: time="2025-05-15T12:40:54.549294476Z" level=warning msg="container event discarded" container=618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483 type=CONTAINER_CREATED_EVENT May 15 12:40:54.644702 containerd[1571]: time="2025-05-15T12:40:54.644631173Z" level=warning msg="container event discarded" container=618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483 type=CONTAINER_STARTED_EVENT May 15 12:40:54.764455 containerd[1571]: time="2025-05-15T12:40:54.764300919Z" level=warning msg="container event discarded" container=618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483 type=CONTAINER_STOPPED_EVENT May 15 12:41:02.721513 containerd[1571]: time="2025-05-15T12:41:02.721320660Z" level=warning msg="container event discarded" container=8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011 type=CONTAINER_CREATED_EVENT May 15 12:41:02.781940 containerd[1571]: time="2025-05-15T12:41:02.781835995Z" level=warning msg="container event discarded" container=8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011 type=CONTAINER_STARTED_EVENT May 15 12:41:03.220101 containerd[1571]: time="2025-05-15T12:41:03.220050091Z" level=warning msg="container event discarded" container=8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011 type=CONTAINER_STOPPED_EVENT May 15 12:41:08.355252 containerd[1571]: time="2025-05-15T12:41:08.355161460Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"70dfa6b552224dca85605b388459889daedd0982f5e628f9f47336dd2c3caec9\" 
pid:7473 exited_at:{seconds:1747312868 nanos:354798127}"
May 15 12:41:08.564773 systemd[1]: Started sshd@14-37.27.185.109:22-219.127.7.87:41819.service - OpenSSH per-connection server daemon (219.127.7.87:41819).
May 15 12:41:10.293949 sshd[7486]: Invalid user daniel from 219.127.7.87 port 41819
May 15 12:41:10.643664 sshd[7486]: Received disconnect from 219.127.7.87 port 41819:11: Bye Bye [preauth]
May 15 12:41:10.643664 sshd[7486]: Disconnected from invalid user daniel 219.127.7.87 port 41819 [preauth]
May 15 12:41:10.646004 systemd[1]: sshd@14-37.27.185.109:22-219.127.7.87:41819.service: Deactivated successfully.
May 15 12:41:11.152412 containerd[1571]: time="2025-05-15T12:41:11.152291790Z" level=warning msg="container event discarded" container=5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5 type=CONTAINER_CREATED_EVENT
May 15 12:41:11.285686 containerd[1571]: time="2025-05-15T12:41:11.285607185Z" level=warning msg="container event discarded" container=5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5 type=CONTAINER_STARTED_EVENT
May 15 12:41:12.290973 containerd[1571]: time="2025-05-15T12:41:12.290923194Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"376f861d588555d12a09653014dc6a8a97f8f2c44041fb4c9a70d27f07a0ac7f\" pid:7505 exited_at:{seconds:1747312872 nanos:290716286}"
May 15 12:41:14.582087 containerd[1571]: time="2025-05-15T12:41:14.581982793Z" level=warning msg="container event discarded" container=add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 type=CONTAINER_CREATED_EVENT
May 15 12:41:14.582087 containerd[1571]: time="2025-05-15T12:41:14.582065168Z" level=warning msg="container event discarded" container=add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 type=CONTAINER_STARTED_EVENT
May 15 12:41:15.345772 containerd[1571]: time="2025-05-15T12:41:15.345711013Z" level=warning msg="container event discarded" container=0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7 type=CONTAINER_CREATED_EVENT
May 15 12:41:15.345772 containerd[1571]: time="2025-05-15T12:41:15.345751458Z" level=warning msg="container event discarded" container=0d6ae8ceda8de3ae0ff9f586e52d0189a81cb05ed38f9c3a625d84a512213ba7 type=CONTAINER_STARTED_EVENT
May 15 12:41:16.357777 containerd[1571]: time="2025-05-15T12:41:16.357676239Z" level=warning msg="container event discarded" container=fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83 type=CONTAINER_CREATED_EVENT
May 15 12:41:16.357777 containerd[1571]: time="2025-05-15T12:41:16.357741301Z" level=warning msg="container event discarded" container=fd6a4542a2ce7eeec767ab419d676324a7c728ab2a544d92dc855d0ab870ab83 type=CONTAINER_STARTED_EVENT
May 15 12:41:17.435193 containerd[1571]: time="2025-05-15T12:41:17.435045195Z" level=warning msg="container event discarded" container=9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 type=CONTAINER_CREATED_EVENT
May 15 12:41:17.435193 containerd[1571]: time="2025-05-15T12:41:17.435128852Z" level=warning msg="container event discarded" container=9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 type=CONTAINER_STARTED_EVENT
May 15 12:41:17.448505 containerd[1571]: time="2025-05-15T12:41:17.448388156Z" level=warning msg="container event discarded" container=45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba type=CONTAINER_CREATED_EVENT
May 15 12:41:17.448505 containerd[1571]: time="2025-05-15T12:41:17.448438521Z" level=warning msg="container event discarded" container=45ceec2235dc392448bc8c5e9de9582e8ccc155b844c8b3c91e25237383122ba type=CONTAINER_STARTED_EVENT
May 15 12:41:17.470979 containerd[1571]: time="2025-05-15T12:41:17.470914193Z" level=warning msg="container event discarded" container=8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390 type=CONTAINER_CREATED_EVENT
May 15 12:41:17.515409 containerd[1571]: time="2025-05-15T12:41:17.515307560Z" level=warning msg="container event discarded" container=8ba3dd7cfc8f743257c0c14823e36d88034d782508bbcf011a2434c7e1569390 type=CONTAINER_STARTED_EVENT
May 15 12:41:18.574546 containerd[1571]: time="2025-05-15T12:41:18.574476659Z" level=warning msg="container event discarded" container=565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 type=CONTAINER_CREATED_EVENT
May 15 12:41:18.574546 containerd[1571]: time="2025-05-15T12:41:18.574516263Z" level=warning msg="container event discarded" container=565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 type=CONTAINER_STARTED_EVENT
May 15 12:41:18.587738 containerd[1571]: time="2025-05-15T12:41:18.587703463Z" level=warning msg="container event discarded" container=5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510 type=CONTAINER_CREATED_EVENT
May 15 12:41:18.587738 containerd[1571]: time="2025-05-15T12:41:18.587734201Z" level=warning msg="container event discarded" container=5f5396d674935327778079efcec428be923c236a0016b52a40c1d7a43854a510 type=CONTAINER_STARTED_EVENT
May 15 12:41:18.608112 containerd[1571]: time="2025-05-15T12:41:18.608050791Z" level=warning msg="container event discarded" container=ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7 type=CONTAINER_CREATED_EVENT
May 15 12:41:18.693527 containerd[1571]: time="2025-05-15T12:41:18.693432054Z" level=warning msg="container event discarded" container=ec558d8d6838d3bee06266635011de3bd373fbd79b7e9f7c2fac7a2ce6d248b7 type=CONTAINER_STARTED_EVENT
May 15 12:41:19.250775 containerd[1571]: time="2025-05-15T12:41:19.250394194Z" level=warning msg="container event discarded" container=b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6 type=CONTAINER_CREATED_EVENT
May 15 12:41:19.336199 containerd[1571]: time="2025-05-15T12:41:19.336098624Z" level=warning msg="container event discarded" container=b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6 type=CONTAINER_STARTED_EVENT
May 15 12:41:21.216863 containerd[1571]: time="2025-05-15T12:41:21.216753639Z" level=warning msg="container event discarded" container=84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311 type=CONTAINER_CREATED_EVENT
May 15 12:41:21.287282 containerd[1571]: time="2025-05-15T12:41:21.287206856Z" level=warning msg="container event discarded" container=84c55632e59fcca1e774ff76084f18f5c07e68037e21fefb706fb4005253e311 type=CONTAINER_STARTED_EVENT
May 15 12:41:28.822699 containerd[1571]: time="2025-05-15T12:41:28.822628099Z" level=warning msg="container event discarded" container=48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3 type=CONTAINER_CREATED_EVENT
May 15 12:41:28.886993 containerd[1571]: time="2025-05-15T12:41:28.886918909Z" level=warning msg="container event discarded" container=48ec41ecac88c62f44bb8c074081b02de19c444a84f81b101aa9b1763cd905c3 type=CONTAINER_STARTED_EVENT
May 15 12:41:29.868016 containerd[1571]: time="2025-05-15T12:41:29.867937996Z" level=warning msg="container event discarded" container=c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef type=CONTAINER_CREATED_EVENT
May 15 12:41:29.950461 containerd[1571]: time="2025-05-15T12:41:29.950380854Z" level=warning msg="container event discarded" container=c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef type=CONTAINER_STARTED_EVENT
May 15 12:41:30.290965 containerd[1571]: time="2025-05-15T12:41:30.290875249Z" level=warning msg="container event discarded" container=78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc type=CONTAINER_CREATED_EVENT
May 15 12:41:30.290965 containerd[1571]: time="2025-05-15T12:41:30.290917579Z" level=warning msg="container event discarded" container=78d1b846842ec03352857672570f1ef471c8910fd75b59bd8e3789f51ba233cc type=CONTAINER_STARTED_EVENT
May 15 12:41:30.307379 containerd[1571]: time="2025-05-15T12:41:30.307212832Z" level=warning msg="container event discarded" container=517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3 type=CONTAINER_CREATED_EVENT
May 15 12:41:30.383675 containerd[1571]: time="2025-05-15T12:41:30.383596668Z" level=warning msg="container event discarded" container=517361bf533a57073f6a17e9882c97662984e2f9926a79dc1a1fb4c64c4dfdc3 type=CONTAINER_STARTED_EVENT
May 15 12:41:30.672999 containerd[1571]: time="2025-05-15T12:41:30.672928223Z" level=warning msg="container event discarded" container=c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef type=CONTAINER_STOPPED_EVENT
May 15 12:41:30.889621 containerd[1571]: time="2025-05-15T12:41:30.889560255Z" level=warning msg="container event discarded" container=9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 type=CONTAINER_STOPPED_EVENT
May 15 12:41:31.533536 containerd[1571]: time="2025-05-15T12:41:31.533464318Z" level=warning msg="container event discarded" container=c96a5fa932ad77fd9d876a14ba22776a54f71eba8c0147d46b9090c180d858ef type=CONTAINER_DELETED_EVENT
May 15 12:41:31.654833 containerd[1571]: time="2025-05-15T12:41:31.654758893Z" level=warning msg="container event discarded" container=b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6 type=CONTAINER_STOPPED_EVENT
May 15 12:41:31.702118 containerd[1571]: time="2025-05-15T12:41:31.702055908Z" level=warning msg="container event discarded" container=add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 type=CONTAINER_STOPPED_EVENT
May 15 12:41:32.531164 containerd[1571]: time="2025-05-15T12:41:32.531084882Z" level=warning msg="container event discarded" container=b1df85dab5bbfa1ae8016749b81b9ae9771c11dcce6ede09ebdd9dbebc15b2a6 type=CONTAINER_DELETED_EVENT
May 15 12:41:34.312924 update_engine[1557]: I20250515 12:41:34.312852 1557 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
May 15 12:41:34.312924 update_engine[1557]: I20250515 12:41:34.312909 1557 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
May 15 12:41:34.315005 update_engine[1557]: I20250515 12:41:34.314968 1557 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
May 15 12:41:34.316110 update_engine[1557]: I20250515 12:41:34.316066 1557 omaha_request_params.cc:62] Current group set to developer
May 15 12:41:34.317390 update_engine[1557]: I20250515 12:41:34.316210 1557 update_attempter.cc:499] Already updated boot flags. Skipping.
May 15 12:41:34.317390 update_engine[1557]: I20250515 12:41:34.316224 1557 update_attempter.cc:643] Scheduling an action processor start.
May 15 12:41:34.317390 update_engine[1557]: I20250515 12:41:34.316242 1557 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 15 12:41:34.317390 update_engine[1557]: I20250515 12:41:34.316294 1557 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
May 15 12:41:34.317390 update_engine[1557]: I20250515 12:41:34.316375 1557 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 15 12:41:34.317390 update_engine[1557]: I20250515 12:41:34.316385 1557 omaha_request_action.cc:272] Request:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]:
May 15 12:41:34.317390 update_engine[1557]: I20250515 12:41:34.316392 1557 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 12:41:34.330162 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
May 15 12:41:34.333146 update_engine[1557]: I20250515 12:41:34.333106 1557 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 12:41:34.333510 update_engine[1557]: I20250515 12:41:34.333467 1557 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 12:41:34.336462 update_engine[1557]: E20250515 12:41:34.336421 1557 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 12:41:34.336540 update_engine[1557]: I20250515 12:41:34.336488 1557 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
May 15 12:41:36.523441 containerd[1571]: time="2025-05-15T12:41:36.523368948Z" level=warning msg="container event discarded" container=dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818 type=CONTAINER_CREATED_EVENT
May 15 12:41:36.618748 containerd[1571]: time="2025-05-15T12:41:36.618668335Z" level=warning msg="container event discarded" container=dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818 type=CONTAINER_STARTED_EVENT
May 15 12:41:37.864133 containerd[1571]: time="2025-05-15T12:41:37.864062457Z" level=warning msg="container event discarded" container=5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5 type=CONTAINER_STOPPED_EVENT
May 15 12:41:37.916761 containerd[1571]: time="2025-05-15T12:41:37.916696871Z" level=warning msg="container event discarded" container=707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7 type=CONTAINER_STOPPED_EVENT
May 15 12:41:38.343950 containerd[1571]: time="2025-05-15T12:41:38.343900593Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"bd1cf8afb6592ed0dc88f0c3112b2140d047731174b8fae6947818fc5ad4e960\" pid:7548 exited_at:{seconds:1747312898 nanos:343584600}"
May 15 12:41:38.363177 containerd[1571]: time="2025-05-15T12:41:38.363086201Z" level=warning msg="container event discarded" container=27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0 type=CONTAINER_CREATED_EVENT
May 15 12:41:38.363177 containerd[1571]: time="2025-05-15T12:41:38.363131115Z" level=warning msg="container event discarded" container=27ed20acc5e4bd6d5fbcbc749f5df005727cc8ec1b0d82a21f16827d10e0a1e0 type=CONTAINER_STARTED_EVENT
May 15 12:41:38.380470 containerd[1571]: time="2025-05-15T12:41:38.380393252Z" level=warning msg="container event discarded" container=efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47 type=CONTAINER_CREATED_EVENT
May 15 12:41:38.438711 containerd[1571]: time="2025-05-15T12:41:38.438636223Z" level=warning msg="container event discarded" container=efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47 type=CONTAINER_STARTED_EVENT
May 15 12:41:38.520099 containerd[1571]: time="2025-05-15T12:41:38.520018870Z" level=warning msg="container event discarded" container=efa2e9fff9d26d83476741361b993b84d443e326cd115ff4ee764edd3904cc47 type=CONTAINER_STOPPED_EVENT
May 15 12:41:38.596460 containerd[1571]: time="2025-05-15T12:41:38.596291398Z" level=warning msg="container event discarded" container=5a6014b710a049ccfaeccbe7267e8e774877b81b9c78cc18d2b3f3d8990fbaf5 type=CONTAINER_DELETED_EVENT
May 15 12:41:38.610839 containerd[1571]: time="2025-05-15T12:41:38.610725809Z" level=warning msg="container event discarded" container=8224fd3bf6efd144d236bd2174a49d4897766eb457260b15646427dbef6b9011 type=CONTAINER_DELETED_EVENT
May 15 12:41:38.610839 containerd[1571]: time="2025-05-15T12:41:38.610785080Z" level=warning msg="container event discarded" container=751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8 type=CONTAINER_CREATED_EVENT
May 15 12:41:38.631297 containerd[1571]: time="2025-05-15T12:41:38.631214572Z" level=warning msg="container event discarded" container=618412cae37ee73f537ff619135965ecb8f483b34411e34fec694e7f48c64483 type=CONTAINER_DELETED_EVENT
May 15 12:41:38.685697 containerd[1571]: time="2025-05-15T12:41:38.685636691Z" level=warning msg="container event discarded" container=751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8 type=CONTAINER_STARTED_EVENT
May 15 12:41:39.659192 containerd[1571]: time="2025-05-15T12:41:39.659116447Z" level=warning msg="container event discarded" container=23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369 type=CONTAINER_STOPPED_EVENT
May 15 12:41:39.673469 containerd[1571]: time="2025-05-15T12:41:39.673388993Z" level=warning msg="container event discarded" container=dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818 type=CONTAINER_STOPPED_EVENT
May 15 12:41:39.720802 containerd[1571]: time="2025-05-15T12:41:39.720728408Z" level=warning msg="container event discarded" container=fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d type=CONTAINER_STOPPED_EVENT
May 15 12:41:39.744064 containerd[1571]: time="2025-05-15T12:41:39.743987751Z" level=warning msg="container event discarded" container=565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 type=CONTAINER_STOPPED_EVENT
May 15 12:41:39.808467 containerd[1571]: time="2025-05-15T12:41:39.808409087Z" level=warning msg="container event discarded" container=751fb54da8312eb3d058fcbb0a2841004712c966bf61ae3c6c799067795cb4f8 type=CONTAINER_STOPPED_EVENT
May 15 12:41:40.610752 containerd[1571]: time="2025-05-15T12:41:40.610659831Z" level=warning msg="container event discarded" container=23933857f1169d98c8240e1a804eaa00fe5361f65e4d2e0e5cd48d201f401369 type=CONTAINER_DELETED_EVENT
May 15 12:41:40.638002 containerd[1571]: time="2025-05-15T12:41:40.637930713Z" level=warning msg="container event discarded" container=dfe13dbb5798cdb52a3ec215905fed410779cbcf9497fddae52b1544acd6c818 type=CONTAINER_DELETED_EVENT
May 15 12:41:40.666429 containerd[1571]: time="2025-05-15T12:41:40.666355651Z" level=warning msg="container event discarded" container=ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718 type=CONTAINER_CREATED_EVENT
May 15 12:41:40.742867 containerd[1571]: time="2025-05-15T12:41:40.742735741Z" level=warning msg="container event discarded" container=ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718 type=CONTAINER_STARTED_EVENT
May 15 12:41:41.377413 containerd[1571]: time="2025-05-15T12:41:41.377316114Z" level=warning msg="container event discarded" container=4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68 type=CONTAINER_CREATED_EVENT
May 15 12:41:41.377413 containerd[1571]: time="2025-05-15T12:41:41.377407336Z" level=warning msg="container event discarded" container=4513ae86be14adc72445121ef325df9df21717b2fae496f7aafad7f60a28af68 type=CONTAINER_STARTED_EVENT
May 15 12:41:41.392690 containerd[1571]: time="2025-05-15T12:41:41.392634403Z" level=warning msg="container event discarded" container=9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1 type=CONTAINER_CREATED_EVENT
May 15 12:41:41.457151 containerd[1571]: time="2025-05-15T12:41:41.457038146Z" level=warning msg="container event discarded" container=9198737416f6a4ae71703954d84c90b6a7c29c9855c2e1909002ba343819c3b1 type=CONTAINER_STARTED_EVENT
May 15 12:41:42.306221 containerd[1571]: time="2025-05-15T12:41:42.306160591Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"21c94d283bf8fcb22697d294216053e45ace1c22d12c372516deec61821402e3\" pid:7586 exited_at:{seconds:1747312902 nanos:305756180}"
May 15 12:41:42.306763 containerd[1571]: time="2025-05-15T12:41:42.306645830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"763b44194c4cb75d967c0625b75cd5340c40abc6d3038842a121d9fd5ebce638\" pid:7588 exited_at:{seconds:1747312902 nanos:306478697}"
May 15 12:41:42.510992 containerd[1571]: time="2025-05-15T12:41:42.510888782Z" level=warning msg="container event discarded" container=47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b type=CONTAINER_CREATED_EVENT
May 15 12:41:42.510992 containerd[1571]: time="2025-05-15T12:41:42.510974052Z" level=warning msg="container event discarded" container=47fce455f6b4d1eb01788f7f8143ec8f9e0c30e4da18be5547c8e74c1251cb2b type=CONTAINER_STARTED_EVENT
May 15 12:41:42.530290 containerd[1571]: time="2025-05-15T12:41:42.530242204Z" level=warning msg="container event discarded" container=5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119 type=CONTAINER_CREATED_EVENT
May 15 12:41:42.594806 containerd[1571]: time="2025-05-15T12:41:42.594638793Z" level=warning msg="container event discarded" container=5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119 type=CONTAINER_STARTED_EVENT
May 15 12:41:44.167930 update_engine[1557]: I20250515 12:41:44.167848 1557 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 12:41:44.168387 update_engine[1557]: I20250515 12:41:44.168053 1557 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 12:41:44.168387 update_engine[1557]: I20250515 12:41:44.168303 1557 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 12:41:44.168780 update_engine[1557]: E20250515 12:41:44.168744 1557 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 12:41:44.168860 update_engine[1557]: I20250515 12:41:44.168823 1557 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
May 15 12:41:48.246721 containerd[1571]: time="2025-05-15T12:41:48.246584869Z" level=warning msg="container event discarded" container=6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd type=CONTAINER_CREATED_EVENT
May 15 12:41:48.322569 containerd[1571]: time="2025-05-15T12:41:48.322519583Z" level=warning msg="container event discarded" container=6364ea56f4b1c721c2cd1f73b6b3a3c081dc1c73326db48b5eafd4276a2db7fd type=CONTAINER_STARTED_EVENT
May 15 12:41:54.168217 update_engine[1557]: I20250515 12:41:54.168098 1557 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 12:41:54.168798 update_engine[1557]: I20250515 12:41:54.168633 1557 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 12:41:54.169162 update_engine[1557]: I20250515 12:41:54.169108 1557 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 12:41:54.169521 update_engine[1557]: E20250515 12:41:54.169466 1557 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 12:41:54.169582 update_engine[1557]: I20250515 12:41:54.169546 1557 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
May 15 12:41:57.261933 systemd[1]: Started sshd@15-37.27.185.109:22-219.127.7.87:51270.service - OpenSSH per-connection server daemon (219.127.7.87:51270).
May 15 12:42:00.079541 systemd[1]: Started sshd@16-37.27.185.109:22-103.171.85.115:41922.service - OpenSSH per-connection server daemon (103.171.85.115:41922).
May 15 12:42:00.626616 sshd[7606]: Invalid user applmgr from 219.127.7.87 port 51270
May 15 12:42:01.016321 sshd[7606]: Received disconnect from 219.127.7.87 port 51270:11: Bye Bye [preauth]
May 15 12:42:01.016321 sshd[7606]: Disconnected from invalid user applmgr 219.127.7.87 port 51270 [preauth]
May 15 12:42:01.019254 systemd[1]: sshd@15-37.27.185.109:22-219.127.7.87:51270.service: Deactivated successfully.
May 15 12:42:01.165729 sshd[7611]: Invalid user gaurav from 103.171.85.115 port 41922
May 15 12:42:01.364683 sshd[7611]: Received disconnect from 103.171.85.115 port 41922:11: Bye Bye [preauth]
May 15 12:42:01.364683 sshd[7611]: Disconnected from invalid user gaurav 103.171.85.115 port 41922 [preauth]
May 15 12:42:01.367969 systemd[1]: sshd@16-37.27.185.109:22-103.171.85.115:41922.service: Deactivated successfully.
May 15 12:42:04.168174 update_engine[1557]: I20250515 12:42:04.168052 1557 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 12:42:04.168828 update_engine[1557]: I20250515 12:42:04.168480 1557 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 12:42:04.169038 update_engine[1557]: I20250515 12:42:04.168964 1557 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 12:42:04.169416 update_engine[1557]: E20250515 12:42:04.169299 1557 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 12:42:04.169416 update_engine[1557]: I20250515 12:42:04.169428 1557 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 15 12:42:04.169416 update_engine[1557]: I20250515 12:42:04.169445 1557 omaha_request_action.cc:617] Omaha request response:
May 15 12:42:04.169738 update_engine[1557]: E20250515 12:42:04.169538 1557 omaha_request_action.cc:636] Omaha request network transfer failed.
May 15 12:42:04.169738 update_engine[1557]: I20250515 12:42:04.169571 1557 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
May 15 12:42:04.169738 update_engine[1557]: I20250515 12:42:04.169579 1557 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 15 12:42:04.169738 update_engine[1557]: I20250515 12:42:04.169585 1557 update_attempter.cc:306] Processing Done.
May 15 12:42:04.169738 update_engine[1557]: E20250515 12:42:04.169603 1557 update_attempter.cc:619] Update failed.
May 15 12:42:04.169738 update_engine[1557]: I20250515 12:42:04.169612 1557 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
May 15 12:42:04.169738 update_engine[1557]: I20250515 12:42:04.169620 1557 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
May 15 12:42:04.169738 update_engine[1557]: I20250515 12:42:04.169625 1557 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
May 15 12:42:04.169738 update_engine[1557]: I20250515 12:42:04.169726 1557 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 15 12:42:04.170902 update_engine[1557]: I20250515 12:42:04.169756 1557 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 15 12:42:04.170902 update_engine[1557]: I20250515 12:42:04.169763 1557 omaha_request_action.cc:272] Request:
May 15 12:42:04.170902 update_engine[1557]:
May 15 12:42:04.170902 update_engine[1557]:
May 15 12:42:04.170902 update_engine[1557]:
May 15 12:42:04.170902 update_engine[1557]:
May 15 12:42:04.170902 update_engine[1557]:
May 15 12:42:04.170902 update_engine[1557]:
May 15 12:42:04.170902 update_engine[1557]: I20250515 12:42:04.169771 1557 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 12:42:04.170902 update_engine[1557]: I20250515 12:42:04.169955 1557 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 12:42:04.170902 update_engine[1557]: I20250515 12:42:04.170521 1557 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 12:42:04.172038 update_engine[1557]: E20250515 12:42:04.170902 1557 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 12:42:04.172038 update_engine[1557]: I20250515 12:42:04.170957 1557 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 15 12:42:04.172038 update_engine[1557]: I20250515 12:42:04.170969 1557 omaha_request_action.cc:617] Omaha request response:
May 15 12:42:04.172038 update_engine[1557]: I20250515 12:42:04.170976 1557 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 15 12:42:04.172038 update_engine[1557]: I20250515 12:42:04.170982 1557 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 15 12:42:04.172038 update_engine[1557]: I20250515 12:42:04.170989 1557 update_attempter.cc:306] Processing Done.
May 15 12:42:04.172038 update_engine[1557]: I20250515 12:42:04.170995 1557 update_attempter.cc:310] Error event sent.
May 15 12:42:04.172038 update_engine[1557]: I20250515 12:42:04.171007 1557 update_check_scheduler.cc:74] Next update check in 49m43s
May 15 12:42:04.172488 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
May 15 12:42:04.172488 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
May 15 12:42:08.355008 containerd[1571]: time="2025-05-15T12:42:08.354934261Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"6976d0a09ffc1f570363e91439c8b9bcd581a223cf82ae5e23128b4acde1ad6b\" pid:7638 exited_at:{seconds:1747312928 nanos:354497761}"
May 15 12:42:12.322565 containerd[1571]: time="2025-05-15T12:42:12.322473687Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"171fe28a9d88dba22b0e8a0e26cb0198e993f4897fe1b1afd8b756f64cfb75a5\" pid:7664 exited_at:{seconds:1747312932 nanos:321196089}"
May 15 12:42:13.479702 containerd[1571]: time="2025-05-15T12:42:13.479624502Z" level=warning msg="container event discarded" container=565bee5be6cfedbedbf315540a5609e8fdbc09831d326a700c22ef4056ea8006 type=CONTAINER_DELETED_EVENT
May 15 12:42:13.479702 containerd[1571]: time="2025-05-15T12:42:13.479693141Z" level=warning msg="container event discarded" container=fbbffb0dc23d24ebe2df35b6c6ffb88a730020f39a5222ec0a817e1b9ddaff3d type=CONTAINER_DELETED_EVENT
May 15 12:42:13.666143 containerd[1571]: time="2025-05-15T12:42:13.666050263Z" level=warning msg="container event discarded" container=9a175aacc0af881c138965f0456266bf47d857e7368b5604adb3a084ba160664 type=CONTAINER_DELETED_EVENT
May 15 12:42:13.680529 containerd[1571]: time="2025-05-15T12:42:13.680469776Z" level=warning msg="container event discarded" container=707fa5f61523b78a5e896fac2d98b2ffbfa13a54d3678c81085d638d8aa179c7 type=CONTAINER_DELETED_EVENT
May 15 12:42:13.836218 containerd[1571]: time="2025-05-15T12:42:13.835963494Z" level=warning msg="container event discarded" container=add69d188f9bae23084128253007ec631373fec15d37a8119e1d7235bb2e89a8 type=CONTAINER_DELETED_EVENT
May 15 12:42:37.292475 systemd[1]: Started sshd@17-37.27.185.109:22-39.109.116.40:42860.service - OpenSSH per-connection server daemon (39.109.116.40:42860).
May 15 12:42:38.335299 containerd[1571]: time="2025-05-15T12:42:38.335244660Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"6d87df6f2b8a4dc0bc87462b0c9106660cb68ba4bfa99d6112bd13fcc91981ed\" pid:7692 exited_at:{seconds:1747312958 nanos:334908058}"
May 15 12:42:39.014722 sshd[7678]: Invalid user fs from 39.109.116.40 port 42860
May 15 12:42:39.344263 sshd[7678]: Received disconnect from 39.109.116.40 port 42860:11: Bye Bye [preauth]
May 15 12:42:39.344263 sshd[7678]: Disconnected from invalid user fs 39.109.116.40 port 42860 [preauth]
May 15 12:42:39.347492 systemd[1]: sshd@17-37.27.185.109:22-39.109.116.40:42860.service: Deactivated successfully.
May 15 12:42:42.301856 containerd[1571]: time="2025-05-15T12:42:42.301395614Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"85e677eff07aafb867b0d2731a260bd50b9992465ca1b8b0abee4965a7ac8e6a\" pid:7731 exited_at:{seconds:1747312962 nanos:301092215}"
May 15 12:42:42.307707 containerd[1571]: time="2025-05-15T12:42:42.307656936Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"59ac9d7994348ae58c755785a99c08284d7d67c3b417ac9d6cdb47fcfe27c5b4\" pid:7728 exited_at:{seconds:1747312962 nanos:305620323}"
May 15 12:42:46.428882 systemd[1]: Started sshd@18-37.27.185.109:22-219.127.7.87:59227.service - OpenSSH per-connection server daemon (219.127.7.87:59227).
May 15 12:42:48.300664 sshd[7749]: Invalid user dev from 219.127.7.87 port 59227
May 15 12:42:48.639788 sshd[7749]: Received disconnect from 219.127.7.87 port 59227:11: Bye Bye [preauth]
May 15 12:42:48.639788 sshd[7749]: Disconnected from invalid user dev 219.127.7.87 port 59227 [preauth]
May 15 12:42:48.641291 systemd[1]: sshd@18-37.27.185.109:22-219.127.7.87:59227.service: Deactivated successfully.
May 15 12:43:08.346372 containerd[1571]: time="2025-05-15T12:43:08.346040545Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"326f1fe548362e7b8b8a00a940e5a2b9b7f67f1dacaf87c0ce010f70fc507dc5\" pid:7779 exited_at:{seconds:1747312988 nanos:345685830}"
May 15 12:43:12.292734 containerd[1571]: time="2025-05-15T12:43:12.292698011Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"8475a956d945d290c7d1cc37f036531e43d99e7276dc1b8b64846794eef4ac34\" pid:7807 exited_at:{seconds:1747312992 nanos:292413026}"
May 15 12:43:17.223573 systemd[1]: Started sshd@19-37.27.185.109:22-107.175.33.240:37922.service - OpenSSH per-connection server daemon (107.175.33.240:37922).
May 15 12:43:17.838803 sshd[7819]: Invalid user gpadmin from 107.175.33.240 port 37922
May 15 12:43:17.944468 sshd[7819]: Received disconnect from 107.175.33.240 port 37922:11: Bye Bye [preauth]
May 15 12:43:17.944468 sshd[7819]: Disconnected from invalid user gpadmin 107.175.33.240 port 37922 [preauth]
May 15 12:43:17.947812 systemd[1]: sshd@19-37.27.185.109:22-107.175.33.240:37922.service: Deactivated successfully.
May 15 12:43:33.214616 systemd[1]: Started sshd@20-37.27.185.109:22-219.127.7.87:38775.service - OpenSSH per-connection server daemon (219.127.7.87:38775).
May 15 12:43:35.089735 sshd[7826]: Invalid user tester from 219.127.7.87 port 38775
May 15 12:43:35.450900 sshd[7826]: Received disconnect from 219.127.7.87 port 38775:11: Bye Bye [preauth]
May 15 12:43:35.450900 sshd[7826]: Disconnected from invalid user tester 219.127.7.87 port 38775 [preauth]
May 15 12:43:35.452982 systemd[1]: sshd@20-37.27.185.109:22-219.127.7.87:38775.service: Deactivated successfully.
May 15 12:43:38.333360 containerd[1571]: time="2025-05-15T12:43:38.333149892Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"3caa3585ef90bdf9eb399970f31e19d02b0735907b1a83b0a97260c22ff80b32\" pid:7843 exited_at:{seconds:1747313018 nanos:332826324}"
May 15 12:43:42.288142 containerd[1571]: time="2025-05-15T12:43:42.288067590Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"a7bdf392a45890206f78828dbd95335a33de29f978cc95443900b5b81ca1d319\" pid:7880 exited_at:{seconds:1747313022 nanos:287396080}"
May 15 12:43:42.292604 containerd[1571]: time="2025-05-15T12:43:42.292587614Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"731f7a24143162f4dc123a1dd50cf5a00fc90117c12e9031e5a757b83f8a4dbd\" pid:7879 exited_at:{seconds:1747313022 nanos:291998468}"
May 15 12:44:06.116904 systemd[1]: Started sshd@21-37.27.185.109:22-49.64.242.249:53106.service - OpenSSH per-connection server daemon (49.64.242.249:53106).
May 15 12:44:08.340182 containerd[1571]: time="2025-05-15T12:44:08.340111456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"b0277c6ce167d1cb240a610fd196d43166e10e31f5b5dfbc14c27c4a30e90795\" pid:7916 exited_at:{seconds:1747313048 nanos:339771268}"
May 15 12:44:08.574978 sshd[7901]: Invalid user testserver from 49.64.242.249 port 53106
May 15 12:44:08.795845 sshd[7901]: Received disconnect from 49.64.242.249 port 53106:11: Bye Bye [preauth]
May 15 12:44:08.795845 sshd[7901]: Disconnected from invalid user testserver 49.64.242.249 port 53106 [preauth]
May 15 12:44:08.798226 systemd[1]: sshd@21-37.27.185.109:22-49.64.242.249:53106.service: Deactivated successfully.
May 15 12:44:12.290208 containerd[1571]: time="2025-05-15T12:44:12.290169506Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"eb2e9bdeb9a27d12b056e0d53b99a905bfd8bd153149f89f7d6e07ebdfcd32de\" pid:7942 exited_at:{seconds:1747313052 nanos:289049544}"
May 15 12:44:22.006538 systemd[1]: Started sshd@22-37.27.185.109:22-219.127.7.87:46014.service - OpenSSH per-connection server daemon (219.127.7.87:46014).
May 15 12:44:24.402203 sshd[7954]: Received disconnect from 219.127.7.87 port 46014:11: Bye Bye [preauth]
May 15 12:44:24.402203 sshd[7954]: Disconnected from authenticating user root 219.127.7.87 port 46014 [preauth]
May 15 12:44:24.404142 systemd[1]: sshd@22-37.27.185.109:22-219.127.7.87:46014.service: Deactivated successfully.
May 15 12:44:25.229726 systemd[1]: Started sshd@23-37.27.185.109:22-161.35.7.113:51830.service - OpenSSH per-connection server daemon (161.35.7.113:51830).
May 15 12:44:25.978542 sshd[7959]: Received disconnect from 161.35.7.113 port 51830:11: Bye Bye [preauth]
May 15 12:44:25.978542 sshd[7959]: Disconnected from authenticating user root 161.35.7.113 port 51830 [preauth]
May 15 12:44:25.980323 systemd[1]: sshd@23-37.27.185.109:22-161.35.7.113:51830.service: Deactivated successfully.
May 15 12:44:38.351064 containerd[1571]: time="2025-05-15T12:44:38.351022077Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"6fd0f5c7e1ff723ad8a27ff71a90884f3d3df72807fa9f4a0050d8d0e5b00f40\" pid:7989 exited_at:{seconds:1747313078 nanos:350449172}"
May 15 12:44:42.298208 containerd[1571]: time="2025-05-15T12:44:42.298168399Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"dd0ef6487ee6174c53c8f748ea60df41213be1b75e12d0d9e9716026f3e99e5c\" pid:8031 exited_at:{seconds:1747313082 nanos:297883755}"
May 15 12:44:42.299382 containerd[1571]: time="2025-05-15T12:44:42.299200105Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"d57dda36f33a24719cb738a49db8e962b5330854417f2ed3fe226b82d62b575c\" pid:8035 exited_at:{seconds:1747313082 nanos:299062407}"
May 15 12:45:08.349132 containerd[1571]: time="2025-05-15T12:45:08.349088519Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"02521e48e5d78e23d944a4daf26f7e7e70e653a75ddf15ac40b209d79f0a2d8b\" pid:8071 exited_at:{seconds:1747313108 nanos:348640008}"
May 15 12:45:12.294419 containerd[1571]: time="2025-05-15T12:45:12.293913467Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"21e3463cd2c6f1bfbab5c89d378166436b8ffff1cd8016d671a4733efa280cda\" pid:8096 exited_at:{seconds:1747313112 nanos:293491505}"
May 15 12:45:14.940049 systemd[1]: Started sshd@24-37.27.185.109:22-219.127.7.87:57440.service - OpenSSH per-connection server daemon (219.127.7.87:57440).
May 15 12:45:16.798274 sshd[8109]: Invalid user majid from 219.127.7.87 port 57440
May 15 12:45:17.153034 sshd[8109]: Received disconnect from 219.127.7.87 port 57440:11: Bye Bye [preauth]
May 15 12:45:17.153034 sshd[8109]: Disconnected from invalid user majid 219.127.7.87 port 57440 [preauth]
May 15 12:45:17.155081 systemd[1]: sshd@24-37.27.185.109:22-219.127.7.87:57440.service: Deactivated successfully.
May 15 12:45:26.655903 systemd[1]: Started sshd@25-37.27.185.109:22-134.209.119.98:59534.service - OpenSSH per-connection server daemon (134.209.119.98:59534).
May 15 12:45:27.297433 sshd[8114]: Invalid user reelftptv from 134.209.119.98 port 59534
May 15 12:45:27.408865 sshd[8114]: Received disconnect from 134.209.119.98 port 59534:11: Bye Bye [preauth]
May 15 12:45:27.408865 sshd[8114]: Disconnected from invalid user reelftptv 134.209.119.98 port 59534 [preauth]
May 15 12:45:27.413041 systemd[1]: sshd@25-37.27.185.109:22-134.209.119.98:59534.service: Deactivated successfully.
May 15 12:45:38.346871 containerd[1571]: time="2025-05-15T12:45:38.346826704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"3ce45f56d417d4796d4035c044d9ab125fb2d9e1a4dbcbf352e7b7ced812aba9\" pid:8134 exited_at:{seconds:1747313138 nanos:346309053}"
May 15 12:45:42.295708 containerd[1571]: time="2025-05-15T12:45:42.295668657Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"7f480dd4e9c868ffe8d993a1c29724e94d1b768cb330d277aa682c4f6672105b\" pid:8171 exited_at:{seconds:1747313142 nanos:295497326}"
May 15 12:45:42.301595 containerd[1571]: time="2025-05-15T12:45:42.301499520Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"486d497b0dd3db7863728b232aab9031169191f5530f479822fe00b1975958eb\" pid:8169 exited_at:{seconds:1747313142 nanos:301275139}"
May 15 12:45:46.396187 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
May 15 12:45:46.442309 systemd-tmpfiles[8188]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 15 12:45:46.442340 systemd-tmpfiles[8188]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 15 12:45:46.442542 systemd-tmpfiles[8188]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 15 12:45:46.443278 systemd-tmpfiles[8188]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 15 12:45:46.444459 systemd-tmpfiles[8188]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 15 12:45:46.444665 systemd-tmpfiles[8188]: ACLs are not supported, ignoring.
May 15 12:45:46.444710 systemd-tmpfiles[8188]: ACLs are not supported, ignoring.
May 15 12:45:46.449143 systemd-tmpfiles[8188]: Detected autofs mount point /boot during canonicalization of boot.
May 15 12:45:46.449156 systemd-tmpfiles[8188]: Skipping /boot
May 15 12:45:46.454962 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
May 15 12:45:46.455290 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
May 15 12:45:46.458871 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
May 15 12:45:47.347466 systemd[1]: Started sshd@26-37.27.185.109:22-43.160.203.139:37560.service - OpenSSH per-connection server daemon (43.160.203.139:37560).
May 15 12:45:48.975856 sshd[8193]: Invalid user ubuntu from 43.160.203.139 port 37560
May 15 12:45:49.297720 sshd[8193]: Received disconnect from 43.160.203.139 port 37560:11: Bye Bye [preauth]
May 15 12:45:49.297720 sshd[8193]: Disconnected from invalid user ubuntu 43.160.203.139 port 37560 [preauth]
May 15 12:45:49.299801 systemd[1]: sshd@26-37.27.185.109:22-43.160.203.139:37560.service: Deactivated successfully.
May 15 12:46:07.857847 systemd[1]: Started sshd@27-37.27.185.109:22-219.127.7.87:43908.service - OpenSSH per-connection server daemon (219.127.7.87:43908).
May 15 12:46:08.356101 containerd[1571]: time="2025-05-15T12:46:08.356039101Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"d63a8133b2df4b337d0c256829ff775c5d314aa903662c32ccb8f2ae96eb99b7\" pid:8231 exited_at:{seconds:1747313168 nanos:355769285}"
May 15 12:46:10.955878 sshd[8213]: Received disconnect from 219.127.7.87 port 43908:11: Bye Bye [preauth]
May 15 12:46:10.955878 sshd[8213]: Disconnected from authenticating user root 219.127.7.87 port 43908 [preauth]
May 15 12:46:10.958467 systemd[1]: sshd@27-37.27.185.109:22-219.127.7.87:43908.service: Deactivated successfully.
May 15 12:46:12.303540 containerd[1571]: time="2025-05-15T12:46:12.303488551Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"482af8bb2a9ac522277ed9bbceba7ee115847b5ffc5b140ded4f9c47505a7cbc\" pid:8264 exited_at:{seconds:1747313172 nanos:303047663}"
May 15 12:46:38.339522 containerd[1571]: time="2025-05-15T12:46:38.339480581Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"85cd1f7926490fe4226ff5958257cd685be1a3e72f7e6e36409955dbba968e94\" pid:8292 exited_at:{seconds:1747313198 nanos:339169378}"
May 15 12:46:42.294966 containerd[1571]: time="2025-05-15T12:46:42.294775875Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"e3c99c5ef6c04c9265f0b74345e3d2e5482bbb7a95f0b5ff3e2c22c453ff056a\" pid:8335 exited_at:{seconds:1747313202 nanos:294644258}"
May 15 12:46:42.295570 containerd[1571]: time="2025-05-15T12:46:42.295551240Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"416e157ed3ff8bcc0da65ec7c11d29445ed249715c124cafa25873f7e4d3f2c9\" pid:8328 exited_at:{seconds:1747313202 nanos:294648005}"
May 15 12:46:53.393069 systemd[1]: Started sshd@28-37.27.185.109:22-161.35.7.113:48246.service - OpenSSH per-connection server daemon (161.35.7.113:48246).
May 15 12:46:54.015882 sshd[8349]: Invalid user raul from 161.35.7.113 port 48246
May 15 12:46:54.126974 sshd[8349]: Received disconnect from 161.35.7.113 port 48246:11: Bye Bye [preauth]
May 15 12:46:54.126974 sshd[8349]: Disconnected from invalid user raul 161.35.7.113 port 48246 [preauth]
May 15 12:46:54.129013 systemd[1]: sshd@28-37.27.185.109:22-161.35.7.113:48246.service: Deactivated successfully.
May 15 12:47:01.723352 systemd[1]: Started sshd@29-37.27.185.109:22-219.127.7.87:55949.service - OpenSSH per-connection server daemon (219.127.7.87:55949).
May 15 12:47:03.930082 sshd[8356]: Received disconnect from 219.127.7.87 port 55949:11: Bye Bye [preauth]
May 15 12:47:03.930082 sshd[8356]: Disconnected from authenticating user root 219.127.7.87 port 55949 [preauth]
May 15 12:47:03.931578 systemd[1]: sshd@29-37.27.185.109:22-219.127.7.87:55949.service: Deactivated successfully.
May 15 12:47:08.342311 containerd[1571]: time="2025-05-15T12:47:08.342219844Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"af669083c846887cf103f9df7a33b14ac189b6cec879b7d92fc0d75cf549dcb8\" pid:8375 exited_at:{seconds:1747313228 nanos:341769910}"
May 15 12:47:12.313094 containerd[1571]: time="2025-05-15T12:47:12.313032430Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"fd8dfba7e9535889a2bb5890050ed844f4ef5b4a3c8ed98ff7d0659734bc7c78\" pid:8399 exited_at:{seconds:1747313232 nanos:312815743}"
May 15 12:47:38.343823 containerd[1571]: time="2025-05-15T12:47:38.343749788Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"a5988c46f267855906315cfb5a8b4c732cdf9d744b95a679990148101fede1b4\" pid:8434 exited_at:{seconds:1747313258 nanos:343453562}"
May 15 12:47:40.079309 systemd[1]: Started sshd@30-37.27.185.109:22-39.109.116.40:39794.service - OpenSSH per-connection server daemon (39.109.116.40:39794).
May 15 12:47:41.801660 sshd[8447]: Invalid user chrism from 39.109.116.40 port 39794
May 15 12:47:42.131550 sshd[8447]: Received disconnect from 39.109.116.40 port 39794:11: Bye Bye [preauth]
May 15 12:47:42.131550 sshd[8447]: Disconnected from invalid user chrism 39.109.116.40 port 39794 [preauth]
May 15 12:47:42.134102 systemd[1]: sshd@30-37.27.185.109:22-39.109.116.40:39794.service: Deactivated successfully.
May 15 12:47:42.292705 containerd[1571]: time="2025-05-15T12:47:42.292641743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"4de1c77c0c9981d5053909742e71a8d43da6421c3039edcfe51779782c5162e5\" pid:8477 exited_at:{seconds:1747313262 nanos:289571651}"
May 15 12:47:42.293491 containerd[1571]: time="2025-05-15T12:47:42.293230838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"955195d58de0f0982818cebc966e23b4479431920d0f9910a8ab29512f7721da\" pid:8482 exited_at:{seconds:1747313262 nanos:292038410}"
May 15 12:47:52.567943 systemd[1]: Started sshd@31-37.27.185.109:22-219.127.7.87:38365.service - OpenSSH per-connection server daemon (219.127.7.87:38365).
May 15 12:47:55.300916 sshd[8515]: Invalid user ftpuser from 219.127.7.87 port 38365
May 15 12:47:55.610803 sshd[8515]: Received disconnect from 219.127.7.87 port 38365:11: Bye Bye [preauth]
May 15 12:47:55.610803 sshd[8515]: Disconnected from invalid user ftpuser 219.127.7.87 port 38365 [preauth]
May 15 12:47:55.613353 systemd[1]: sshd@31-37.27.185.109:22-219.127.7.87:38365.service: Deactivated successfully.
May 15 12:48:08.368054 containerd[1571]: time="2025-05-15T12:48:08.367940649Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"71cc191e7e114f76271520609981a7bda03699904b1c06d1df7c61988d568b02\" pid:8534 exited_at:{seconds:1747313288 nanos:367466949}"
May 15 12:48:12.300179 containerd[1571]: time="2025-05-15T12:48:12.300123242Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"a2721e8933cd85f383d1a74c22585e64d4ea82e85bccd80f53cd792d16029563\" pid:8559 exited_at:{seconds:1747313292 nanos:299597286}"
May 15 12:48:38.367913 containerd[1571]: time="2025-05-15T12:48:38.367756191Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"ad5e026cd40538ba365a90a7515e92cda2f3fa2e93d4d116628d23a65d76af14\" pid:8585 exited_at:{seconds:1747313318 nanos:367262995}"
May 15 12:48:42.300738 containerd[1571]: time="2025-05-15T12:48:42.300699324Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"f8795c7dd63481770b6175dbfa5d86bcb97e2e73d0a98a91c9ea018c4278d840\" pid:8620 exited_at:{seconds:1747313322 nanos:300382269}"
May 15 12:48:42.307077 containerd[1571]: time="2025-05-15T12:48:42.307042990Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"6a3c9c78e498e63e7373bba2c1bf40940758b96d3dce4174bd55f75c6b493086\" pid:8622 exited_at:{seconds:1747313322 nanos:306811796}"
May 15 12:48:43.745486 systemd[1]: Started sshd@32-37.27.185.109:22-219.127.7.87:48175.service - OpenSSH per-connection server daemon (219.127.7.87:48175).
May 15 12:48:45.561735 sshd[8641]: Invalid user nexus from 219.127.7.87 port 48175
May 15 12:48:45.892468 sshd[8641]: Received disconnect from 219.127.7.87 port 48175:11: Bye Bye [preauth]
May 15 12:48:45.892468 sshd[8641]: Disconnected from invalid user nexus 219.127.7.87 port 48175 [preauth]
May 15 12:48:45.895221 systemd[1]: sshd@32-37.27.185.109:22-219.127.7.87:48175.service: Deactivated successfully.
May 15 12:49:05.823873 systemd[1]: Started sshd@33-37.27.185.109:22-49.64.242.249:43764.service - OpenSSH per-connection server daemon (49.64.242.249:43764).
May 15 12:49:08.346247 containerd[1571]: time="2025-05-15T12:49:08.346201861Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"f699e780c458a0d5eea4c3e60e1329f701afa645b30b5486b4edab74e3b0188b\" pid:8661 exited_at:{seconds:1747313348 nanos:345879507}"
May 15 12:49:09.303650 sshd[8648]: Invalid user gameserver from 49.64.242.249 port 43764
May 15 12:49:09.536015 sshd[8648]: Received disconnect from 49.64.242.249 port 43764:11: Bye Bye [preauth]
May 15 12:49:09.536015 sshd[8648]: Disconnected from invalid user gameserver 49.64.242.249 port 43764 [preauth]
May 15 12:49:09.538100 systemd[1]: sshd@33-37.27.185.109:22-49.64.242.249:43764.service: Deactivated successfully.
May 15 12:49:12.288657 containerd[1571]: time="2025-05-15T12:49:12.288479502Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"412e7f238870ddc129392746556d9646f7f3c747811af31c0fc4d70e9dfa1b92\" pid:8686 exited_at:{seconds:1747313352 nanos:288312629}"
May 15 12:49:31.349382 systemd[1]: Started sshd@34-37.27.185.109:22-188.148.148.238:54604.service - OpenSSH per-connection server daemon (188.148.148.238:54604).
May 15 12:49:32.472121 sshd-session[8719]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=188.148.148.238 user=root
May 15 12:49:34.730808 sshd[8717]: PAM: Permission denied for root from 188.148.148.238
May 15 12:49:34.857382 sshd[8717]: Connection closed by authenticating user root 188.148.148.238 port 54604 [preauth]
May 15 12:49:34.860573 systemd[1]: sshd@34-37.27.185.109:22-188.148.148.238:54604.service: Deactivated successfully.
May 15 12:49:38.343362 containerd[1571]: time="2025-05-15T12:49:38.343295627Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"3559a3487baf44befe1220914470323ece367c831807dce1d8d47ca895fa1749\" pid:8734 exited_at:{seconds:1747313378 nanos:342918488}"
May 15 12:49:38.579567 systemd[1]: Started sshd@35-37.27.185.109:22-219.127.7.87:59443.service - OpenSSH per-connection server daemon (219.127.7.87:59443).
May 15 12:49:40.529663 sshd[8747]: Invalid user postgres from 219.127.7.87 port 59443
May 15 12:49:40.857898 sshd[8747]: Received disconnect from 219.127.7.87 port 59443:11: Bye Bye [preauth]
May 15 12:49:40.857898 sshd[8747]: Disconnected from invalid user postgres 219.127.7.87 port 59443 [preauth]
May 15 12:49:40.860235 systemd[1]: sshd@35-37.27.185.109:22-219.127.7.87:59443.service: Deactivated successfully.
May 15 12:49:42.294673 containerd[1571]: time="2025-05-15T12:49:42.294624417Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"563e9c91e2222411128e3f537a75ad81cd55ac149b03fa88d1edef579c5f56af\" pid:8774 exited_at:{seconds:1747313382 nanos:293503072}"
May 15 12:49:42.295565 containerd[1571]: time="2025-05-15T12:49:42.294902488Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"77965f44e8ff4152467641e387c94556d9d4aeb7d8af513aad488323b4a85391\" pid:8777 exited_at:{seconds:1747313382 nanos:293727704}"
May 15 12:49:45.342288 systemd[1]: Started sshd@36-37.27.185.109:22-103.183.75.90:51374.service - OpenSSH per-connection server daemon (103.183.75.90:51374).
May 15 12:49:47.281981 sshd[8793]: Invalid user keyvan from 103.183.75.90 port 51374
May 15 12:49:47.487783 sshd[8793]: Received disconnect from 103.183.75.90 port 51374:11: Bye Bye [preauth]
May 15 12:49:47.487783 sshd[8793]: Disconnected from invalid user keyvan 103.183.75.90 port 51374 [preauth]
May 15 12:49:47.490177 systemd[1]: sshd@36-37.27.185.109:22-103.183.75.90:51374.service: Deactivated successfully.
May 15 12:50:08.340713 containerd[1571]: time="2025-05-15T12:50:08.340651283Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"0657d408b188030086f23a2f06df6890e074f7b33810046ba405c428dbf87324\" pid:8812 exited_at:{seconds:1747313408 nanos:340294925}"
May 15 12:50:12.310368 containerd[1571]: time="2025-05-15T12:50:12.310253198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"9cc5bbb78192e81122690000dd6c2cc780832af9115a5bef726dc4a024d04b9f\" pid:8837 exited_at:{seconds:1747313412 nanos:309510334}"
May 15 12:50:13.432751 systemd[1]: Started sshd@37-37.27.185.109:22-49.64.242.249:56288.service - OpenSSH per-connection server daemon (49.64.242.249:56288).
May 15 12:50:16.454446 sshd[8849]: Invalid user sammy from 49.64.242.249 port 56288
May 15 12:50:16.676660 sshd[8849]: Received disconnect from 49.64.242.249 port 56288:11: Bye Bye [preauth]
May 15 12:50:16.676660 sshd[8849]: Disconnected from invalid user sammy 49.64.242.249 port 56288 [preauth]
May 15 12:50:16.678705 systemd[1]: sshd@37-37.27.185.109:22-49.64.242.249:56288.service: Deactivated successfully.
May 15 12:50:38.344494 containerd[1571]: time="2025-05-15T12:50:38.344394220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"1d33aefc40464080439798f56462dc367ad1abe3f202e6900ffac24e9e8cee1c\" pid:8875 exited_at:{seconds:1747313438 nanos:343907436}"
May 15 12:50:38.626750 systemd[1]: Started sshd@38-37.27.185.109:22-219.127.7.87:44237.service - OpenSSH per-connection server daemon (219.127.7.87:44237).
May 15 12:50:41.414604 sshd[8889]: Received disconnect from 219.127.7.87 port 44237:11: Bye Bye [preauth]
May 15 12:50:41.414604 sshd[8889]: Disconnected from authenticating user root 219.127.7.87 port 44237 [preauth]
May 15 12:50:41.416873 systemd[1]: sshd@38-37.27.185.109:22-219.127.7.87:44237.service: Deactivated successfully.
May 15 12:50:42.295055 containerd[1571]: time="2025-05-15T12:50:42.295001986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"836d4b84a8d5071f17a079363945b2da5dace322b04a25175877d8eec21027b4\" pid:8915 exited_at:{seconds:1747313442 nanos:294806760}"
May 15 12:50:42.301615 containerd[1571]: time="2025-05-15T12:50:42.301569872Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"6ab4f52f3d42bb08136dfd0aedfb1cd1c34af39e5f307dbe9370c747251f4cea\" pid:8925 exited_at:{seconds:1747313442 nanos:301424089}"
May 15 12:50:58.458273 systemd[1]: Started sshd@39-37.27.185.109:22-80.210.52.198:58861.service - OpenSSH per-connection server daemon (80.210.52.198:58861).
May 15 12:50:59.878809 sshd[8950]: Invalid user 123 from 80.210.52.198 port 58861
May 15 12:51:00.210914 sshd-session[8954]: pam_faillock(sshd:auth): User unknown
May 15 12:51:00.215257 sshd[8950]: Postponed keyboard-interactive for invalid user 123 from 80.210.52.198 port 58861 ssh2 [preauth]
May 15 12:51:00.509163 sshd-session[8954]: pam_unix(sshd:auth): check pass; user unknown
May 15 12:51:00.509199 sshd-session[8954]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.210.52.198
May 15 12:51:00.509354 sshd-session[8954]: pam_faillock(sshd:auth): User unknown
May 15 12:51:02.516130 sshd[8950]: PAM: Permission denied for illegal user 123 from 80.210.52.198
May 15 12:51:02.516521 sshd[8950]: Failed keyboard-interactive/pam for invalid user 123 from 80.210.52.198 port 58861 ssh2
May 15 12:51:02.836502 sshd[8950]: Connection closed by invalid user 123 80.210.52.198 port 58861 [preauth]
May 15 12:51:02.838969 systemd[1]: sshd@39-37.27.185.109:22-80.210.52.198:58861.service: Deactivated successfully.
May 15 12:51:08.337091 containerd[1571]: time="2025-05-15T12:51:08.337045732Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"f3c29e14ced06baf9d5313676e808426c8217898f5a8ef0510dc4c1caaabbf86\" pid:8970 exited_at:{seconds:1747313468 nanos:336778542}"
May 15 12:51:12.292781 containerd[1571]: time="2025-05-15T12:51:12.292663110Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"f86735c6e45552d97bc0bb5728075f79f8f549730ebaf57d4474914622a31445\" pid:8999 exited_at:{seconds:1747313472 nanos:292456623}"
May 15 12:51:38.180276 systemd[1]: Started sshd@40-37.27.185.109:22-219.127.7.87:56117.service - OpenSSH per-connection server daemon (219.127.7.87:56117).
May 15 12:51:38.361963 containerd[1571]: time="2025-05-15T12:51:38.361862131Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"329809475bf9c951688b7f37326b6e35f455c9028d8f9a33abf80db7083cf2b1\" pid:9031 exited_at:{seconds:1747313498 nanos:361542942}"
May 15 12:51:39.861988 sshd[9016]: Invalid user www-data from 219.127.7.87 port 56117
May 15 12:51:40.177523 sshd[9016]: Received disconnect from 219.127.7.87 port 56117:11: Bye Bye [preauth]
May 15 12:51:40.177523 sshd[9016]: Disconnected from invalid user www-data 219.127.7.87 port 56117 [preauth]
May 15 12:51:40.181090 systemd[1]: sshd@40-37.27.185.109:22-219.127.7.87:56117.service: Deactivated successfully.
May 15 12:51:42.295142 containerd[1571]: time="2025-05-15T12:51:42.295105387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"b8a5d1f852065098f92e8185617cc2add5230129a03bfc605595fb53f054dede\" pid:9069 exited_at:{seconds:1747313502 nanos:294050948}"
May 15 12:51:42.296680 containerd[1571]: time="2025-05-15T12:51:42.296633535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"ff139ded7c2f2d486690df85df567a329af9a3cb0d6de181c94534e2b8781c02\" pid:9074 exited_at:{seconds:1747313502 nanos:296177529}"
May 15 12:51:44.253117 systemd[1]: Started sshd@41-37.27.185.109:22-118.179.219.137:52384.service - OpenSSH per-connection server daemon (118.179.219.137:52384).
May 15 12:51:45.334899 sshd[9090]: Invalid user saeed from 118.179.219.137 port 52384
May 15 12:51:45.527394 sshd[9090]: Received disconnect from 118.179.219.137 port 52384:11: Bye Bye [preauth]
May 15 12:51:45.527394 sshd[9090]: Disconnected from invalid user saeed 118.179.219.137 port 52384 [preauth]
May 15 12:51:45.529938 systemd[1]: sshd@41-37.27.185.109:22-118.179.219.137:52384.service: Deactivated successfully.
May 15 12:51:50.928283 systemd[1]: Started sshd@42-37.27.185.109:22-161.35.7.113:55102.service - OpenSSH per-connection server daemon (161.35.7.113:55102).
May 15 12:51:51.530726 sshd[9095]: Invalid user zoom from 161.35.7.113 port 55102
May 15 12:51:51.641191 sshd[9095]: Received disconnect from 161.35.7.113 port 55102:11: Bye Bye [preauth]
May 15 12:51:51.641191 sshd[9095]: Disconnected from invalid user zoom 161.35.7.113 port 55102 [preauth]
May 15 12:51:51.643452 systemd[1]: sshd@42-37.27.185.109:22-161.35.7.113:55102.service: Deactivated successfully.
May 15 12:52:08.334748 containerd[1571]: time="2025-05-15T12:52:08.334705555Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"260db0548e816326f669338d1d56aebdfca532acee65ed309bba8ae1c5220fb3\" pid:9114 exited_at:{seconds:1747313528 nanos:334433713}"
May 15 12:52:12.292415 containerd[1571]: time="2025-05-15T12:52:12.292375074Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"2d19b2b2fec3c160d38a3e21bd71c1166ae8f106c551f934e181fba991ec076c\" pid:9138 exited_at:{seconds:1747313532 nanos:291914982}"
May 15 12:52:28.109081 systemd[1]: Started sshd@43-37.27.185.109:22-49.64.242.249:53948.service - OpenSSH per-connection server daemon (49.64.242.249:53948).
May 15 12:52:30.050696 sshd[9162]: Invalid user ming from 49.64.242.249 port 53948
May 15 12:52:30.273768 sshd[9162]: Received disconnect from 49.64.242.249 port 53948:11: Bye Bye [preauth]
May 15 12:52:30.273768 sshd[9162]: Disconnected from invalid user ming 49.64.242.249 port 53948 [preauth]
May 15 12:52:30.276059 systemd[1]: sshd@43-37.27.185.109:22-49.64.242.249:53948.service: Deactivated successfully.
May 15 12:52:33.762178 systemd[1]: Started sshd@44-37.27.185.109:22-43.160.203.139:40688.service - OpenSSH per-connection server daemon (43.160.203.139:40688).
May 15 12:52:34.062525 systemd[1]: Started sshd@45-37.27.185.109:22-219.127.7.87:38836.service - OpenSSH per-connection server daemon (219.127.7.87:38836).
May 15 12:52:35.145164 systemd[1]: Started sshd@46-37.27.185.109:22-39.109.116.40:58700.service - OpenSSH per-connection server daemon (39.109.116.40:58700).
May 15 12:52:35.667038 sshd[9170]: Received disconnect from 43.160.203.139 port 40688:11: Bye Bye [preauth]
May 15 12:52:35.667038 sshd[9170]: Disconnected from authenticating user root 43.160.203.139 port 40688 [preauth]
May 15 12:52:35.669979 systemd[1]: sshd@44-37.27.185.109:22-43.160.203.139:40688.service: Deactivated successfully.
May 15 12:52:36.062152 sshd[9173]: Received disconnect from 219.127.7.87 port 38836:11: Bye Bye [preauth]
May 15 12:52:36.062152 sshd[9173]: Disconnected from authenticating user root 219.127.7.87 port 38836 [preauth]
May 15 12:52:36.064948 systemd[1]: sshd@45-37.27.185.109:22-219.127.7.87:38836.service: Deactivated successfully.
May 15 12:52:36.876478 sshd[9176]: Invalid user camilla from 39.109.116.40 port 58700
May 15 12:52:37.203307 sshd[9176]: Received disconnect from 39.109.116.40 port 58700:11: Bye Bye [preauth]
May 15 12:52:37.203307 sshd[9176]: Disconnected from invalid user camilla 39.109.116.40 port 58700 [preauth]
May 15 12:52:37.205795 systemd[1]: sshd@46-37.27.185.109:22-39.109.116.40:58700.service: Deactivated successfully.
May 15 12:52:38.353956 containerd[1571]: time="2025-05-15T12:52:38.353910420Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"b22cfa137690f93dd2a14fd240b8db7c6c268b63a59ace4dc303edb0546f1b9a\" pid:9197 exited_at:{seconds:1747313558 nanos:349143163}"
May 15 12:52:41.718796 systemd[1]: Started sshd@47-37.27.185.109:22-147.75.109.163:50158.service - OpenSSH per-connection server daemon (147.75.109.163:50158).
May 15 12:52:42.295840 containerd[1571]: time="2025-05-15T12:52:42.295792391Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"be310947a216dd96935afddda570b4892a0fe12c8b086478c59727746cd9188d\" pid:9243 exited_at:{seconds:1747313562 nanos:295482410}"
May 15 12:52:42.298307 containerd[1571]: time="2025-05-15T12:52:42.298285100Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"e860120b9ce17a572b8e99f05b3c2b5766edb649896c8e00c67c173bb3d1435e\" pid:9245 exited_at:{seconds:1747313562 nanos:298118597}"
May 15 12:52:42.711620 sshd[9218]: Accepted publickey for core from 147.75.109.163 port 50158 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:52:42.713340 sshd-session[9218]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:52:42.724078 systemd-logind[1554]: New session 8 of user core.
May 15 12:52:42.732505 systemd[1]: Started session-8.scope - Session 8 of User core.
May 15 12:52:43.850896 sshd[9262]: Connection closed by 147.75.109.163 port 50158
May 15 12:52:43.853117 sshd-session[9218]: pam_unix(sshd:session): session closed for user core
May 15 12:52:43.860993 systemd[1]: sshd@47-37.27.185.109:22-147.75.109.163:50158.service: Deactivated successfully.
May 15 12:52:43.863970 systemd[1]: session-8.scope: Deactivated successfully.
May 15 12:52:43.868363 systemd-logind[1554]: Session 8 logged out. Waiting for processes to exit.
May 15 12:52:43.870712 systemd-logind[1554]: Removed session 8.
May 15 12:52:49.020247 systemd[1]: Started sshd@48-37.27.185.109:22-147.75.109.163:40414.service - OpenSSH per-connection server daemon (147.75.109.163:40414).
May 15 12:52:50.036253 sshd[9278]: Accepted publickey for core from 147.75.109.163 port 40414 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:52:50.037891 sshd-session[9278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:52:50.044179 systemd-logind[1554]: New session 9 of user core.
May 15 12:52:50.048467 systemd[1]: Started session-9.scope - Session 9 of User core.
May 15 12:52:50.813313 sshd[9280]: Connection closed by 147.75.109.163 port 40414
May 15 12:52:50.814369 sshd-session[9278]: pam_unix(sshd:session): session closed for user core
May 15 12:52:50.819961 systemd[1]: sshd@48-37.27.185.109:22-147.75.109.163:40414.service: Deactivated successfully.
May 15 12:52:50.821888 systemd[1]: session-9.scope: Deactivated successfully.
May 15 12:52:50.823032 systemd-logind[1554]: Session 9 logged out. Waiting for processes to exit.
May 15 12:52:50.824267 systemd-logind[1554]: Removed session 9.
May 15 12:52:55.982658 systemd[1]: Started sshd@49-37.27.185.109:22-147.75.109.163:40426.service - OpenSSH per-connection server daemon (147.75.109.163:40426).
May 15 12:52:56.999082 sshd[9295]: Accepted publickey for core from 147.75.109.163 port 40426 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:52:57.000760 sshd-session[9295]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:52:57.006488 systemd-logind[1554]: New session 10 of user core.
May 15 12:52:57.012513 systemd[1]: Started session-10.scope - Session 10 of User core.
May 15 12:52:57.727602 sshd[9297]: Connection closed by 147.75.109.163 port 40426
May 15 12:52:57.729004 sshd-session[9295]: pam_unix(sshd:session): session closed for user core
May 15 12:52:57.733748 systemd[1]: sshd@49-37.27.185.109:22-147.75.109.163:40426.service: Deactivated successfully.
May 15 12:52:57.735316 systemd[1]: session-10.scope: Deactivated successfully.
May 15 12:52:57.736064 systemd-logind[1554]: Session 10 logged out. Waiting for processes to exit.
May 15 12:52:57.737688 systemd-logind[1554]: Removed session 10.
May 15 12:53:02.896226 systemd[1]: Started sshd@50-37.27.185.109:22-147.75.109.163:58630.service - OpenSSH per-connection server daemon (147.75.109.163:58630).
May 15 12:53:03.882297 sshd[9313]: Accepted publickey for core from 147.75.109.163 port 58630 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:03.883689 sshd-session[9313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:03.887949 systemd-logind[1554]: New session 11 of user core.
May 15 12:53:03.893457 systemd[1]: Started session-11.scope - Session 11 of User core.
May 15 12:53:04.617231 sshd[9315]: Connection closed by 147.75.109.163 port 58630
May 15 12:53:04.617801 sshd-session[9313]: pam_unix(sshd:session): session closed for user core
May 15 12:53:04.621203 systemd-logind[1554]: Session 11 logged out. Waiting for processes to exit.
May 15 12:53:04.621442 systemd[1]: sshd@50-37.27.185.109:22-147.75.109.163:58630.service: Deactivated successfully.
May 15 12:53:04.623181 systemd[1]: session-11.scope: Deactivated successfully.
May 15 12:53:04.625059 systemd-logind[1554]: Removed session 11.
May 15 12:53:04.785092 systemd[1]: Started sshd@51-37.27.185.109:22-147.75.109.163:58642.service - OpenSSH per-connection server daemon (147.75.109.163:58642).
May 15 12:53:05.765862 sshd[9328]: Accepted publickey for core from 147.75.109.163 port 58642 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:05.767138 sshd-session[9328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:05.772151 systemd-logind[1554]: New session 12 of user core.
May 15 12:53:05.775533 systemd[1]: Started session-12.scope - Session 12 of User core.
May 15 12:53:06.532380 sshd[9330]: Connection closed by 147.75.109.163 port 58642
May 15 12:53:06.532705 sshd-session[9328]: pam_unix(sshd:session): session closed for user core
May 15 12:53:06.537577 systemd-logind[1554]: Session 12 logged out. Waiting for processes to exit.
May 15 12:53:06.538288 systemd[1]: sshd@51-37.27.185.109:22-147.75.109.163:58642.service: Deactivated successfully.
May 15 12:53:06.540249 systemd[1]: session-12.scope: Deactivated successfully.
May 15 12:53:06.542011 systemd-logind[1554]: Removed session 12.
May 15 12:53:06.700903 systemd[1]: Started sshd@52-37.27.185.109:22-147.75.109.163:58648.service - OpenSSH per-connection server daemon (147.75.109.163:58648).
May 15 12:53:07.680655 sshd[9340]: Accepted publickey for core from 147.75.109.163 port 58648 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:07.682011 sshd-session[9340]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:07.687525 systemd-logind[1554]: New session 13 of user core.
May 15 12:53:07.693593 systemd[1]: Started session-13.scope - Session 13 of User core.
May 15 12:53:08.408705 containerd[1571]: time="2025-05-15T12:53:08.408558494Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"da46d46f07f24f0bb1f0fbad33866a0333338384a299d8a25351338eb18bae6d\" pid:9362 exited_at:{seconds:1747313588 nanos:407957246}"
May 15 12:53:08.473900 sshd[9342]: Connection closed by 147.75.109.163 port 58648
May 15 12:53:08.478452 sshd-session[9340]: pam_unix(sshd:session): session closed for user core
May 15 12:53:08.486950 systemd[1]: sshd@52-37.27.185.109:22-147.75.109.163:58648.service: Deactivated successfully.
May 15 12:53:08.488943 systemd[1]: session-13.scope: Deactivated successfully.
May 15 12:53:08.491906 systemd-logind[1554]: Session 13 logged out. Waiting for processes to exit.
May 15 12:53:08.493387 systemd-logind[1554]: Removed session 13.
May 15 12:53:12.314534 containerd[1571]: time="2025-05-15T12:53:12.314495453Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"6f27222b8f09440c4311626c04b7a8c367963693272706e639c6169554463579\" pid:9393 exited_at:{seconds:1747313592 nanos:314062110}"
May 15 12:53:13.645041 systemd[1]: Started sshd@53-37.27.185.109:22-147.75.109.163:59842.service - OpenSSH per-connection server daemon (147.75.109.163:59842).
May 15 12:53:14.654568 sshd[9405]: Accepted publickey for core from 147.75.109.163 port 59842 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:14.656121 sshd-session[9405]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:14.661529 systemd-logind[1554]: New session 14 of user core.
May 15 12:53:14.666494 systemd[1]: Started session-14.scope - Session 14 of User core.
May 15 12:53:15.399736 sshd[9407]: Connection closed by 147.75.109.163 port 59842
May 15 12:53:15.400432 sshd-session[9405]: pam_unix(sshd:session): session closed for user core
May 15 12:53:15.403868 systemd[1]: sshd@53-37.27.185.109:22-147.75.109.163:59842.service: Deactivated successfully.
May 15 12:53:15.406080 systemd[1]: session-14.scope: Deactivated successfully.
May 15 12:53:15.408030 systemd-logind[1554]: Session 14 logged out. Waiting for processes to exit.
May 15 12:53:15.409631 systemd-logind[1554]: Removed session 14.
May 15 12:53:20.567169 systemd[1]: Started sshd@54-37.27.185.109:22-147.75.109.163:52656.service - OpenSSH per-connection server daemon (147.75.109.163:52656).
May 15 12:53:21.545050 sshd[9419]: Accepted publickey for core from 147.75.109.163 port 52656 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:21.546433 sshd-session[9419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:21.552180 systemd-logind[1554]: New session 15 of user core.
May 15 12:53:21.560635 systemd[1]: Started session-15.scope - Session 15 of User core.
May 15 12:53:22.292467 sshd[9421]: Connection closed by 147.75.109.163 port 52656
May 15 12:53:22.293092 sshd-session[9419]: pam_unix(sshd:session): session closed for user core
May 15 12:53:22.296843 systemd[1]: sshd@54-37.27.185.109:22-147.75.109.163:52656.service: Deactivated successfully.
May 15 12:53:22.299289 systemd[1]: session-15.scope: Deactivated successfully.
May 15 12:53:22.300205 systemd-logind[1554]: Session 15 logged out. Waiting for processes to exit.
May 15 12:53:22.301444 systemd-logind[1554]: Removed session 15.
May 15 12:53:24.657039 systemd[1]: Started sshd@55-37.27.185.109:22-219.127.7.87:47317.service - OpenSSH per-connection server daemon (219.127.7.87:47317).
May 15 12:53:26.334023 sshd[9441]: Invalid user lilian from 219.127.7.87 port 47317
May 15 12:53:26.651695 sshd[9441]: Received disconnect from 219.127.7.87 port 47317:11: Bye Bye [preauth]
May 15 12:53:26.651695 sshd[9441]: Disconnected from invalid user lilian 219.127.7.87 port 47317 [preauth]
May 15 12:53:26.653610 systemd[1]: sshd@55-37.27.185.109:22-219.127.7.87:47317.service: Deactivated successfully.
May 15 12:53:27.465536 systemd[1]: Started sshd@56-37.27.185.109:22-147.75.109.163:52666.service - OpenSSH per-connection server daemon (147.75.109.163:52666).
May 15 12:53:28.443005 sshd[9446]: Accepted publickey for core from 147.75.109.163 port 52666 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:28.444701 sshd-session[9446]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:28.450140 systemd-logind[1554]: New session 16 of user core.
May 15 12:53:28.454494 systemd[1]: Started session-16.scope - Session 16 of User core.
May 15 12:53:29.183927 sshd[9448]: Connection closed by 147.75.109.163 port 52666
May 15 12:53:29.184683 sshd-session[9446]: pam_unix(sshd:session): session closed for user core
May 15 12:53:29.188936 systemd[1]: sshd@56-37.27.185.109:22-147.75.109.163:52666.service: Deactivated successfully.
May 15 12:53:29.191225 systemd[1]: session-16.scope: Deactivated successfully.
May 15 12:53:29.192437 systemd-logind[1554]: Session 16 logged out. Waiting for processes to exit.
May 15 12:53:29.194597 systemd-logind[1554]: Removed session 16.
May 15 12:53:34.241559 systemd[1]: Started sshd@57-37.27.185.109:22-49.64.242.249:38587.service - OpenSSH per-connection server daemon (49.64.242.249:38587).
May 15 12:53:34.355382 systemd[1]: Started sshd@58-37.27.185.109:22-147.75.109.163:53482.service - OpenSSH per-connection server daemon (147.75.109.163:53482).
May 15 12:53:35.349462 sshd[9464]: Accepted publickey for core from 147.75.109.163 port 53482 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:35.350908 sshd-session[9464]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:35.355934 systemd-logind[1554]: New session 17 of user core.
May 15 12:53:35.358489 systemd[1]: Started session-17.scope - Session 17 of User core.
May 15 12:53:36.083221 sshd[9466]: Connection closed by 147.75.109.163 port 53482
May 15 12:53:36.083832 sshd-session[9464]: pam_unix(sshd:session): session closed for user core
May 15 12:53:36.086827 systemd[1]: sshd@58-37.27.185.109:22-147.75.109.163:53482.service: Deactivated successfully.
May 15 12:53:36.088441 systemd[1]: session-17.scope: Deactivated successfully.
May 15 12:53:36.089676 systemd-logind[1554]: Session 17 logged out. Waiting for processes to exit.
May 15 12:53:36.091489 systemd-logind[1554]: Removed session 17.
May 15 12:53:36.253713 systemd[1]: Started sshd@59-37.27.185.109:22-147.75.109.163:53486.service - OpenSSH per-connection server daemon (147.75.109.163:53486).
May 15 12:53:37.228991 sshd[9477]: Accepted publickey for core from 147.75.109.163 port 53486 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:37.230239 sshd-session[9477]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:37.234822 systemd-logind[1554]: New session 18 of user core.
May 15 12:53:37.240468 systemd[1]: Started session-18.scope - Session 18 of User core.
May 15 12:53:38.181225 sshd[9479]: Connection closed by 147.75.109.163 port 53486
May 15 12:53:38.194500 sshd-session[9477]: pam_unix(sshd:session): session closed for user core
May 15 12:53:38.206060 systemd-logind[1554]: Session 18 logged out. Waiting for processes to exit.
May 15 12:53:38.206285 systemd[1]: sshd@59-37.27.185.109:22-147.75.109.163:53486.service: Deactivated successfully.
May 15 12:53:38.211280 systemd[1]: session-18.scope: Deactivated successfully.
May 15 12:53:38.213986 systemd-logind[1554]: Removed session 18.
May 15 12:53:38.350970 systemd[1]: Started sshd@60-37.27.185.109:22-147.75.109.163:34194.service - OpenSSH per-connection server daemon (147.75.109.163:34194).
May 15 12:53:38.451254 containerd[1571]: time="2025-05-15T12:53:38.450842228Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"5882da632c39eca5a6ecc97edbe223769fa9a6f79933086afd8b09ae60a5e925\" pid:9502 exited_at:{seconds:1747313618 nanos:450119291}"
May 15 12:53:39.379310 sshd[9490]: Accepted publickey for core from 147.75.109.163 port 34194 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:39.381083 sshd-session[9490]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:39.386693 systemd-logind[1554]: New session 19 of user core.
May 15 12:53:39.390672 systemd[1]: Started session-19.scope - Session 19 of User core.
May 15 12:53:41.947663 sshd[9516]: Connection closed by 147.75.109.163 port 34194
May 15 12:53:41.949118 sshd-session[9490]: pam_unix(sshd:session): session closed for user core
May 15 12:53:41.955235 systemd-logind[1554]: Session 19 logged out. Waiting for processes to exit.
May 15 12:53:41.956624 systemd[1]: sshd@60-37.27.185.109:22-147.75.109.163:34194.service: Deactivated successfully.
May 15 12:53:41.958907 systemd[1]: session-19.scope: Deactivated successfully.
May 15 12:53:41.959123 systemd[1]: session-19.scope: Consumed 460ms CPU time, 77.7M memory peak.
May 15 12:53:41.961067 systemd-logind[1554]: Removed session 19.
May 15 12:53:42.116845 systemd[1]: Started sshd@61-37.27.185.109:22-147.75.109.163:34202.service - OpenSSH per-connection server daemon (147.75.109.163:34202).
May 15 12:53:42.413713 containerd[1571]: time="2025-05-15T12:53:42.413667012Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"ecda83f48abb88d635d417a74563dd251fd671d7c641886841d7768510f6fb23\" pid:9561 exited_at:{seconds:1747313622 nanos:397449767}"
May 15 12:53:42.414609 containerd[1571]: time="2025-05-15T12:53:42.414086760Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"ab08e126f8b25f15df0494aad1bd42a9c83ed4bfd94496328763601eebebabb5\" pid:9559 exited_at:{seconds:1747313622 nanos:412873874}"
May 15 12:53:43.049419 systemd[1]: Started sshd@62-37.27.185.109:22-75.159.86.94:39104.service - OpenSSH per-connection server daemon (75.159.86.94:39104).
May 15 12:53:43.120228 sshd[9533]: Accepted publickey for core from 147.75.109.163 port 34202 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:43.122007 sshd-session[9533]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:43.127841 systemd-logind[1554]: New session 20 of user core.
May 15 12:53:43.135498 systemd[1]: Started session-20.scope - Session 20 of User core.
May 15 12:53:44.120922 sshd[9580]: Connection closed by 147.75.109.163 port 34202
May 15 12:53:44.121529 sshd-session[9533]: pam_unix(sshd:session): session closed for user core
May 15 12:53:44.124626 systemd[1]: sshd@61-37.27.185.109:22-147.75.109.163:34202.service: Deactivated successfully.
May 15 12:53:44.126484 systemd[1]: session-20.scope: Deactivated successfully.
May 15 12:53:44.127514 systemd-logind[1554]: Session 20 logged out. Waiting for processes to exit.
May 15 12:53:44.129629 systemd-logind[1554]: Removed session 20.
May 15 12:53:44.288571 systemd[1]: Started sshd@63-37.27.185.109:22-147.75.109.163:34204.service - OpenSSH per-connection server daemon (147.75.109.163:34204).
May 15 12:53:45.158271 sshd[9579]: Invalid user adam from 75.159.86.94 port 39104
May 15 12:53:45.271439 sshd[9591]: Accepted publickey for core from 147.75.109.163 port 34204 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:45.272904 sshd-session[9591]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:45.277837 systemd-logind[1554]: New session 21 of user core.
May 15 12:53:45.285467 systemd[1]: Started session-21.scope - Session 21 of User core.
May 15 12:53:45.654000 sshd-session[9594]: pam_faillock(sshd:auth): User unknown
May 15 12:53:45.656624 sshd[9579]: Postponed keyboard-interactive for invalid user adam from 75.159.86.94 port 39104 ssh2 [preauth]
May 15 12:53:46.031361 sshd[9593]: Connection closed by 147.75.109.163 port 34204
May 15 12:53:46.031565 sshd-session[9591]: pam_unix(sshd:session): session closed for user core
May 15 12:53:46.036017 systemd-logind[1554]: Session 21 logged out. Waiting for processes to exit.
May 15 12:53:46.037030 systemd[1]: sshd@63-37.27.185.109:22-147.75.109.163:34204.service: Deactivated successfully.
May 15 12:53:46.040267 systemd[1]: session-21.scope: Deactivated successfully.
May 15 12:53:46.043066 systemd-logind[1554]: Removed session 21.
May 15 12:53:46.120717 sshd-session[9594]: pam_unix(sshd:auth): check pass; user unknown
May 15 12:53:46.120743 sshd-session[9594]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=75.159.86.94
May 15 12:53:46.121612 sshd-session[9594]: pam_faillock(sshd:auth): User unknown
May 15 12:53:48.445039 sshd[9579]: PAM: Permission denied for illegal user adam from 75.159.86.94
May 15 12:53:48.445575 sshd[9579]: Failed keyboard-interactive/pam for invalid user adam from 75.159.86.94 port 39104 ssh2
May 15 12:53:49.015196 sshd[9579]: Connection closed by invalid user adam 75.159.86.94 port 39104 [preauth]
May 15 12:53:49.017834 systemd[1]: sshd@62-37.27.185.109:22-75.159.86.94:39104.service: Deactivated successfully.
May 15 12:53:51.203378 systemd[1]: Started sshd@64-37.27.185.109:22-147.75.109.163:34038.service - OpenSSH per-connection server daemon (147.75.109.163:34038).
May 15 12:53:52.184950 sshd[9611]: Accepted publickey for core from 147.75.109.163 port 34038 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:52.186459 sshd-session[9611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:52.191584 systemd-logind[1554]: New session 22 of user core.
May 15 12:53:52.196477 systemd[1]: Started session-22.scope - Session 22 of User core.
May 15 12:53:52.917454 sshd[9613]: Connection closed by 147.75.109.163 port 34038
May 15 12:53:52.917931 sshd-session[9611]: pam_unix(sshd:session): session closed for user core
May 15 12:53:52.921465 systemd-logind[1554]: Session 22 logged out. Waiting for processes to exit.
May 15 12:53:52.921798 systemd[1]: sshd@64-37.27.185.109:22-147.75.109.163:34038.service: Deactivated successfully.
May 15 12:53:52.923541 systemd[1]: session-22.scope: Deactivated successfully.
May 15 12:53:52.924816 systemd-logind[1554]: Removed session 22.
May 15 12:53:58.085133 systemd[1]: Started sshd@65-37.27.185.109:22-147.75.109.163:34050.service - OpenSSH per-connection server daemon (147.75.109.163:34050).
May 15 12:53:59.072946 sshd[9625]: Accepted publickey for core from 147.75.109.163 port 34050 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo
May 15 12:53:59.074206 sshd-session[9625]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 12:53:59.078388 systemd-logind[1554]: New session 23 of user core.
May 15 12:53:59.083462 systemd[1]: Started session-23.scope - Session 23 of User core.
May 15 12:53:59.798635 sshd[9627]: Connection closed by 147.75.109.163 port 34050
May 15 12:53:59.799162 sshd-session[9625]: pam_unix(sshd:session): session closed for user core
May 15 12:53:59.802836 systemd-logind[1554]: Session 23 logged out. Waiting for processes to exit.
May 15 12:53:59.802905 systemd[1]: sshd@65-37.27.185.109:22-147.75.109.163:34050.service: Deactivated successfully.
May 15 12:53:59.804651 systemd[1]: session-23.scope: Deactivated successfully.
May 15 12:53:59.806542 systemd-logind[1554]: Removed session 23.
May 15 12:54:02.169561 systemd[1]: Started sshd@66-37.27.185.109:22-107.175.33.240:57056.service - OpenSSH per-connection server daemon (107.175.33.240:57056).
May 15 12:54:02.865914 sshd[9653]: Received disconnect from 107.175.33.240 port 57056:11: Bye Bye [preauth]
May 15 12:54:02.865914 sshd[9653]: Disconnected from authenticating user root 107.175.33.240 port 57056 [preauth]
May 15 12:54:02.868992 systemd[1]: sshd@66-37.27.185.109:22-107.175.33.240:57056.service: Deactivated successfully.
May 15 12:54:08.383995 containerd[1571]: time="2025-05-15T12:54:08.383941385Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed1481d289169893c705bb6de4c65684fa5e1d3a23fbac3df93368edfc546718\" id:\"93e16d6f53f4bb9b7a3c4c0b59be0d34ed9ce0fc78dbab282e124cf7fb4084fc\" pid:9669 exited_at:{seconds:1747313648 nanos:383626404}"
May 15 12:54:12.316747 containerd[1571]: time="2025-05-15T12:54:12.316694254Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5342e7dd1bd24d897e2feeb7539eea80200cf2fde44f0e24b2063d92dd9bd119\" id:\"6a2e23700806c65ba239ed535c6f3421f49e6dd344ef548a72d3f1eeca96b633\" pid:9692 exit_status:1 exited_at:{seconds:1747313652 nanos:316082115}"
May 15 12:54:14.551703 systemd[1]: Started sshd@67-37.27.185.109:22-219.127.7.87:55882.service - OpenSSH per-connection server daemon (219.127.7.87:55882).
May 15 12:54:15.455407 systemd[1]: cri-containerd-e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89.scope: Deactivated successfully.
May 15 12:54:15.455696 systemd[1]: cri-containerd-e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89.scope: Consumed 10.330s CPU time, 86.8M memory peak, 74.3M read from disk.
May 15 12:54:15.520799 containerd[1571]: time="2025-05-15T12:54:15.520764573Z" level=info msg="received exit event container_id:\"e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89\" id:\"e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89\" pid:3060 exit_status:1 exited_at:{seconds:1747313655 nanos:515166876}"
May 15 12:54:15.526746 containerd[1571]: time="2025-05-15T12:54:15.526626594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89\" id:\"e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89\" pid:3060 exit_status:1 exited_at:{seconds:1747313655 nanos:515166876}"
May 15 12:54:15.588966 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89-rootfs.mount: Deactivated successfully.
May 15 12:54:15.740705 systemd[1]: cri-containerd-216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7.scope: Deactivated successfully.
May 15 12:54:15.741261 systemd[1]: cri-containerd-216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7.scope: Consumed 5.405s CPU time, 67.4M memory peak, 48.1M read from disk.
May 15 12:54:15.745142 containerd[1571]: time="2025-05-15T12:54:15.744285413Z" level=info msg="received exit event container_id:\"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\" id:\"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\" pid:3558 exit_status:1 exited_at:{seconds:1747313655 nanos:744002411}"
May 15 12:54:15.745142 containerd[1571]: time="2025-05-15T12:54:15.745109379Z" level=info msg="TaskExit event in podsandbox handler container_id:\"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\" id:\"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\" pid:3558 exit_status:1 exited_at:{seconds:1747313655 nanos:744002411}"
May 15 12:54:15.769584 kubelet[3207]: E0515 12:54:15.769557 3207 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:45728->10.0.0.2:2379: read: connection timed out"
May 15 12:54:15.770020 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7-rootfs.mount: Deactivated successfully.
May 15 12:54:16.089032 kubelet[3207]: I0515 12:54:16.088731 3207 scope.go:117] "RemoveContainer" containerID="216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7"
May 15 12:54:16.093279 kubelet[3207]: I0515 12:54:16.092968 3207 scope.go:117] "RemoveContainer" containerID="e823510bedc9ed0e482b1d3eb7d8a8f7b1eef33bcbb7e7fea24183a83e878d89"
May 15 12:54:16.163268 containerd[1571]: time="2025-05-15T12:54:16.163103163Z" level=info msg="CreateContainer within sandbox \"93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
May 15 12:54:16.169793 containerd[1571]: time="2025-05-15T12:54:16.169682862Z" level=info msg="CreateContainer within sandbox \"e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
May 15 12:54:16.223348 containerd[1571]: time="2025-05-15T12:54:16.221856139Z" level=info msg="Container ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650: CDI devices from CRI Config.CDIDevices: []"
May 15 12:54:16.242456 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount395779115.mount: Deactivated successfully.
May 15 12:54:16.245349 containerd[1571]: time="2025-05-15T12:54:16.245051753Z" level=info msg="Container 2ead307e5fe8bb41f1cf1be3f73b09e8761123248c744e80296bbf7e5d15205c: CDI devices from CRI Config.CDIDevices: []"
May 15 12:54:16.254809 containerd[1571]: time="2025-05-15T12:54:16.254723073Z" level=info msg="CreateContainer within sandbox \"e9e748df1d7c8e825c55595cf6843ddb00fcf7fa8dfc3eed2501ddd8d80be070\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650\""
May 15 12:54:16.256152 containerd[1571]: time="2025-05-15T12:54:16.256129403Z" level=info msg="CreateContainer within sandbox \"93df93f6913f00e9171e7c72c9f70e71edfb30442efc3c36514f5c6dddec91d1\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"2ead307e5fe8bb41f1cf1be3f73b09e8761123248c744e80296bbf7e5d15205c\""
May 15 12:54:16.259063 containerd[1571]: time="2025-05-15T12:54:16.259043913Z" level=info msg="StartContainer for \"2ead307e5fe8bb41f1cf1be3f73b09e8761123248c744e80296bbf7e5d15205c\""
May 15 12:54:16.259413 sshd[9709]: Invalid user panda from 219.127.7.87 port 55882
May 15 12:54:16.260789 containerd[1571]: time="2025-05-15T12:54:16.260725739Z" level=info msg="connecting to shim 2ead307e5fe8bb41f1cf1be3f73b09e8761123248c744e80296bbf7e5d15205c" address="unix:///run/containerd/s/94c6bed3e700ed41d1b9d21c64b43a89fff4839b4637ebe9bd28f031af5d014c" protocol=ttrpc version=3
May 15 12:54:16.262091 containerd[1571]: time="2025-05-15T12:54:16.261222312Z" level=info msg="StartContainer for \"ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650\""
May 15 12:54:16.262091 containerd[1571]: time="2025-05-15T12:54:16.261787732Z" level=info msg="connecting to shim ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650" address="unix:///run/containerd/s/bc47da3063820745c9afbeb2e834ec1e43cd199b236a7022c9e2cc938278fecb" protocol=ttrpc version=3
May 15 12:54:16.317530 systemd[1]: Started cri-containerd-ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650.scope - libcontainer container ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650.
May 15 12:54:16.332448 systemd[1]: Started cri-containerd-2ead307e5fe8bb41f1cf1be3f73b09e8761123248c744e80296bbf7e5d15205c.scope - libcontainer container 2ead307e5fe8bb41f1cf1be3f73b09e8761123248c744e80296bbf7e5d15205c.
May 15 12:54:16.389146 containerd[1571]: time="2025-05-15T12:54:16.388277937Z" level=info msg="StartContainer for \"ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650\" returns successfully"
May 15 12:54:16.419147 containerd[1571]: time="2025-05-15T12:54:16.419094052Z" level=info msg="StartContainer for \"2ead307e5fe8bb41f1cf1be3f73b09e8761123248c744e80296bbf7e5d15205c\" returns successfully"
May 15 12:54:16.583809 sshd[9709]: Received disconnect from 219.127.7.87 port 55882:11: Bye Bye [preauth]
May 15 12:54:16.583809 sshd[9709]: Disconnected from invalid user panda 219.127.7.87 port 55882 [preauth]
May 15 12:54:16.585880 systemd[1]: sshd@67-37.27.185.109:22-219.127.7.87:55882.service: Deactivated successfully.
May 15 12:54:20.208063 kubelet[3207]: E0515 12:54:20.201679 3207 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:45552->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4334-0-0-a-dce95649a9.183fb481ba2151f4 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4334-0-0-a-dce95649a9,UID:0a1b449f5386d2bcc340ff278fe6764c,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4334-0-0-a-dce95649a9,},FirstTimestamp:2025-05-15 12:54:09.7095685 +0000 UTC m=+1136.669367375,LastTimestamp:2025-05-15 12:54:09.7095685 +0000 UTC m=+1136.669367375,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334-0-0-a-dce95649a9,}"
May 15 12:54:20.884503 systemd[1]: cri-containerd-42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91.scope: Deactivated successfully.
May 15 12:54:20.884829 systemd[1]: cri-containerd-42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91.scope: Consumed 2.564s CPU time, 37M memory peak, 39.9M read from disk.
May 15 12:54:20.887646 containerd[1571]: time="2025-05-15T12:54:20.887423080Z" level=info msg="received exit event container_id:\"42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91\" id:\"42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91\" pid:3050 exit_status:1 exited_at:{seconds:1747313660 nanos:887016367}"
May 15 12:54:20.889052 containerd[1571]: time="2025-05-15T12:54:20.887793296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91\" id:\"42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91\" pid:3050 exit_status:1 exited_at:{seconds:1747313660 nanos:887016367}"
May 15 12:54:20.912530 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91-rootfs.mount: Deactivated successfully.
May 15 12:54:21.100693 kubelet[3207]: I0515 12:54:21.100662 3207 scope.go:117] "RemoveContainer" containerID="42c06d99f30ace834b14668a80883ed210238f4267dcb393331e444c50366d91"
May 15 12:54:21.102910 containerd[1571]: time="2025-05-15T12:54:21.102849309Z" level=info msg="CreateContainer within sandbox \"917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
May 15 12:54:21.112352 containerd[1571]: time="2025-05-15T12:54:21.111205151Z" level=info msg="Container ba93bcace0105ff57fc87977b937b04d200c117b84f7ec4d701282c3e393f4cf: CDI devices from CRI Config.CDIDevices: []"
May 15 12:54:21.119039 containerd[1571]: time="2025-05-15T12:54:21.118996344Z" level=info msg="CreateContainer within sandbox \"917876ebd87e8b4a8b91df0799278ea36dd6037c26a39232aaeab1c0fa6a935b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"ba93bcace0105ff57fc87977b937b04d200c117b84f7ec4d701282c3e393f4cf\""
May 15 12:54:21.119148 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount942505679.mount: Deactivated successfully.
May 15 12:54:21.120044 containerd[1571]: time="2025-05-15T12:54:21.119897925Z" level=info msg="StartContainer for \"ba93bcace0105ff57fc87977b937b04d200c117b84f7ec4d701282c3e393f4cf\""
May 15 12:54:21.121511 containerd[1571]: time="2025-05-15T12:54:21.121475837Z" level=info msg="connecting to shim ba93bcace0105ff57fc87977b937b04d200c117b84f7ec4d701282c3e393f4cf" address="unix:///run/containerd/s/88b5e5f248e809be7b24b73355e47d65f0f5950bd0ff510890737c95f6125085" protocol=ttrpc version=3
May 15 12:54:21.145465 systemd[1]: Started cri-containerd-ba93bcace0105ff57fc87977b937b04d200c117b84f7ec4d701282c3e393f4cf.scope - libcontainer container ba93bcace0105ff57fc87977b937b04d200c117b84f7ec4d701282c3e393f4cf.
May 15 12:54:21.192849 containerd[1571]: time="2025-05-15T12:54:21.192811368Z" level=info msg="StartContainer for \"ba93bcace0105ff57fc87977b937b04d200c117b84f7ec4d701282c3e393f4cf\" returns successfully"
May 15 12:54:21.354602 systemd[1]: cri-containerd-ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650.scope: Deactivated successfully.
May 15 12:54:21.354962 containerd[1571]: time="2025-05-15T12:54:21.354898622Z" level=info msg="received exit event container_id:\"ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650\" id:\"ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650\" pid:9759 exit_status:1 exited_at:{seconds:1747313661 nanos:354647930}"
May 15 12:54:21.355097 systemd[1]: cri-containerd-ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650.scope: Consumed 71ms CPU time, 40.1M memory peak, 28.4M read from disk.
May 15 12:54:21.356434 containerd[1571]: time="2025-05-15T12:54:21.355533333Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650\" id:\"ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650\" pid:9759 exit_status:1 exited_at:{seconds:1747313661 nanos:354647930}"
May 15 12:54:21.382478 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650-rootfs.mount: Deactivated successfully.
May 15 12:54:22.109874 kubelet[3207]: I0515 12:54:22.109807 3207 scope.go:117] "RemoveContainer" containerID="216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7"
May 15 12:54:22.110707 kubelet[3207]: I0515 12:54:22.110676 3207 scope.go:117] "RemoveContainer" containerID="ce0973d6e7dd83b9b700ed56a7728a5d83eb5bcf9ee56218ca9102bf7bd90650"
May 15 12:54:22.113783 containerd[1571]: time="2025-05-15T12:54:22.113736225Z" level=info msg="RemoveContainer for \"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\""
May 15 12:54:22.129364 containerd[1571]: time="2025-05-15T12:54:22.128454098Z" level=info msg="RemoveContainer for \"216d20dedc13f4de9ae415f3572568a64266231433b4b01a5537118c810d6cd7\" returns successfully"
May 15 12:54:22.156585 kubelet[3207]: E0515 12:54:22.138104 3207 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-797db67f8-pv7m8_tigera-operator(582e964a-88e0-4f60-96d1-99730ced53cd)\"" pod="tigera-operator/tigera-operator-797db67f8-pv7m8" podUID="582e964a-88e0-4f60-96d1-99730ced53cd"