May 27 17:55:46.926884 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 27 15:32:02 -00 2025
May 27 17:55:46.926929 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101
May 27 17:55:46.926948 kernel: BIOS-provided physical RAM map:
May 27 17:55:46.926958 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
May 27 17:55:46.926967 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
May 27 17:55:46.926977 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
May 27 17:55:46.926988 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdbfff] usable
May 27 17:55:46.926999 kernel: BIOS-e820: [mem 0x000000007ffdc000-0x000000007fffffff] reserved
May 27 17:55:46.927008 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
May 27 17:55:46.927018 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
May 27 17:55:46.927033 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 27 17:55:46.927043 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
May 27 17:55:46.927053 kernel: NX (Execute Disable) protection: active
May 27 17:55:46.927063 kernel: APIC: Static calls initialized
May 27 17:55:46.927075 kernel: SMBIOS 2.8 present.
May 27 17:55:46.927086 kernel: DMI: Red Hat KVM/RHEL-AV, BIOS 1.13.0-2.module_el8.5.0+2608+72063365 04/01/2014
May 27 17:55:46.927101 kernel: DMI: Memory slots populated: 1/1
May 27 17:55:46.927112 kernel: Hypervisor detected: KVM
May 27 17:55:46.927123 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 27 17:55:46.927146 kernel: kvm-clock: using sched offset of 5494461534 cycles
May 27 17:55:46.927157 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 27 17:55:46.927168 kernel: tsc: Detected 2799.998 MHz processor
May 27 17:55:46.927178 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 27 17:55:46.927190 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 27 17:55:46.927200 kernel: last_pfn = 0x7ffdc max_arch_pfn = 0x400000000
May 27 17:55:46.927215 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
May 27 17:55:46.927239 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 27 17:55:46.927250 kernel: Using GB pages for direct mapping
May 27 17:55:46.927261 kernel: ACPI: Early table checksum verification disabled
May 27 17:55:46.927272 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
May 27 17:55:46.927283 kernel: ACPI: RSDT 0x000000007FFE47A5 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 17:55:46.927294 kernel: ACPI: FACP 0x000000007FFE438D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 27 17:55:46.927305 kernel: ACPI: DSDT 0x000000007FFDFD80 00460D (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 17:55:46.927316 kernel: ACPI: FACS 0x000000007FFDFD40 000040
May 27 17:55:46.927331 kernel: ACPI: APIC 0x000000007FFE4481 0000F0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 17:55:46.927342 kernel: ACPI: SRAT 0x000000007FFE4571 0001D0 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 17:55:46.927366 kernel: ACPI: MCFG 0x000000007FFE4741 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 17:55:46.927376 kernel: ACPI: WAET 0x000000007FFE477D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 17:55:46.927387 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe438d-0x7ffe4480]
May 27 17:55:46.927398 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffdfd80-0x7ffe438c]
May 27 17:55:46.927427 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffdfd40-0x7ffdfd7f]
May 27 17:55:46.927442 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe4481-0x7ffe4570]
May 27 17:55:46.927454 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe4571-0x7ffe4740]
May 27 17:55:46.927465 kernel: ACPI: Reserving MCFG table memory at [mem 0x7ffe4741-0x7ffe477c]
May 27 17:55:46.927477 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe477d-0x7ffe47a4]
May 27 17:55:46.927488 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
May 27 17:55:46.927499 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
May 27 17:55:46.927511 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x20800fffff] hotplug
May 27 17:55:46.927527 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdbfff] -> [mem 0x00001000-0x7ffdbfff]
May 27 17:55:46.927538 kernel: NODE_DATA(0) allocated [mem 0x7ffd4dc0-0x7ffdbfff]
May 27 17:55:46.927550 kernel: Zone ranges:
May 27 17:55:46.927562 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 27 17:55:46.927573 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdbfff]
May 27 17:55:46.927584 kernel: Normal empty
May 27 17:55:46.927596 kernel: Device empty
May 27 17:55:46.928391 kernel: Movable zone start for each node
May 27 17:55:46.928404 kernel: Early memory node ranges
May 27 17:55:46.928415 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
May 27 17:55:46.928432 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdbfff]
May 27 17:55:46.928444 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdbfff]
May 27 17:55:46.928455 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 27 17:55:46.928467 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
May 27 17:55:46.928490 kernel: On node 0, zone DMA32: 36 pages in unavailable ranges
May 27 17:55:46.928502 kernel: ACPI: PM-Timer IO Port: 0x608
May 27 17:55:46.928513 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 27 17:55:46.928525 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 27 17:55:46.928536 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 27 17:55:46.928553 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 27 17:55:46.928564 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 27 17:55:46.929584 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 27 17:55:46.929603 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 27 17:55:46.929615 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 27 17:55:46.929627 kernel: TSC deadline timer available
May 27 17:55:46.929638 kernel: CPU topo: Max. logical packages: 16
May 27 17:55:46.929650 kernel: CPU topo: Max. logical dies: 16
May 27 17:55:46.929661 kernel: CPU topo: Max. dies per package: 1
May 27 17:55:46.929679 kernel: CPU topo: Max. threads per core: 1
May 27 17:55:46.929690 kernel: CPU topo: Num. cores per package: 1
May 27 17:55:46.929702 kernel: CPU topo: Num. threads per package: 1
May 27 17:55:46.929713 kernel: CPU topo: Allowing 2 present CPUs plus 14 hotplug CPUs
May 27 17:55:46.929725 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 27 17:55:46.929737 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
May 27 17:55:46.929748 kernel: Booting paravirtualized kernel on KVM
May 27 17:55:46.929760 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 27 17:55:46.929772 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:16 nr_cpu_ids:16 nr_node_ids:1
May 27 17:55:46.929788 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u262144
May 27 17:55:46.929800 kernel: pcpu-alloc: s207832 r8192 d29736 u262144 alloc=1*2097152
May 27 17:55:46.929811 kernel: pcpu-alloc: [0] 00 01 02 03 04 05 06 07 [0] 08 09 10 11 12 13 14 15
May 27 17:55:46.929823 kernel: kvm-guest: PV spinlocks enabled
May 27 17:55:46.929835 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 27 17:55:46.929848 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101
May 27 17:55:46.929872 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 27 17:55:46.929884 kernel: random: crng init done
May 27 17:55:46.929901 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 27 17:55:46.929913 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
May 27 17:55:46.929924 kernel: Fallback order for Node 0: 0
May 27 17:55:46.929936 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524154
May 27 17:55:46.929947 kernel: Policy zone: DMA32
May 27 17:55:46.929959 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 27 17:55:46.929970 kernel: software IO TLB: area num 16.
May 27 17:55:46.929982 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=16, Nodes=1
May 27 17:55:46.929994 kernel: Kernel/User page tables isolation: enabled
May 27 17:55:46.930009 kernel: ftrace: allocating 40081 entries in 157 pages
May 27 17:55:46.930021 kernel: ftrace: allocated 157 pages with 5 groups
May 27 17:55:46.930033 kernel: Dynamic Preempt: voluntary
May 27 17:55:46.930044 kernel: rcu: Preemptible hierarchical RCU implementation.
May 27 17:55:46.930057 kernel: rcu: RCU event tracing is enabled.
May 27 17:55:46.930069 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=16.
May 27 17:55:46.930080 kernel: Trampoline variant of Tasks RCU enabled.
May 27 17:55:46.930092 kernel: Rude variant of Tasks RCU enabled.
May 27 17:55:46.930103 kernel: Tracing variant of Tasks RCU enabled.
May 27 17:55:46.930115 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 27 17:55:46.930135 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=16
May 27 17:55:46.930147 kernel: RCU Tasks: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
May 27 17:55:46.930171 kernel: RCU Tasks Rude: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
May 27 17:55:46.930182 kernel: RCU Tasks Trace: Setting shift to 4 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=16.
May 27 17:55:46.930193 kernel: NR_IRQS: 33024, nr_irqs: 552, preallocated irqs: 16
May 27 17:55:46.930204 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 27 17:55:46.930227 kernel: Console: colour VGA+ 80x25
May 27 17:55:46.930239 kernel: printk: legacy console [tty0] enabled
May 27 17:55:46.930251 kernel: printk: legacy console [ttyS0] enabled
May 27 17:55:46.930262 kernel: ACPI: Core revision 20240827
May 27 17:55:46.930274 kernel: APIC: Switch to symmetric I/O mode setup
May 27 17:55:46.930289 kernel: x2apic enabled
May 27 17:55:46.930301 kernel: APIC: Switched APIC routing to: physical x2apic
May 27 17:55:46.930313 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
May 27 17:55:46.930325 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
May 27 17:55:46.930337 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 27 17:55:46.930365 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
May 27 17:55:46.930376 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
May 27 17:55:46.930388 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 27 17:55:46.930399 kernel: Spectre V2 : Mitigation: Retpolines
May 27 17:55:46.930410 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 27 17:55:46.930422 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
May 27 17:55:46.930433 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 27 17:55:46.930444 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 27 17:55:46.930455 kernel: MDS: Mitigation: Clear CPU buffers
May 27 17:55:46.930467 kernel: MMIO Stale Data: Unknown: No mitigations
May 27 17:55:46.930478 kernel: SRBDS: Unknown: Dependent on hypervisor status
May 27 17:55:46.930505 kernel: ITS: Mitigation: Aligned branch/return thunks
May 27 17:55:46.930516 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 27 17:55:46.930527 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 27 17:55:46.930538 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 27 17:55:46.930549 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 27 17:55:46.930559 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
May 27 17:55:46.930570 kernel: Freeing SMP alternatives memory: 32K
May 27 17:55:46.930581 kernel: pid_max: default: 32768 minimum: 301
May 27 17:55:46.930608 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 27 17:55:46.930632 kernel: landlock: Up and running.
May 27 17:55:46.930643 kernel: SELinux: Initializing.
May 27 17:55:46.930660 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 27 17:55:46.930671 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 27 17:55:46.930695 kernel: smpboot: CPU0: Intel Xeon E3-12xx v2 (Ivy Bridge, IBRS) (family: 0x6, model: 0x3a, stepping: 0x9)
May 27 17:55:46.930707 kernel: Performance Events: unsupported p6 CPU model 58 no PMU driver, software events only.
May 27 17:55:46.930719 kernel: signal: max sigframe size: 1776
May 27 17:55:46.930731 kernel: rcu: Hierarchical SRCU implementation.
May 27 17:55:46.930743 kernel: rcu: Max phase no-delay instances is 400.
May 27 17:55:46.930755 kernel: Timer migration: 2 hierarchy levels; 8 children per group; 2 crossnode level
May 27 17:55:46.930767 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
May 27 17:55:46.930782 kernel: smp: Bringing up secondary CPUs ...
May 27 17:55:46.930794 kernel: smpboot: x86: Booting SMP configuration:
May 27 17:55:46.930806 kernel: .... node #0, CPUs: #1
May 27 17:55:46.930818 kernel: smp: Brought up 1 node, 2 CPUs
May 27 17:55:46.930829 kernel: smpboot: Total of 2 processors activated (11199.99 BogoMIPS)
May 27 17:55:46.930854 kernel: Memory: 1895680K/2096616K available (14336K kernel code, 2430K rwdata, 9952K rodata, 54416K init, 2552K bss, 194920K reserved, 0K cma-reserved)
May 27 17:55:46.930878 kernel: devtmpfs: initialized
May 27 17:55:46.930890 kernel: x86/mm: Memory block size: 128MB
May 27 17:55:46.930902 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 27 17:55:46.930919 kernel: futex hash table entries: 4096 (order: 6, 262144 bytes, linear)
May 27 17:55:46.930932 kernel: pinctrl core: initialized pinctrl subsystem
May 27 17:55:46.930944 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 27 17:55:46.930956 kernel: audit: initializing netlink subsys (disabled)
May 27 17:55:46.930969 kernel: audit: type=2000 audit(1748368543.508:1): state=initialized audit_enabled=0 res=1
May 27 17:55:46.930980 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 27 17:55:46.930992 kernel: thermal_sys: Registered thermal governor 'user_space'
May 27 17:55:46.931005 kernel: cpuidle: using governor menu
May 27 17:55:46.931017 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 27 17:55:46.931033 kernel: dca service started, version 1.12.1
May 27 17:55:46.931045 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
May 27 17:55:46.931057 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry
May 27 17:55:46.931069 kernel: PCI: Using configuration type 1 for base access
May 27 17:55:46.931082 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 27 17:55:46.931094 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 27 17:55:46.931106 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 27 17:55:46.931118 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 27 17:55:46.931130 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 27 17:55:46.931147 kernel: ACPI: Added _OSI(Module Device)
May 27 17:55:46.931159 kernel: ACPI: Added _OSI(Processor Device)
May 27 17:55:46.931171 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 27 17:55:46.931183 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 27 17:55:46.931195 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 27 17:55:46.931207 kernel: ACPI: Interpreter enabled
May 27 17:55:46.931231 kernel: ACPI: PM: (supports S0 S5)
May 27 17:55:46.931243 kernel: ACPI: Using IOAPIC for interrupt routing
May 27 17:55:46.931255 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 27 17:55:46.931271 kernel: PCI: Using E820 reservations for host bridge windows
May 27 17:55:46.931283 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 27 17:55:46.931294 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 27 17:55:46.933629 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 27 17:55:46.933811 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
May 27 17:55:46.933981 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
May 27 17:55:46.934001 kernel: PCI host bridge to bus 0000:00
May 27 17:55:46.934187 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 27 17:55:46.934350 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 27 17:55:46.934489 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 27 17:55:46.934646 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xafffffff window]
May 27 17:55:46.934800 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
May 27 17:55:46.934969 kernel: pci_bus 0000:00: root bus resource [mem 0x20c0000000-0x28bfffffff window]
May 27 17:55:46.935107 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 27 17:55:46.935310 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
May 27 17:55:46.935492 kernel: pci 0000:00:01.0: [1013:00b8] type 00 class 0x030000 conventional PCI endpoint
May 27 17:55:46.937693 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfa000000-0xfbffffff pref]
May 27 17:55:46.937866 kernel: pci 0000:00:01.0: BAR 1 [mem 0xfea50000-0xfea50fff]
May 27 17:55:46.938028 kernel: pci 0000:00:01.0: ROM [mem 0xfea40000-0xfea4ffff pref]
May 27 17:55:46.938183 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 27 17:55:46.938364 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.938528 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea51000-0xfea51fff]
May 27 17:55:46.938727 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
May 27 17:55:46.938892 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
May 27 17:55:46.939048 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
May 27 17:55:46.939225 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.939380 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea52000-0xfea52fff]
May 27 17:55:46.939534 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
May 27 17:55:46.943499 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
May 27 17:55:46.943691 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
May 27 17:55:46.943895 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.944054 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea53000-0xfea53fff]
May 27 17:55:46.944237 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
May 27 17:55:46.944391 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
May 27 17:55:46.944543 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
May 27 17:55:46.944737 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.944907 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea54000-0xfea54fff]
May 27 17:55:46.945061 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
May 27 17:55:46.945214 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
May 27 17:55:46.945366 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
May 27 17:55:46.945535 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.946789 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea55000-0xfea55fff]
May 27 17:55:46.946980 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
May 27 17:55:46.947135 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
May 27 17:55:46.947289 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
May 27 17:55:46.947457 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.947632 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea56000-0xfea56fff]
May 27 17:55:46.947786 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
May 27 17:55:46.947953 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
May 27 17:55:46.948114 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
May 27 17:55:46.948275 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.948429 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea57000-0xfea57fff]
May 27 17:55:46.949650 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
May 27 17:55:46.949830 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
May 27 17:55:46.950003 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
May 27 17:55:46.950178 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
May 27 17:55:46.950373 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea58000-0xfea58fff]
May 27 17:55:46.950526 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
May 27 17:55:46.950696 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
May 27 17:55:46.950847 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
May 27 17:55:46.951023 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
May 27 17:55:46.951175 kernel: pci 0000:00:03.0: BAR 0 [io 0xc0c0-0xc0df]
May 27 17:55:46.951348 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfea59000-0xfea59fff]
May 27 17:55:46.951656 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
May 27 17:55:46.952653 kernel: pci 0000:00:03.0: ROM [mem 0xfea00000-0xfea3ffff pref]
May 27 17:55:46.952839 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
May 27 17:55:46.953024 kernel: pci 0000:00:04.0: BAR 0 [io 0xc000-0xc07f]
May 27 17:55:46.953189 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfea5a000-0xfea5afff]
May 27 17:55:46.956272 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfd004000-0xfd007fff 64bit pref]
May 27 17:55:46.956462 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
May 27 17:55:46.956651 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 27 17:55:46.956816 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
May 27 17:55:46.956980 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc0e0-0xc0ff]
May 27 17:55:46.957131 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea5b000-0xfea5bfff]
May 27 17:55:46.957292 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
May 27 17:55:46.957443 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
May 27 17:55:46.958308 kernel: pci 0000:01:00.0: [1b36:000e] type 01 class 0x060400 PCIe to PCI/PCI-X bridge
May 27 17:55:46.958471 kernel: pci 0000:01:00.0: BAR 0 [mem 0xfda00000-0xfda000ff 64bit]
May 27 17:55:46.958707 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
May 27 17:55:46.958884 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
May 27 17:55:46.959041 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
May 27 17:55:46.959211 kernel: pci_bus 0000:02: extended config space not accessible
May 27 17:55:46.959395 kernel: pci 0000:02:01.0: [8086:25ab] type 00 class 0x088000 conventional PCI endpoint
May 27 17:55:46.959593 kernel: pci 0000:02:01.0: BAR 0 [mem 0xfd800000-0xfd80000f]
May 27 17:55:46.959758 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
May 27 17:55:46.959940 kernel: pci 0000:03:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
May 27 17:55:46.960099 kernel: pci 0000:03:00.0: BAR 0 [mem 0xfe800000-0xfe803fff 64bit]
May 27 17:55:46.960253 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
May 27 17:55:46.960422 kernel: pci 0000:04:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
May 27 17:55:46.962648 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
May 27 17:55:46.962813 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
May 27 17:55:46.962983 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
May 27 17:55:46.963139 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
May 27 17:55:46.963293 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
May 27 17:55:46.963447 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
May 27 17:55:46.963629 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
May 27 17:55:46.963656 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 27 17:55:46.963669 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 27 17:55:46.963681 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 27 17:55:46.963694 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 27 17:55:46.963706 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 27 17:55:46.963719 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 27 17:55:46.963731 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 27 17:55:46.963743 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 27 17:55:46.963760 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 27 17:55:46.963773 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 27 17:55:46.963785 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 27 17:55:46.963797 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 27 17:55:46.963810 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 27 17:55:46.963822 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 27 17:55:46.963834 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 27 17:55:46.963847 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 27 17:55:46.963870 kernel: iommu: Default domain type: Translated
May 27 17:55:46.963888 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 27 17:55:46.963900 kernel: PCI: Using ACPI for IRQ routing
May 27 17:55:46.963913 kernel: PCI: pci_cache_line_size set to 64 bytes
May 27 17:55:46.963925 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
May 27 17:55:46.963937 kernel: e820: reserve RAM buffer [mem 0x7ffdc000-0x7fffffff]
May 27 17:55:46.964090 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 27 17:55:46.964242 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 27 17:55:46.964393 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 27 17:55:46.964417 kernel: vgaarb: loaded
May 27 17:55:46.964430 kernel: clocksource: Switched to clocksource kvm-clock
May 27 17:55:46.964442 kernel: VFS: Disk quotas dquot_6.6.0
May 27 17:55:46.964455 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 27 17:55:46.964467 kernel: pnp: PnP ACPI init
May 27 17:55:46.966672 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
May 27 17:55:46.966694 kernel: pnp: PnP ACPI: found 5 devices
May 27 17:55:46.966707 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 27 17:55:46.966719 kernel: NET: Registered PF_INET protocol family
May 27 17:55:46.966739 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 27 17:55:46.966751 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
May 27 17:55:46.966764 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 27 17:55:46.966782 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 27 17:55:46.966794 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
May 27 17:55:46.966807 kernel: TCP: Hash tables configured (established 16384 bind 16384)
May 27 17:55:46.966819 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 27 17:55:46.966832 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 27 17:55:46.966865 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 27 17:55:46.966880 kernel: NET: Registered PF_XDP protocol family
May 27 17:55:46.967036 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01-02] add_size 1000
May 27 17:55:46.967191 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
May 27 17:55:46.967344 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
May 27 17:55:46.967496 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
May 27 17:55:46.967669 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
May 27 17:55:46.967823 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
May 27 17:55:46.967987 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
May 27 17:55:46.968154 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
May 27 17:55:46.968305 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
May 27 17:55:46.968456 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
May 27 17:55:46.968666 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
May 27 17:55:46.968842 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
May 27 17:55:46.969005 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
May 27 17:55:46.969157 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
May 27 17:55:46.969337 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
May 27 17:55:46.969488 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
May 27 17:55:46.969667 kernel: pci 0000:01:00.0: PCI bridge to [bus 02]
May 27 17:55:46.969852 kernel: pci 0000:01:00.0: bridge window [mem 0xfd800000-0xfd9fffff]
May 27 17:55:46.970018 kernel: pci 0000:00:02.0: PCI bridge to [bus 01-02]
May 27 17:55:46.970171 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
May 27 17:55:46.970323 kernel: pci 0000:00:02.0: bridge window [mem 0xfd800000-0xfdbfffff]
May 27 17:55:46.970475 kernel: pci 0000:00:02.0: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
May 27 17:55:46.972676 kernel: pci 0000:00:02.1: PCI bridge to [bus 03]
May 27 17:55:46.972839 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
May 27 17:55:46.973006 kernel: pci 0000:00:02.1: bridge window [mem 0xfe800000-0xfe9fffff]
May 27 17:55:46.973159 kernel: pci 0000:00:02.1: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
May 27 17:55:46.973321 kernel: pci 0000:00:02.2: PCI bridge to [bus 04]
May 27 17:55:46.973490 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
May 27 17:55:46.973688 kernel: pci 0000:00:02.2: bridge window [mem 0xfe600000-0xfe7fffff]
May 27 17:55:46.973879 kernel: pci 0000:00:02.2: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
May 27 17:55:46.974041 kernel: pci 0000:00:02.3: PCI bridge to [bus 05]
May 27 17:55:46.974194 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
May 27 17:55:46.974355 kernel: pci 0000:00:02.3: bridge window [mem 0xfe400000-0xfe5fffff]
May 27 17:55:46.974508 kernel: pci 0000:00:02.3: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
May 27 17:55:46.974707 kernel: pci 0000:00:02.4: PCI bridge to [bus 06]
May 27 17:55:46.974870 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
May 27 17:55:46.975026 kernel: pci 0000:00:02.4: bridge window [mem 0xfe200000-0xfe3fffff]
May 27 17:55:46.975178 kernel: pci 0000:00:02.4: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
May 27 17:55:46.975342 kernel: pci 0000:00:02.5: PCI bridge to [bus 07]
May 27 17:55:46.975505 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
May 27 17:55:46.975692 kernel: pci 0000:00:02.5: bridge window [mem 0xfe000000-0xfe1fffff]
May 27 17:55:46.975847 kernel: pci 0000:00:02.5: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
May 27 17:55:46.976013 kernel: pci 0000:00:02.6: PCI bridge to [bus 08]
May 27 17:55:46.976165 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
May 27 17:55:46.976347 kernel: pci 0000:00:02.6: bridge window [mem 0xfde00000-0xfdffffff]
May 27 17:55:46.976498 kernel: pci 0000:00:02.6: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
May 27 17:55:46.977725 kernel: pci 0000:00:02.7: PCI bridge to [bus 09]
May 27 17:55:46.977909 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
May 27 17:55:46.978062 kernel: pci 0000:00:02.7: bridge window [mem 0xfdc00000-0xfddfffff]
May 27 17:55:46.978221 kernel: pci 0000:00:02.7: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
May 27 17:55:46.978384 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 27 17:55:46.978544 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 27 17:55:46.978746 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 27 17:55:46.978908 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xafffffff window]
May 27 17:55:46.979047 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
May 27 17:55:46.979184 kernel: pci_bus 0000:00: resource 9 [mem 0x20c0000000-0x28bfffffff window]
May 27 17:55:46.979340 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
May 27 17:55:46.979483 kernel: pci_bus 0000:01: resource 1 [mem 0xfd800000-0xfdbfffff]
May 27 17:55:46.980656 kernel: pci_bus 0000:01: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref]
May 27 17:55:46.980819 kernel: pci_bus 0000:02: resource 1 [mem 0xfd800000-0xfd9fffff]
May 27 17:55:46.980987 kernel: pci_bus 0000:03: resource 0 [io 0x2000-0x2fff]
May 27 17:55:46.981132 kernel: pci_bus 0000:03: resource 1 [mem 0xfe800000-0xfe9fffff]
May 27 17:55:46.981274 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref]
May 27 17:55:46.981461 kernel: pci_bus 0000:04: resource 0 [io 0x3000-0x3fff]
May 27 17:55:46.981649 kernel: pci_bus 0000:04: resource 1 [mem 0xfe600000-0xfe7fffff]
May 27 17:55:46.981795 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref]
May 27 17:55:46.981994 kernel: pci_bus 0000:05: resource 0 [io 0x4000-0x4fff]
May 27 17:55:46.982144 kernel: pci_bus 0000:05: resource 1 [mem 0xfe400000-0xfe5fffff]
May 27 17:55:46.982303 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref]
May 27 17:55:46.982519 kernel: pci_bus 0000:06: resource 0 [io 0x5000-0x5fff]
May 27 17:55:46.984737 kernel: pci_bus 0000:06: resource 1 [mem 0xfe200000-0xfe3fffff]
May 27 17:55:46.984897 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref]
May 27 17:55:46.985051 kernel: pci_bus 0000:07: resource 0 [io 0x6000-0x6fff]
May 27 17:55:46.985212 kernel: pci_bus 0000:07: resource 1 [mem 0xfe000000-0xfe1fffff]
May 27 17:55:46.985355 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref]
May 27 17:55:46.985507 kernel: pci_bus 0000:08: resource 0 [io 0x7000-0x7fff]
May 27 17:55:46.985671 kernel: pci_bus 0000:08: resource 1 [mem 0xfde00000-0xfdffffff]
May 27 17:55:46.985827 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref]
May 27 17:55:46.986033 kernel: pci_bus 0000:09: resource 0 [io 0x8000-0x8fff]
May 27 17:55:46.986179 kernel: pci_bus 0000:09: resource 1 [mem 0xfdc00000-0xfddfffff]
May 27 17:55:46.986330 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref]
May 27 17:55:46.986350 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
May 27 17:55:46.986364 kernel: PCI: CLS 0 bytes, default 64
May 27 17:55:46.986377 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
May 27 17:55:46.986390 kernel: software IO TLB: mapped [mem 0x0000000079800000-0x000000007d800000] (64MB)
May 27 17:55:46.986403 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
May 27 17:55:46.986416 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x285c3ee517e, max_idle_ns: 440795257231 ns
May 27 17:55:46.986429 kernel: Initialise system trusted keyrings
May 27 17:55:46.986448 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
May 27 17:55:46.986464 kernel: Key type asymmetric registered
May 27 17:55:46.986477 kernel: Asymmetric key parser 'x509' registered
May 27 17:55:46.986490 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 27 17:55:46.986503 kernel: io scheduler mq-deadline registered
May 27 17:55:46.986516 kernel: io scheduler kyber registered
May 27 17:55:46.986528 kernel: io scheduler bfq registered
May 27 17:55:46.986697 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24
May 27 17:55:46.986866 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24
May 27 17:55:46.987031 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.987186 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25
May 27 17:55:46.987338 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25
May 27 17:55:46.987489 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.987673 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26
May 27 17:55:46.987827 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26
May 27 17:55:46.987999 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.988153 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27
May 27 17:55:46.988321 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27
May 27 17:55:46.988476 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.988680 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28
May 27 17:55:46.988834 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28
May 27 17:55:46.989008 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.989162 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29
May 27 17:55:46.989314 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29
May 27 17:55:46.989466 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.989648 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30
May 27 17:55:46.989801 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30
May 27 17:55:46.989978 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.990132 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31
May 27 17:55:46.990284 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31
May 27 17:55:46.990438 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
May 27 17:55:46.990457 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 27 17:55:46.990471 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
May 27 17:55:46.990490 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
May 27 17:55:46.990504 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 27 17:55:46.990517 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 27 17:55:46.990530 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 27 17:55:46.990542 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 27 17:55:46.990572 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 27 17:55:46.990727 kernel: rtc_cmos 00:03: RTC can wake from S4
May 27 17:55:46.990748 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 27 17:55:46.990910 kernel: rtc_cmos 00:03: registered as rtc0
May 27 17:55:46.991055 kernel: rtc_cmos 00:03: setting system clock to 2025-05-27T17:55:46 UTC (1748368546)
May 27 17:55:46.991197 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
May 27 17:55:46.991216 kernel: intel_pstate: CPU model not supported
May 27 17:55:46.991229 kernel: NET: Registered PF_INET6 protocol family
May 27 17:55:46.991243 kernel: Segment Routing with IPv6
May 27 17:55:46.991255 kernel: In-situ OAM (IOAM) with IPv6
May 27 17:55:46.991268 kernel: NET: Registered PF_PACKET protocol family
May 27 17:55:46.991281 kernel: Key type dns_resolver registered
May 27 17:55:46.991300 kernel: IPI shorthand broadcast: enabled
May 27 17:55:46.991313 kernel: sched_clock: Marking stable (3342003675, 217397211)->(3678963105, -119562219)
May 27 17:55:46.991334 kernel: registered taskstats version 1
May 27 17:55:46.991347 kernel: Loading compiled-in X.509 certificates
May 27 17:55:46.991360 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 9507e5c390e18536b38d58c90da64baf0ac9837c'
May 27 17:55:46.991372 kernel: Demotion targets for Node 0: null
May 27 17:55:46.991385 kernel: Key type .fscrypt registered
May 27 17:55:46.991397 kernel: Key type fscrypt-provisioning registered
May 27 17:55:46.991410 kernel: ima: No TPM chip found, activating TPM-bypass!
May 27 17:55:46.991428 kernel: ima: Allocated hash algorithm: sha1
May 27 17:55:46.991440 kernel: ima: No architecture policies found
May 27 17:55:46.991453 kernel: clk: Disabling unused clocks
May 27 17:55:46.991466 kernel: Warning: unable to open an initial console.
May 27 17:55:46.991479 kernel: Freeing unused kernel image (initmem) memory: 54416K
May 27 17:55:46.991496 kernel: Write protecting the kernel read-only data: 24576k
May 27 17:55:46.991509 kernel: Freeing unused kernel image (rodata/data gap) memory: 288K
May 27 17:55:46.991522 kernel: Run /init as init process
May 27 17:55:46.991535 kernel: with arguments:
May 27 17:55:46.991576 kernel: /init
May 27 17:55:46.991592 kernel: with environment:
May 27 17:55:46.991605 kernel: HOME=/
May 27 17:55:46.991618 kernel: TERM=linux
May 27 17:55:46.991630 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 27 17:55:46.991652 systemd[1]: Successfully made /usr/ read-only.
May 27 17:55:46.991670 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 17:55:46.991690 systemd[1]: Detected virtualization kvm.
May 27 17:55:46.991704 systemd[1]: Detected architecture x86-64.
May 27 17:55:46.991717 systemd[1]: Running in initrd.
May 27 17:55:46.991730 systemd[1]: No hostname configured, using default hostname.
May 27 17:55:46.991743 systemd[1]: Hostname set to .
May 27 17:55:46.991757 systemd[1]: Initializing machine ID from VM UUID.
May 27 17:55:46.991770 systemd[1]: Queued start job for default target initrd.target.
May 27 17:55:46.991783 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 17:55:46.991797 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 17:55:46.991816 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 27 17:55:46.991829 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 17:55:46.991843 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 27 17:55:46.991867 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 27 17:55:46.991883 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 27 17:55:46.991897 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 27 17:55:46.991916 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 17:55:46.991930 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 17:55:46.991943 systemd[1]: Reached target paths.target - Path Units.
May 27 17:55:46.991957 systemd[1]: Reached target slices.target - Slice Units.
May 27 17:55:46.991970 systemd[1]: Reached target swap.target - Swaps.
May 27 17:55:46.991984 systemd[1]: Reached target timers.target - Timer Units.
May 27 17:55:46.991997 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 27 17:55:46.992011 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 17:55:46.992024 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 27 17:55:46.992042 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 27 17:55:46.992056 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 17:55:46.992070 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 17:55:46.992083 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 17:55:46.992097 systemd[1]: Reached target sockets.target - Socket Units.
May 27 17:55:46.992110 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 27 17:55:46.992124 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 17:55:46.992138 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 27 17:55:46.992156 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 27 17:55:46.992170 systemd[1]: Starting systemd-fsck-usr.service...
May 27 17:55:46.992183 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 17:55:46.992197 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 17:55:46.992211 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 17:55:46.992269 systemd-journald[229]: Collecting audit messages is disabled.
May 27 17:55:46.992307 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 27 17:55:46.992322 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 17:55:46.992335 systemd[1]: Finished systemd-fsck-usr.service.
May 27 17:55:46.992354 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 17:55:46.992368 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 27 17:55:46.992381 kernel: Bridge firewalling registered
May 27 17:55:46.992395 systemd-journald[229]: Journal started
May 27 17:55:46.992424 systemd-journald[229]: Runtime Journal (/run/log/journal/9975fd73c4ee41f0b21811445208194c) is 4.7M, max 38.2M, 33.4M free.
May 27 17:55:46.951918 systemd-modules-load[231]: Inserted module 'overlay'
May 27 17:55:47.044582 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 17:55:46.986041 systemd-modules-load[231]: Inserted module 'br_netfilter'
May 27 17:55:47.047131 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 17:55:47.048676 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:55:47.055769 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 27 17:55:47.059701 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 17:55:47.067739 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 17:55:47.069802 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 17:55:47.074721 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 17:55:47.087983 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 17:55:47.094493 systemd-tmpfiles[250]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
May 27 17:55:47.099228 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 17:55:47.101234 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 17:55:47.102271 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 17:55:47.105337 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 27 17:55:47.108737 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 17:55:47.138708 dracut-cmdline[268]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101
May 27 17:55:47.159580 systemd-resolved[269]: Positive Trust Anchors:
May 27 17:55:47.160490 systemd-resolved[269]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 17:55:47.160532 systemd-resolved[269]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 17:55:47.168301 systemd-resolved[269]: Defaulting to hostname 'linux'.
May 27 17:55:47.171640 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 17:55:47.172632 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 17:55:47.254625 kernel: SCSI subsystem initialized
May 27 17:55:47.265618 kernel: Loading iSCSI transport class v2.0-870.
May 27 17:55:47.277589 kernel: iscsi: registered transport (tcp)
May 27 17:55:47.302692 kernel: iscsi: registered transport (qla4xxx)
May 27 17:55:47.302775 kernel: QLogic iSCSI HBA Driver
May 27 17:55:47.328122 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 17:55:47.362696 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 17:55:47.365512 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 17:55:47.420870 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 27 17:55:47.423528 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 27 17:55:47.486639 kernel: raid6: sse2x4 gen() 14708 MB/s
May 27 17:55:47.503627 kernel: raid6: sse2x2 gen() 9899 MB/s
May 27 17:55:47.522209 kernel: raid6: sse2x1 gen() 9682 MB/s
May 27 17:55:47.522304 kernel: raid6: using algorithm sse2x4 gen() 14708 MB/s
May 27 17:55:47.541086 kernel: raid6: .... xor() 8263 MB/s, rmw enabled
May 27 17:55:47.541157 kernel: raid6: using ssse3x2 recovery algorithm
May 27 17:55:47.565628 kernel: xor: automatically using best checksumming function avx
May 27 17:55:47.745600 kernel: Btrfs loaded, zoned=no, fsverity=no
May 27 17:55:47.754493 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 27 17:55:47.757664 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 17:55:47.789320 systemd-udevd[478]: Using default interface naming scheme 'v255'.
May 27 17:55:47.799766 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 17:55:47.804717 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 27 17:55:47.829295 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation
May 27 17:55:47.861887 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 17:55:47.864865 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 17:55:47.975995 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 17:55:47.979476 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 27 17:55:48.084622 kernel: virtio_blk virtio1: 2/0/0 default/read/poll queues
May 27 17:55:48.100670 kernel: virtio_blk virtio1: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB)
May 27 17:55:48.116873 kernel: cryptd: max_cpu_qlen set to 1000
May 27 17:55:48.125945 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 27 17:55:48.125987 kernel: GPT:17805311 != 125829119
May 27 17:55:48.126005 kernel: GPT:Alternate GPT header not at the end of the disk.
May 27 17:55:48.127866 kernel: GPT:17805311 != 125829119
May 27 17:55:48.127898 kernel: GPT: Use GNU Parted to correct GPT errors.
May 27 17:55:48.130383 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 27 17:55:48.150067 kernel: AES CTR mode by8 optimization enabled
May 27 17:55:48.153864 kernel: ACPI: bus type USB registered
May 27 17:55:48.153905 kernel: usbcore: registered new interface driver usbfs
May 27 17:55:48.155581 kernel: usbcore: registered new interface driver hub
May 27 17:55:48.156968 kernel: usbcore: registered new device driver usb
May 27 17:55:48.182388 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 17:55:48.184717 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:55:48.187132 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 27 17:55:48.193732 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
May 27 17:55:48.192173 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 17:55:48.193256 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 17:55:48.229933 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
May 27 17:55:48.235600 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 1
May 27 17:55:48.239589 kernel: xhci_hcd 0000:03:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
May 27 17:55:48.250578 kernel: libata version 3.00 loaded.
May 27 17:55:48.260344 kernel: xhci_hcd 0000:03:00.0: xHCI Host Controller
May 27 17:55:48.269808 kernel: xhci_hcd 0000:03:00.0: new USB bus registered, assigned bus number 2
May 27 17:55:48.270023 kernel: xhci_hcd 0000:03:00.0: Host supports USB 3.0 SuperSpeed
May 27 17:55:48.271585 kernel: hub 1-0:1.0: USB hub found
May 27 17:55:48.271844 kernel: hub 1-0:1.0: 4 ports detected
May 27 17:55:48.274591 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
May 27 17:55:48.276605 kernel: hub 2-0:1.0: USB hub found
May 27 17:55:48.276823 kernel: hub 2-0:1.0: 4 ports detected
May 27 17:55:48.313577 kernel: ahci 0000:00:1f.2: version 3.0
May 27 17:55:48.313851 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
May 27 17:55:48.321012 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
May 27 17:55:48.351940 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
May 27 17:55:48.353071 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
May 27 17:55:48.354171 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
May 27 17:55:48.354363 kernel: scsi host0: ahci
May 27 17:55:48.354678 kernel: scsi host1: ahci
May 27 17:55:48.354898 kernel: scsi host2: ahci
May 27 17:55:48.355083 kernel: scsi host3: ahci
May 27 17:55:48.355269 kernel: scsi host4: ahci
May 27 17:55:48.355450 kernel: scsi host5: ahci
May 27 17:55:48.355706 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b100 irq 41 lpm-pol 0
May 27 17:55:48.355727 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b180 irq 41 lpm-pol 0
May 27 17:55:48.355745 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b200 irq 41 lpm-pol 0
May 27 17:55:48.355762 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b280 irq 41 lpm-pol 0
May 27 17:55:48.355778 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b300 irq 41 lpm-pol 0
May 27 17:55:48.355795 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea5b000 port 0xfea5b380 irq 41 lpm-pol 0
May 27 17:55:48.351789 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 17:55:48.381517 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
May 27 17:55:48.391313 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
May 27 17:55:48.392150 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
May 27 17:55:48.405087 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 27 17:55:48.407727 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 27 17:55:48.442613 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 27 17:55:48.443719 disk-uuid[633]: Primary Header is updated.
May 27 17:55:48.443719 disk-uuid[633]: Secondary Entries is updated.
May 27 17:55:48.443719 disk-uuid[633]: Secondary Header is updated.
May 27 17:55:48.510579 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
May 27 17:55:48.654589 kernel: hid: raw HID events driver (C) Jiri Kosina
May 27 17:55:48.659584 kernel: ata3: SATA link down (SStatus 0 SControl 300)
May 27 17:55:48.659679 kernel: ata2: SATA link down (SStatus 0 SControl 300)
May 27 17:55:48.661693 kernel: ata1: SATA link down (SStatus 0 SControl 300)
May 27 17:55:48.671037 kernel: ata6: SATA link down (SStatus 0 SControl 300)
May 27 17:55:48.671117 kernel: ata4: SATA link down (SStatus 0 SControl 300)
May 27 17:55:48.672700 kernel: ata5: SATA link down (SStatus 0 SControl 300)
May 27 17:55:48.683833 kernel: usbcore: registered new interface driver usbhid
May 27 17:55:48.683885 kernel: usbhid: USB HID core driver
May 27 17:55:48.691580 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:03:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4
May 27 17:55:48.694574 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:03:00.0-1/input0
May 27 17:55:48.710849 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 27 17:55:48.712438 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 17:55:48.713231 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 17:55:48.714867 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 17:55:48.717437 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 27 17:55:48.745470 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 27 17:55:49.458340 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 27 17:55:49.459665 disk-uuid[634]: The operation has completed successfully.
May 27 17:55:49.518091 systemd[1]: disk-uuid.service: Deactivated successfully.
May 27 17:55:49.518254 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 27 17:55:49.559973 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 27 17:55:49.586954 sh[660]: Success
May 27 17:55:49.609982 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 27 17:55:49.610061 kernel: device-mapper: uevent: version 1.0.3
May 27 17:55:49.611641 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
May 27 17:55:49.624602 kernel: device-mapper: verity: sha256 using shash "sha256-avx"
May 27 17:55:49.692384 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 27 17:55:49.694016 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 27 17:55:49.709759 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 27 17:55:49.723585 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
May 27 17:55:49.728475 kernel: BTRFS: device fsid 7caef027-0915-4c01-a3d5-28eff70f7ebd devid 1 transid 39 /dev/mapper/usr (253:0) scanned by mount (672)
May 27 17:55:49.728515 kernel: BTRFS info (device dm-0): first mount of filesystem 7caef027-0915-4c01-a3d5-28eff70f7ebd
May 27 17:55:49.731634 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 27 17:55:49.731672 kernel: BTRFS info (device dm-0): using free-space-tree
May 27 17:55:49.741965 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 27 17:55:49.743279 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
May 27 17:55:49.744076 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 27 17:55:49.745061 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 27 17:55:49.749679 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 27 17:55:49.777386 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (704)
May 27 17:55:49.777452 kernel: BTRFS info (device vda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212
May 27 17:55:49.781017 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 27 17:55:49.781063 kernel: BTRFS info (device vda6): using free-space-tree
May 27 17:55:49.796582 kernel: BTRFS info (device vda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212
May 27 17:55:49.797032 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 27 17:55:49.800769 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 27 17:55:49.886262 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 17:55:49.898148 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 17:55:49.955728 systemd-networkd[841]: lo: Link UP
May 27 17:55:49.955741 systemd-networkd[841]: lo: Gained carrier
May 27 17:55:49.957728 systemd-networkd[841]: Enumeration completed
May 27 17:55:49.957878 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 17:55:49.958718 systemd[1]: Reached target network.target - Network.
May 27 17:55:49.959815 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 27 17:55:49.959822 systemd-networkd[841]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 27 17:55:49.961156 systemd-networkd[841]: eth0: Link UP May 27 17:55:49.961162 systemd-networkd[841]: eth0: Gained carrier May 27 17:55:49.961174 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:55:50.003661 systemd-networkd[841]: eth0: DHCPv4 address 10.230.41.6/30, gateway 10.230.41.5 acquired from 10.230.41.5 May 27 17:55:50.007534 ignition[752]: Ignition 2.21.0 May 27 17:55:50.007569 ignition[752]: Stage: fetch-offline May 27 17:55:50.007656 ignition[752]: no configs at "/usr/lib/ignition/base.d" May 27 17:55:50.010290 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:55:50.007683 ignition[752]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 27 17:55:50.007892 ignition[752]: parsed url from cmdline: "" May 27 17:55:50.007899 ignition[752]: no config URL provided May 27 17:55:50.007909 ignition[752]: reading system config file "/usr/lib/ignition/user.ign" May 27 17:55:50.007924 ignition[752]: no config at "/usr/lib/ignition/user.ign" May 27 17:55:50.014710 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 27 17:55:50.007937 ignition[752]: failed to fetch config: resource requires networking May 27 17:55:50.008186 ignition[752]: Ignition finished successfully May 27 17:55:50.046921 ignition[851]: Ignition 2.21.0 May 27 17:55:50.046941 ignition[851]: Stage: fetch May 27 17:55:50.047126 ignition[851]: no configs at "/usr/lib/ignition/base.d" May 27 17:55:50.047144 ignition[851]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 27 17:55:50.048223 ignition[851]: parsed url from cmdline: "" May 27 17:55:50.048229 ignition[851]: no config URL provided May 27 17:55:50.048239 ignition[851]: reading system config file "/usr/lib/ignition/user.ign" May 27 17:55:50.048253 ignition[851]: no config at "/usr/lib/ignition/user.ign" May 27 17:55:50.048427 ignition[851]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... May 27 17:55:50.048451 ignition[851]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 May 27 17:55:50.048462 ignition[851]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... May 27 17:55:50.063441 ignition[851]: GET result: OK May 27 17:55:50.063676 ignition[851]: parsing config with SHA512: 8e96424ed33445e4869184160255d48bf4c8d9ec1d97d8137b53b1591d056aa8e84582d2cbe4ca8c05a2aefdfed92276be3256840e601a37c801c1ec78b1c5b3 May 27 17:55:50.068639 unknown[851]: fetched base config from "system" May 27 17:55:50.068655 unknown[851]: fetched base config from "system" May 27 17:55:50.069038 ignition[851]: fetch: fetch complete May 27 17:55:50.068664 unknown[851]: fetched user config from "openstack" May 27 17:55:50.069046 ignition[851]: fetch: fetch passed May 27 17:55:50.071701 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 27 17:55:50.069104 ignition[851]: Ignition finished successfully May 27 17:55:50.074748 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
May 27 17:55:50.109290 ignition[858]: Ignition 2.21.0 May 27 17:55:50.110281 ignition[858]: Stage: kargs May 27 17:55:50.110480 ignition[858]: no configs at "/usr/lib/ignition/base.d" May 27 17:55:50.110498 ignition[858]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 27 17:55:50.112224 ignition[858]: kargs: kargs passed May 27 17:55:50.112301 ignition[858]: Ignition finished successfully May 27 17:55:50.114963 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 17:55:50.117624 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 17:55:50.144957 ignition[864]: Ignition 2.21.0 May 27 17:55:50.144981 ignition[864]: Stage: disks May 27 17:55:50.145153 ignition[864]: no configs at "/usr/lib/ignition/base.d" May 27 17:55:50.145171 ignition[864]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 27 17:55:50.146796 ignition[864]: disks: disks passed May 27 17:55:50.148942 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 17:55:50.146871 ignition[864]: Ignition finished successfully May 27 17:55:50.150267 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 17:55:50.151638 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 17:55:50.152936 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 17:55:50.154314 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:55:50.155956 systemd[1]: Reached target basic.target - Basic System. May 27 17:55:50.158478 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 17:55:50.203872 systemd-fsck[872]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks May 27 17:55:50.206806 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 17:55:50.209671 systemd[1]: Mounting sysroot.mount - /sysroot... 
May 27 17:55:50.333598 kernel: EXT4-fs (vda9): mounted filesystem bf93e767-f532-4480-b210-a196f7ac181e r/w with ordered data mode. Quota mode: none. May 27 17:55:50.335344 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 17:55:50.337223 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 17:55:50.340320 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:55:50.343642 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 17:55:50.345371 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 27 17:55:50.354795 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... May 27 17:55:50.357654 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 17:55:50.358688 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:55:50.362494 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 17:55:50.366708 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 27 17:55:50.370582 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (880) May 27 17:55:50.374579 kernel: BTRFS info (device vda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:55:50.378176 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 27 17:55:50.378215 kernel: BTRFS info (device vda6): using free-space-tree May 27 17:55:50.399422 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 17:55:50.454583 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:55:50.462787 initrd-setup-root[908]: cut: /sysroot/etc/passwd: No such file or directory May 27 17:55:50.470495 initrd-setup-root[915]: cut: /sysroot/etc/group: No such file or directory May 27 17:55:50.479055 initrd-setup-root[922]: cut: /sysroot/etc/shadow: No such file or directory May 27 17:55:50.484590 initrd-setup-root[929]: cut: /sysroot/etc/gshadow: No such file or directory May 27 17:55:50.586990 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 17:55:50.589501 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 17:55:50.590989 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 17:55:50.611592 kernel: BTRFS info (device vda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:55:50.631326 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 17:55:50.644328 ignition[998]: INFO : Ignition 2.21.0 May 27 17:55:50.644328 ignition[998]: INFO : Stage: mount May 27 17:55:50.645992 ignition[998]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:55:50.645992 ignition[998]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 27 17:55:50.645992 ignition[998]: INFO : mount: mount passed May 27 17:55:50.645992 ignition[998]: INFO : Ignition finished successfully May 27 17:55:50.646620 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 17:55:50.723330 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 17:55:51.486592 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:55:51.899836 systemd-networkd[841]: eth0: Gained IPv6LL May 27 17:55:53.083138 systemd-networkd[841]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8a41:24:19ff:fee6:2906/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8a41:24:19ff:fee6:2906/64 assigned by NDisc. 
May 27 17:55:53.083153 systemd-networkd[841]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. May 27 17:55:53.494596 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:55:57.505630 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:55:57.516011 coreos-metadata[882]: May 27 17:55:57.515 WARN failed to locate config-drive, using the metadata service API instead May 27 17:55:57.539740 coreos-metadata[882]: May 27 17:55:57.539 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 27 17:55:57.560157 coreos-metadata[882]: May 27 17:55:57.560 INFO Fetch successful May 27 17:55:57.561096 coreos-metadata[882]: May 27 17:55:57.561 INFO wrote hostname srv-kh28t.gb1.brightbox.com to /sysroot/etc/hostname May 27 17:55:57.563183 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. May 27 17:55:57.563373 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. May 27 17:55:57.567485 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 17:55:57.604740 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:55:57.635598 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (1014) May 27 17:55:57.642756 kernel: BTRFS info (device vda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:55:57.642795 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 27 17:55:57.642818 kernel: BTRFS info (device vda6): using free-space-tree May 27 17:55:57.648841 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 17:55:57.682724 ignition[1032]: INFO : Ignition 2.21.0 May 27 17:55:57.682724 ignition[1032]: INFO : Stage: files May 27 17:55:57.684358 ignition[1032]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:55:57.684358 ignition[1032]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 27 17:55:57.684358 ignition[1032]: DEBUG : files: compiled without relabeling support, skipping May 27 17:55:57.686892 ignition[1032]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 17:55:57.686892 ignition[1032]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 17:55:57.688735 ignition[1032]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 17:55:57.688735 ignition[1032]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 17:55:57.688735 ignition[1032]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 17:55:57.688709 unknown[1032]: wrote ssh authorized keys file for user: core May 27 17:55:57.697761 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 27 17:55:57.697761 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 27 17:55:58.245196 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 17:55:58.767460 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 27 17:55:58.767460 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 17:55:58.771080 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file 
"/sysroot/home/core/install.sh" May 27 17:55:58.771080 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 17:55:58.771080 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 17:55:58.771080 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:55:58.771080 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:55:58.771080 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:55:58.771080 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:55:58.778419 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:55:58.778419 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:55:58.778419 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" May 27 17:55:58.778419 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" May 27 17:55:58.778419 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" May 27 17:55:58.778419 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): 
GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-x86-64.raw: attempt #1 May 27 17:55:59.540129 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 17:56:01.752769 ignition[1032]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-x86-64.raw" May 27 17:56:01.752769 ignition[1032]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 17:56:01.757571 ignition[1032]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:56:01.759214 ignition[1032]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:56:01.759214 ignition[1032]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 17:56:01.759214 ignition[1032]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 17:56:01.765471 ignition[1032]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 17:56:01.765471 ignition[1032]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 17:56:01.765471 ignition[1032]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 17:56:01.765471 ignition[1032]: INFO : files: files passed May 27 17:56:01.765471 ignition[1032]: INFO : Ignition finished successfully May 27 17:56:01.762242 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 17:56:01.766808 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 17:56:01.771438 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
May 27 17:56:01.784462 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 17:56:01.784736 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 27 17:56:01.793527 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:56:01.793527 initrd-setup-root-after-ignition[1062]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 27 17:56:01.797589 initrd-setup-root-after-ignition[1066]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:56:01.798969 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:56:01.800532 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 27 17:56:01.802344 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 27 17:56:01.857914 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 27 17:56:01.858115 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 27 17:56:01.859812 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 27 17:56:01.861067 systemd[1]: Reached target initrd.target - Initrd Default Target. May 27 17:56:01.862567 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 27 17:56:01.863714 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 27 17:56:01.890754 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:56:01.893655 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 27 17:56:01.921276 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 27 17:56:01.923083 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. 
May 27 17:56:01.923945 systemd[1]: Stopped target timers.target - Timer Units. May 27 17:56:01.925373 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 27 17:56:01.925644 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:56:01.927257 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 27 17:56:01.928139 systemd[1]: Stopped target basic.target - Basic System. May 27 17:56:01.929664 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 27 17:56:01.931004 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:56:01.932330 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 27 17:56:01.933801 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 27 17:56:01.935446 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 27 17:56:01.936983 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 27 17:56:01.938537 systemd[1]: Stopped target sysinit.target - System Initialization. May 27 17:56:01.939863 systemd[1]: Stopped target local-fs.target - Local File Systems. May 27 17:56:01.941409 systemd[1]: Stopped target swap.target - Swaps. May 27 17:56:01.942664 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 27 17:56:01.942844 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 27 17:56:01.944446 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 27 17:56:01.945412 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:56:01.946732 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 27 17:56:01.946909 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:56:01.948423 systemd[1]: dracut-initqueue.service: Deactivated successfully. 
May 27 17:56:01.948728 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 27 17:56:01.950502 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 27 17:56:01.950730 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:56:01.952497 systemd[1]: ignition-files.service: Deactivated successfully. May 27 17:56:01.952850 systemd[1]: Stopped ignition-files.service - Ignition (files). May 27 17:56:01.956679 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 27 17:56:01.967858 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 27 17:56:01.968524 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 27 17:56:01.968786 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:56:01.970800 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 27 17:56:01.970961 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 27 17:56:01.979950 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 27 17:56:01.983128 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 27 17:56:01.993498 ignition[1086]: INFO : Ignition 2.21.0 May 27 17:56:01.993498 ignition[1086]: INFO : Stage: umount May 27 17:56:01.996348 ignition[1086]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:56:01.996348 ignition[1086]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 27 17:56:01.996348 ignition[1086]: INFO : umount: umount passed May 27 17:56:01.996348 ignition[1086]: INFO : Ignition finished successfully May 27 17:56:01.995899 systemd[1]: ignition-mount.service: Deactivated successfully. May 27 17:56:01.996060 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 27 17:56:01.997491 systemd[1]: ignition-disks.service: Deactivated successfully. 
May 27 17:56:01.997815 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 27 17:56:01.998518 systemd[1]: ignition-kargs.service: Deactivated successfully. May 27 17:56:01.998619 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 27 17:56:01.999269 systemd[1]: ignition-fetch.service: Deactivated successfully. May 27 17:56:01.999329 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 27 17:56:02.000028 systemd[1]: Stopped target network.target - Network. May 27 17:56:02.001372 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 27 17:56:02.001441 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:56:02.002763 systemd[1]: Stopped target paths.target - Path Units. May 27 17:56:02.004424 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 27 17:56:02.008695 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:56:02.010064 systemd[1]: Stopped target slices.target - Slice Units. May 27 17:56:02.012907 systemd[1]: Stopped target sockets.target - Socket Units. May 27 17:56:02.015110 systemd[1]: iscsid.socket: Deactivated successfully. May 27 17:56:02.015189 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 27 17:56:02.016416 systemd[1]: iscsiuio.socket: Deactivated successfully. May 27 17:56:02.016475 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 17:56:02.017851 systemd[1]: ignition-setup.service: Deactivated successfully. May 27 17:56:02.017934 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 27 17:56:02.019122 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 27 17:56:02.019186 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 27 17:56:02.020809 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
May 27 17:56:02.022500 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 27 17:56:02.025525 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 27 17:56:02.027406 systemd-networkd[841]: eth0: DHCPv6 lease lost May 27 17:56:02.029043 systemd[1]: systemd-resolved.service: Deactivated successfully. May 27 17:56:02.029238 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 27 17:56:02.036044 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 27 17:56:02.036459 systemd[1]: systemd-networkd.service: Deactivated successfully. May 27 17:56:02.036668 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 27 17:56:02.038721 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 27 17:56:02.039049 systemd[1]: sysroot-boot.service: Deactivated successfully. May 27 17:56:02.039188 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 27 17:56:02.042465 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 27 17:56:02.044831 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 27 17:56:02.044916 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 27 17:56:02.045635 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 27 17:56:02.045721 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 27 17:56:02.047380 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 27 17:56:02.049003 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 27 17:56:02.049078 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 17:56:02.052682 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 27 17:56:02.052775 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. 
May 27 17:56:02.053532 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 27 17:56:02.053608 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 27 17:56:02.054252 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 27 17:56:02.054312 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:56:02.056063 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:56:02.060117 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 27 17:56:02.060220 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 27 17:56:02.076535 systemd[1]: systemd-udevd.service: Deactivated successfully. May 27 17:56:02.077290 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:56:02.078799 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 27 17:56:02.078872 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 27 17:56:02.079883 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 27 17:56:02.079935 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:56:02.082989 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 27 17:56:02.083075 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 27 17:56:02.085118 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 27 17:56:02.085187 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 27 17:56:02.086587 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 27 17:56:02.086703 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 17:56:02.089007 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... 
May 27 17:56:02.090876 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 27 17:56:02.090947 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:56:02.093545 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 27 17:56:02.093637 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:56:02.097074 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 27 17:56:02.097143 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:56:02.099181 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 27 17:56:02.099247 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:56:02.100205 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:56:02.100280 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:56:02.103876 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. May 27 17:56:02.103953 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. May 27 17:56:02.104014 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 27 17:56:02.104079 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 17:56:02.104684 systemd[1]: network-cleanup.service: Deactivated successfully. May 27 17:56:02.106668 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 27 17:56:02.115985 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 27 17:56:02.116134 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
May 27 17:56:02.117832 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 27 17:56:02.119978 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 27 17:56:02.143973 systemd[1]: Switching root. May 27 17:56:02.180281 systemd-journald[229]: Journal stopped May 27 17:56:03.735717 systemd-journald[229]: Received SIGTERM from PID 1 (systemd). May 27 17:56:03.735829 kernel: SELinux: policy capability network_peer_controls=1 May 27 17:56:03.735868 kernel: SELinux: policy capability open_perms=1 May 27 17:56:03.735887 kernel: SELinux: policy capability extended_socket_class=1 May 27 17:56:03.735918 kernel: SELinux: policy capability always_check_network=0 May 27 17:56:03.735937 kernel: SELinux: policy capability cgroup_seclabel=1 May 27 17:56:03.735956 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 27 17:56:03.735974 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 27 17:56:03.736001 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 27 17:56:03.736019 kernel: SELinux: policy capability userspace_initial_context=0 May 27 17:56:03.736038 systemd[1]: Successfully loaded SELinux policy in 46.540ms. May 27 17:56:03.736076 kernel: audit: type=1403 audit(1748368562.554:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 27 17:56:03.736097 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 20.805ms. May 27 17:56:03.736129 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 17:56:03.736156 systemd[1]: Detected virtualization kvm. May 27 17:56:03.736176 systemd[1]: Detected architecture x86-64. May 27 17:56:03.736194 systemd[1]: Detected first boot. 
May 27 17:56:03.736221 systemd[1]: Hostname set to .
May 27 17:56:03.736246 systemd[1]: Initializing machine ID from VM UUID.
May 27 17:56:03.736265 zram_generator::config[1129]: No configuration found.
May 27 17:56:03.736291 kernel: Guest personality initialized and is inactive
May 27 17:56:03.736321 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 27 17:56:03.736341 kernel: Initialized host personality
May 27 17:56:03.736358 kernel: NET: Registered PF_VSOCK protocol family
May 27 17:56:03.736376 systemd[1]: Populated /etc with preset unit settings.
May 27 17:56:03.736407 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 27 17:56:03.736430 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 27 17:56:03.736448 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 27 17:56:03.736479 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 27 17:56:03.736501 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 27 17:56:03.736536 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 27 17:56:03.737593 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 27 17:56:03.737623 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 27 17:56:03.737644 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 27 17:56:03.737664 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 27 17:56:03.737683 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 27 17:56:03.737703 systemd[1]: Created slice user.slice - User and Session Slice.
May 27 17:56:03.737761 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 17:56:03.737793 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 17:56:03.737814 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 27 17:56:03.737837 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 27 17:56:03.737867 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 27 17:56:03.737888 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 17:56:03.737913 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 27 17:56:03.737932 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 17:56:03.737957 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 17:56:03.737977 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 27 17:56:03.737997 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 27 17:56:03.738030 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 27 17:56:03.738060 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 27 17:56:03.738091 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 17:56:03.738113 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 17:56:03.738133 systemd[1]: Reached target slices.target - Slice Units.
May 27 17:56:03.738152 systemd[1]: Reached target swap.target - Swaps.
May 27 17:56:03.738171 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 27 17:56:03.738195 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 27 17:56:03.738223 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 27 17:56:03.738243 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 17:56:03.738262 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 17:56:03.738286 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 17:56:03.738316 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 27 17:56:03.738345 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 27 17:56:03.738364 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 27 17:56:03.738383 systemd[1]: Mounting media.mount - External Media Directory...
May 27 17:56:03.738403 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:03.738422 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 27 17:56:03.738441 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 27 17:56:03.738477 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 27 17:56:03.738511 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 27 17:56:03.738532 systemd[1]: Reached target machines.target - Containers.
May 27 17:56:03.738566 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 27 17:56:03.738589 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 17:56:03.738609 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 17:56:03.738629 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 27 17:56:03.738649 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 17:56:03.738668 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 17:56:03.738687 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 17:56:03.738722 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 27 17:56:03.738743 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 17:56:03.738763 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 27 17:56:03.738783 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 27 17:56:03.738802 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 27 17:56:03.738821 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 27 17:56:03.738841 systemd[1]: Stopped systemd-fsck-usr.service.
May 27 17:56:03.738861 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 17:56:03.738891 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 17:56:03.738921 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 17:56:03.738943 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 17:56:03.738963 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 27 17:56:03.738982 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 27 17:56:03.739011 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 17:56:03.739033 systemd[1]: verity-setup.service: Deactivated successfully.
May 27 17:56:03.739052 systemd[1]: Stopped verity-setup.service.
May 27 17:56:03.739081 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:03.739101 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 27 17:56:03.739132 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 27 17:56:03.739162 systemd[1]: Mounted media.mount - External Media Directory.
May 27 17:56:03.739183 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 27 17:56:03.739211 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 27 17:56:03.739230 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 27 17:56:03.739254 kernel: loop: module loaded
May 27 17:56:03.739282 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 27 17:56:03.739301 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 17:56:03.739337 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 27 17:56:03.739357 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 27 17:56:03.739376 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 17:56:03.739398 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 17:56:03.739478 systemd-journald[1223]: Collecting audit messages is disabled.
May 27 17:56:03.739519 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 17:56:03.739540 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 17:56:03.741878 kernel: fuse: init (API version 7.41)
May 27 17:56:03.741907 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 17:56:03.741928 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 17:56:03.741955 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 27 17:56:03.741975 systemd-journald[1223]: Journal started
May 27 17:56:03.742019 systemd-journald[1223]: Runtime Journal (/run/log/journal/9975fd73c4ee41f0b21811445208194c) is 4.7M, max 38.2M, 33.4M free.
May 27 17:56:03.357216 systemd[1]: Queued start job for default target multi-user.target.
May 27 17:56:03.383036 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 27 17:56:03.383822 systemd[1]: systemd-journald.service: Deactivated successfully.
May 27 17:56:03.745649 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 27 17:56:03.750629 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 17:56:03.754475 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 17:56:03.755587 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 17:56:03.756672 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 27 17:56:03.773473 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 17:56:03.779658 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 27 17:56:03.784652 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 27 17:56:03.785397 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 27 17:56:03.785452 systemd[1]: Reached target local-fs.target - Local File Systems.
May 27 17:56:03.790738 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 27 17:56:03.804830 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 27 17:56:03.807806 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 17:56:03.811861 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 27 17:56:03.814588 kernel: ACPI: bus type drm_connector registered
May 27 17:56:03.816860 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 27 17:56:03.817684 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 17:56:03.820912 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 27 17:56:03.821665 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 17:56:03.825871 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 17:56:03.835177 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 27 17:56:03.839774 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 17:56:03.842987 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 17:56:03.843279 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 17:56:03.844631 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 27 17:56:03.848077 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 27 17:56:03.848939 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 27 17:56:03.878411 systemd-journald[1223]: Time spent on flushing to /var/log/journal/9975fd73c4ee41f0b21811445208194c is 119.179ms for 1163 entries.
May 27 17:56:03.878411 systemd-journald[1223]: System Journal (/var/log/journal/9975fd73c4ee41f0b21811445208194c) is 8M, max 584.8M, 576.8M free.
May 27 17:56:04.013131 systemd-journald[1223]: Received client request to flush runtime journal.
May 27 17:56:04.013706 kernel: loop0: detected capacity change from 0 to 146240
May 27 17:56:04.013761 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 27 17:56:03.919136 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 27 17:56:03.920926 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 27 17:56:03.933032 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 27 17:56:03.986587 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 17:56:04.022122 kernel: loop1: detected capacity change from 0 to 221472
May 27 17:56:04.022088 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 27 17:56:04.025510 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
May 27 17:56:04.025544 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
May 27 17:56:04.030022 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 27 17:56:04.043031 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 17:56:04.047853 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 27 17:56:04.080640 kernel: loop2: detected capacity change from 0 to 113872
May 27 17:56:04.116023 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 17:56:04.136090 kernel: loop3: detected capacity change from 0 to 8
May 27 17:56:04.154742 kernel: loop4: detected capacity change from 0 to 146240
May 27 17:56:04.186955 kernel: loop5: detected capacity change from 0 to 221472
May 27 17:56:04.197388 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 27 17:56:04.200354 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 17:56:04.211593 kernel: loop6: detected capacity change from 0 to 113872
May 27 17:56:04.233592 kernel: loop7: detected capacity change from 0 to 8
May 27 17:56:04.239576 (sd-merge)[1290]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
May 27 17:56:04.240353 (sd-merge)[1290]: Merged extensions into '/usr'.
May 27 17:56:04.258108 systemd[1]: Reload requested from client PID 1265 ('systemd-sysext') (unit systemd-sysext.service)...
May 27 17:56:04.258627 systemd[1]: Reloading...
May 27 17:56:04.263205 systemd-tmpfiles[1292]: ACLs are not supported, ignoring.
May 27 17:56:04.263234 systemd-tmpfiles[1292]: ACLs are not supported, ignoring.
May 27 17:56:04.427247 zram_generator::config[1320]: No configuration found.
May 27 17:56:04.654487 ldconfig[1260]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 27 17:56:04.700479 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 17:56:04.815878 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 27 17:56:04.817096 systemd[1]: Reloading finished in 557 ms.
May 27 17:56:04.839211 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 27 17:56:04.841851 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 27 17:56:04.843328 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 17:56:04.858745 systemd[1]: Starting ensure-sysext.service...
May 27 17:56:04.863766 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 17:56:04.902711 systemd[1]: Reload requested from client PID 1377 ('systemctl') (unit ensure-sysext.service)...
May 27 17:56:04.902735 systemd[1]: Reloading...
May 27 17:56:04.936417 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 27 17:56:04.937007 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 27 17:56:04.937474 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 27 17:56:04.937892 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 27 17:56:04.940161 systemd-tmpfiles[1378]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 27 17:56:04.943121 systemd-tmpfiles[1378]: ACLs are not supported, ignoring.
May 27 17:56:04.943336 systemd-tmpfiles[1378]: ACLs are not supported, ignoring.
May 27 17:56:04.958792 systemd-tmpfiles[1378]: Detected autofs mount point /boot during canonicalization of boot.
May 27 17:56:04.959009 systemd-tmpfiles[1378]: Skipping /boot
May 27 17:56:04.990295 systemd-tmpfiles[1378]: Detected autofs mount point /boot during canonicalization of boot.
May 27 17:56:04.990847 systemd-tmpfiles[1378]: Skipping /boot
May 27 17:56:05.030597 zram_generator::config[1408]: No configuration found.
May 27 17:56:05.174612 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 17:56:05.290369 systemd[1]: Reloading finished in 387 ms.
May 27 17:56:05.313692 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 27 17:56:05.328613 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 17:56:05.340835 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 17:56:05.343812 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 27 17:56:05.350397 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 27 17:56:05.355932 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 17:56:05.362231 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 17:56:05.366963 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 27 17:56:05.374147 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:05.374418 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 17:56:05.377341 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 17:56:05.381263 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 17:56:05.385044 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 17:56:05.386251 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 17:56:05.386423 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 17:56:05.386627 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:05.394942 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 27 17:56:05.397437 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:05.398810 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 17:56:05.399068 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 17:56:05.399202 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 17:56:05.399331 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:05.405383 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:05.405761 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 17:56:05.415908 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 17:56:05.417094 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 17:56:05.417238 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 17:56:05.417467 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 17:56:05.424371 systemd[1]: Finished ensure-sysext.service.
May 27 17:56:05.451700 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 27 17:56:05.458339 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 27 17:56:05.460105 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 17:56:05.461003 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 17:56:05.462822 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 17:56:05.463607 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 17:56:05.479773 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 27 17:56:05.486813 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 27 17:56:05.493471 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 17:56:05.494666 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 17:56:05.496505 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 17:56:05.501132 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 17:56:05.501459 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 17:56:05.503980 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 17:56:05.524847 systemd-udevd[1467]: Using default interface naming scheme 'v255'.
May 27 17:56:05.530660 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 27 17:56:05.531643 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 27 17:56:05.545690 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 27 17:56:05.553071 augenrules[1504]: No rules
May 27 17:56:05.555236 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 17:56:05.556650 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 17:56:05.563838 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 27 17:56:05.580173 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 17:56:05.585853 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 17:56:05.723507 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 27 17:56:05.724483 systemd[1]: Reached target time-set.target - System Time Set.
May 27 17:56:05.735047 systemd-networkd[1519]: lo: Link UP
May 27 17:56:05.735059 systemd-networkd[1519]: lo: Gained carrier
May 27 17:56:05.736412 systemd-networkd[1519]: Enumeration completed
May 27 17:56:05.736526 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 17:56:05.747935 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 27 17:56:05.751807 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 27 17:56:05.785826 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 27 17:56:05.798223 systemd-resolved[1466]: Positive Trust Anchors:
May 27 17:56:05.798257 systemd-resolved[1466]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 17:56:05.798299 systemd-resolved[1466]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 17:56:05.808536 systemd-resolved[1466]: Using system hostname 'srv-kh28t.gb1.brightbox.com'.
May 27 17:56:05.813764 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 17:56:05.814649 systemd[1]: Reached target network.target - Network.
May 27 17:56:05.815261 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 17:56:05.816366 systemd[1]: Reached target sysinit.target - System Initialization.
May 27 17:56:05.817411 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 27 17:56:05.818201 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 27 17:56:05.819207 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
May 27 17:56:05.820222 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 27 17:56:05.821139 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 27 17:56:05.822134 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 27 17:56:05.822883 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 27 17:56:05.822936 systemd[1]: Reached target paths.target - Path Units.
May 27 17:56:05.823796 systemd[1]: Reached target timers.target - Timer Units.
May 27 17:56:05.825931 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 27 17:56:05.829729 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 27 17:56:05.835850 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 27 17:56:05.837070 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 27 17:56:05.838513 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 27 17:56:05.848193 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 27 17:56:05.849529 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 27 17:56:05.851409 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 27 17:56:05.855569 systemd[1]: Reached target sockets.target - Socket Units.
May 27 17:56:05.856965 systemd[1]: Reached target basic.target - Basic System.
May 27 17:56:05.858686 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 27 17:56:05.858736 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 27 17:56:05.860059 systemd[1]: Starting containerd.service - containerd container runtime...
May 27 17:56:05.865727 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 27 17:56:05.870003 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 27 17:56:05.873815 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 27 17:56:05.882707 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 27 17:56:05.888215 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 27 17:56:05.888945 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 27 17:56:05.893870 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
May 27 17:56:05.898450 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
May 27 17:56:05.900805 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 27 17:56:05.911731 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 27 17:56:05.915916 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 27 17:56:05.919778 jq[1552]: false
May 27 17:56:05.924519 oslogin_cache_refresh[1554]: Refreshing passwd entry cache
May 27 17:56:05.924929 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Refreshing passwd entry cache
May 27 17:56:05.928749 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 27 17:56:05.930217 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Failure getting users, quitting
May 27 17:56:05.931596 oslogin_cache_refresh[1554]: Failure getting users, quitting
May 27 17:56:05.932821 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 17:56:05.932821 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Refreshing group entry cache
May 27 17:56:05.932821 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Failure getting groups, quitting
May 27 17:56:05.932821 google_oslogin_nss_cache[1554]: oslogin_cache_refresh[1554]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 27 17:56:05.931639 oslogin_cache_refresh[1554]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 17:56:05.931712 oslogin_cache_refresh[1554]: Refreshing group entry cache May 27 17:56:05.932467 oslogin_cache_refresh[1554]: Failure getting groups, quitting May 27 17:56:05.932480 oslogin_cache_refresh[1554]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 27 17:56:05.947251 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 17:56:05.950799 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 17:56:05.951573 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 17:56:05.956703 systemd[1]: Starting update-engine.service - Update Engine... May 27 17:56:05.971610 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 17:56:05.977618 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 17:56:05.980972 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 17:56:05.981314 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 17:56:05.981886 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 27 17:56:05.982183 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 27 17:56:06.001742 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 17:56:06.004972 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
May 27 17:56:06.007486 extend-filesystems[1553]: Found loop4 May 27 17:56:06.008492 extend-filesystems[1553]: Found loop5 May 27 17:56:06.009142 extend-filesystems[1553]: Found loop6 May 27 17:56:06.009142 extend-filesystems[1553]: Found loop7 May 27 17:56:06.009142 extend-filesystems[1553]: Found vda May 27 17:56:06.009142 extend-filesystems[1553]: Found vda1 May 27 17:56:06.009142 extend-filesystems[1553]: Found vda2 May 27 17:56:06.009142 extend-filesystems[1553]: Found vda3 May 27 17:56:06.009142 extend-filesystems[1553]: Found usr May 27 17:56:06.009142 extend-filesystems[1553]: Found vda4 May 27 17:56:06.009142 extend-filesystems[1553]: Found vda6 May 27 17:56:06.009142 extend-filesystems[1553]: Found vda7 May 27 17:56:06.009142 extend-filesystems[1553]: Found vda9 May 27 17:56:06.017787 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 17:56:06.018697 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 27 17:56:06.058901 update_engine[1563]: I20250527 17:56:06.058805 1563 main.cc:92] Flatcar Update Engine starting May 27 17:56:06.072100 jq[1564]: true May 27 17:56:06.083306 systemd-networkd[1519]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:56:06.083318 systemd-networkd[1519]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:56:06.086728 tar[1567]: linux-amd64/helm May 27 17:56:06.091763 systemd-networkd[1519]: eth0: Link UP May 27 17:56:06.092803 systemd-networkd[1519]: eth0: Gained carrier May 27 17:56:06.092836 systemd-networkd[1519]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:56:06.093426 systemd[1]: motdgen.service: Deactivated successfully. May 27 17:56:06.093881 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
May 27 17:56:06.103144 jq[1593]: true May 27 17:56:06.105622 (ntainerd)[1585]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 17:56:06.109463 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 27 17:56:06.124026 systemd-networkd[1519]: eth0: DHCPv4 address 10.230.41.6/30, gateway 10.230.41.5 acquired from 10.230.41.5 May 27 17:56:06.125580 systemd-timesyncd[1481]: Network configuration changed, trying to establish connection. May 27 17:56:06.143253 dbus-daemon[1550]: [system] SELinux support is enabled May 27 17:56:06.149213 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 17:56:06.156350 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 17:56:06.156393 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 27 17:56:06.157169 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 17:56:06.157202 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 17:56:06.162256 dbus-daemon[1550]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.2' (uid=244 pid=1519 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") May 27 17:56:06.167383 dbus-daemon[1550]: [system] Successfully activated service 'org.freedesktop.systemd1' May 27 17:56:06.167606 systemd[1]: Started update-engine.service - Update Engine. 
May 27 17:56:06.173576 update_engine[1563]: I20250527 17:56:06.168831 1563 update_check_scheduler.cc:74] Next update check in 7m47s May 27 17:56:06.189339 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 17:56:06.197129 systemd[1]: Starting systemd-hostnamed.service - Hostname Service... May 27 17:56:06.273940 bash[1612]: Updated "/home/core/.ssh/authorized_keys" May 27 17:56:06.275609 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 17:56:06.286267 systemd[1]: Starting sshkeys.service... May 27 17:56:06.313212 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 27 17:56:06.318940 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 27 17:56:06.340577 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:56:06.449802 systemd[1]: Started systemd-hostnamed.service - Hostname Service. May 27 17:56:06.486207 dbus-daemon[1550]: [system] Successfully activated service 'org.freedesktop.hostname1' May 27 17:56:06.486603 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 17:56:06.488574 dbus-daemon[1550]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.5' (uid=0 pid=1605 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") May 27 17:56:06.502869 systemd[1]: Starting polkit.service - Authorization Manager... May 27 17:56:06.580727 systemd-logind[1561]: New seat seat0. May 27 17:56:06.587825 systemd[1]: Started systemd-logind.service - User Login Management. May 27 17:56:06.640298 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 27 17:56:06.661079 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
May 27 17:56:06.682321 containerd[1585]: time="2025-05-27T17:56:06Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 17:56:06.692426 containerd[1585]: time="2025-05-27T17:56:06.692217751Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 17:56:06.740641 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 27 17:56:06.758587 containerd[1585]: time="2025-05-27T17:56:06.758192136Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t=4.636759ms May 27 17:56:06.758587 containerd[1585]: time="2025-05-27T17:56:06.758265301Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 17:56:06.758587 containerd[1585]: time="2025-05-27T17:56:06.758325982Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 17:56:06.758926 containerd[1585]: time="2025-05-27T17:56:06.758899300Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 17:56:06.759498 containerd[1585]: time="2025-05-27T17:56:06.759469511Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 17:56:06.759653 containerd[1585]: time="2025-05-27T17:56:06.759626839Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:56:06.760227 containerd[1585]: time="2025-05-27T17:56:06.760186049Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:56:06.763705 containerd[1585]: 
time="2025-05-27T17:56:06.762860559Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:56:06.763705 containerd[1585]: time="2025-05-27T17:56:06.763195839Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:56:06.763705 containerd[1585]: time="2025-05-27T17:56:06.763231884Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:56:06.763705 containerd[1585]: time="2025-05-27T17:56:06.763254316Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:56:06.763705 containerd[1585]: time="2025-05-27T17:56:06.763268904Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 17:56:06.763705 containerd[1585]: time="2025-05-27T17:56:06.763456456Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 17:56:06.764466 containerd[1585]: time="2025-05-27T17:56:06.764437224Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:56:06.767543 containerd[1585]: time="2025-05-27T17:56:06.766690237Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:56:06.767543 containerd[1585]: time="2025-05-27T17:56:06.766718838Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 17:56:06.767543 containerd[1585]: 
time="2025-05-27T17:56:06.766796452Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 27 17:56:06.767543 containerd[1585]: time="2025-05-27T17:56:06.767220518Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 27 17:56:06.767543 containerd[1585]: time="2025-05-27T17:56:06.767323951Z" level=info msg="metadata content store policy set" policy=shared May 27 17:56:06.769581 kernel: mousedev: PS/2 mouse device common for all mice May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.779490364Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.779825719Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.779933274Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.779976837Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.779995179Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780041006Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780113467Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780136572Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 27 
17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780182926Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780202304Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780216764Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780285977Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780543207Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 27 17:56:06.780660 containerd[1585]: time="2025-05-27T17:56:06.780612064Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 27 17:56:06.781948 containerd[1585]: time="2025-05-27T17:56:06.780638315Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 27 17:56:06.781948 containerd[1585]: time="2025-05-27T17:56:06.781545131Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 27 17:56:06.781948 containerd[1585]: time="2025-05-27T17:56:06.781624385Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 27 17:56:06.781948 containerd[1585]: time="2025-05-27T17:56:06.781642392Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 27 17:56:06.786933 containerd[1585]: time="2025-05-27T17:56:06.781659198Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 27 17:56:06.786933 containerd[1585]: 
time="2025-05-27T17:56:06.783615357Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 27 17:56:06.786933 containerd[1585]: time="2025-05-27T17:56:06.783663446Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 17:56:06.786933 containerd[1585]: time="2025-05-27T17:56:06.783718490Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 17:56:06.786933 containerd[1585]: time="2025-05-27T17:56:06.783760077Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 17:56:06.786933 containerd[1585]: time="2025-05-27T17:56:06.786612820Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 17:56:06.788122 containerd[1585]: time="2025-05-27T17:56:06.787623907Z" level=info msg="Start snapshots syncer" May 27 17:56:06.788122 containerd[1585]: time="2025-05-27T17:56:06.787700826Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 17:56:06.791656 containerd[1585]: time="2025-05-27T17:56:06.791258460Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 17:56:06.791656 containerd[1585]: time="2025-05-27T17:56:06.791428059Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 17:56:06.795644 containerd[1585]: time="2025-05-27T17:56:06.795614325Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.799868429Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.799958770Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.799981390Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800019577Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800046948Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800101412Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800141832Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800214136Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800237518Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800278938Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800358639Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800401241Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:56:06.801267 containerd[1585]: time="2025-05-27T17:56:06.800438040Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800464700Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800480672Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800496847Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800534276Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800597370Z" level=info msg="runtime interface created" May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800610582Z" level=info msg="created NRI interface" May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800624569Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800678890Z" level=info msg="Connect containerd service" May 27 17:56:06.801789 containerd[1585]: time="2025-05-27T17:56:06.800738940Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 17:56:06.810574 
containerd[1585]: time="2025-05-27T17:56:06.809476243Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 17:56:06.851652 locksmithd[1601]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 17:56:06.929469 polkitd[1624]: Started polkitd version 126 May 27 17:56:06.940484 polkitd[1624]: Loading rules from directory /etc/polkit-1/rules.d May 27 17:56:06.942695 polkitd[1624]: Loading rules from directory /run/polkit-1/rules.d May 27 17:56:06.942783 polkitd[1624]: Error opening rules directory: Error opening directory “/run/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) May 27 17:56:06.943134 polkitd[1624]: Loading rules from directory /usr/local/share/polkit-1/rules.d May 27 17:56:06.943169 polkitd[1624]: Error opening rules directory: Error opening directory “/usr/local/share/polkit-1/rules.d”: No such file or directory (g-file-error-quark, 4) May 27 17:56:06.943238 polkitd[1624]: Loading rules from directory /usr/share/polkit-1/rules.d May 27 17:56:06.946702 polkitd[1624]: Finished loading, compiling and executing 2 rules May 27 17:56:06.947681 systemd[1]: Started polkit.service - Authorization Manager. 
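The `failed to load cni during init` error above is expected on a first boot: containerd's CRI plugin looks for network configuration in /etc/cni/net.d, which a CNI provider (or a kubeadm-managed add-on) only populates later. Purely for illustration, a minimal bridge conflist of the shape the plugin expects might look like the following; the network name, bridge name, subnet, and plugin choices here are hypothetical and not taken from this host:

```json
{
  "cniVersion": "1.0.0",
  "name": "example-bridge-net",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "subnet": "10.88.0.0/16"
      }
    }
  ]
}
```

Once a file like this lands in /etc/cni/net.d, the CRI plugin's config syncer (whose start is logged further below) picks it up without a containerd restart.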
May 27 17:56:06.948165 dbus-daemon[1550]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' May 27 17:56:06.948750 polkitd[1624]: Acquired the name org.freedesktop.PolicyKit1 on the system bus May 27 17:56:06.960572 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 May 27 17:56:06.980640 systemd-hostnamed[1605]: Hostname set to (static) May 27 17:56:07.007502 kernel: ACPI: button: Power Button [PWRF] May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.113790088Z" level=info msg="Start subscribing containerd event" May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.115900352Z" level=info msg="Start recovering state" May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.116116971Z" level=info msg="Start event monitor" May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.116139094Z" level=info msg="Start cni network conf syncer for default" May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.116152259Z" level=info msg="Start streaming server" May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.116192822Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.116207099Z" level=info msg="runtime interface starting up..." May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.116220830Z" level=info msg="starting plugins..." May 27 17:56:07.116663 containerd[1585]: time="2025-05-27T17:56:07.116250856Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 17:56:07.119368 containerd[1585]: time="2025-05-27T17:56:07.118863456Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 17:56:07.119368 containerd[1585]: time="2025-05-27T17:56:07.118978151Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 17:56:07.120008 systemd[1]: Started containerd.service - containerd container runtime. 
May 27 17:56:07.126571 containerd[1585]: time="2025-05-27T17:56:07.123016622Z" level=info msg="containerd successfully booted in 0.441226s" May 27 17:56:07.263587 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt May 27 17:56:07.270746 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 27 17:56:07.417103 tar[1567]: linux-amd64/LICENSE May 27 17:56:07.417103 tar[1567]: linux-amd64/README.md May 27 17:56:07.463286 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 17:56:07.502136 systemd-logind[1561]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 27 17:56:07.515777 systemd-networkd[1519]: eth0: Gained IPv6LL May 27 17:56:07.517773 systemd-timesyncd[1481]: Network configuration changed, trying to establish connection. May 27 17:56:07.522632 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 17:56:07.528253 systemd[1]: Reached target network-online.target - Network is Online. May 27 17:56:07.530724 systemd-logind[1561]: Watching system buttons on /dev/input/event3 (Power Button) May 27 17:56:07.532925 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:56:07.538990 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 17:56:07.617743 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:56:07.637741 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 17:56:07.925241 sshd_keygen[1586]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 27 17:56:07.978258 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 27 17:56:07.994940 systemd[1]: Starting issuegen.service - Generate /run/issue... May 27 17:56:08.026744 systemd[1]: Started sshd@0-10.230.41.6:22-139.178.68.195:45192.service - OpenSSH per-connection server daemon (139.178.68.195:45192). 
May 27 17:56:08.064535 systemd[1]: issuegen.service: Deactivated successfully. May 27 17:56:08.068660 systemd[1]: Finished issuegen.service - Generate /run/issue. May 27 17:56:08.084829 systemd-timesyncd[1481]: Network configuration changed, trying to establish connection. May 27 17:56:08.087213 systemd-networkd[1519]: eth0: Ignoring DHCPv6 address 2a02:1348:179:8a41:24:19ff:fee6:2906/128 (valid for 59min 59s, preferred for 59min 59s) which conflicts with 2a02:1348:179:8a41:24:19ff:fee6:2906/64 assigned by NDisc. May 27 17:56:08.087411 systemd-networkd[1519]: eth0: Hint: use IPv6Token= setting to change the address generated by NDisc or set UseAutonomousPrefix=no. May 27 17:56:08.107144 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:56:08.115978 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 27 17:56:08.141176 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 27 17:56:08.145973 systemd[1]: Started getty@tty1.service - Getty on tty1. May 27 17:56:08.149726 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 27 17:56:08.150767 systemd[1]: Reached target getty.target - Login Prompts. May 27 17:56:08.728409 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
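The `Ignoring DHCPv6 address ... which conflicts with ... assigned by NDisc` warning above comes with systemd-networkd's own hint: set `IPv6Token=` or `UseAutonomousPrefix=no`. As a sketch of the second remedy, a .network unit for eth0 could carry the setting; the file name and `[Match]` contents here are assumptions, not units read from this host:

```ini
# Hypothetical /etc/systemd/network/10-eth0.network
[Match]
Name=eth0

[Network]
DHCP=yes

[IPv6AcceptRA]
# Per the hint in the log: stop SLAAC from forming an address
# that collides with the DHCPv6-assigned one.
UseAutonomousPrefix=no
```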
May 27 17:56:08.740048 (kubelet)[1713]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:56:08.842773 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:56:08.842876 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:56:09.009626 sshd[1698]: Accepted publickey for core from 139.178.68.195 port 45192 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:09.011925 sshd-session[1698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:09.036620 systemd-logind[1561]: New session 1 of user core. May 27 17:56:09.039268 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 17:56:09.047783 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 17:56:09.087495 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 17:56:09.094028 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 17:56:09.112029 (systemd)[1724]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 17:56:09.118798 systemd-logind[1561]: New session c1 of user core. May 27 17:56:09.182250 systemd-timesyncd[1481]: Network configuration changed, trying to establish connection. May 27 17:56:09.309308 systemd[1724]: Queued start job for default target default.target. May 27 17:56:09.316773 systemd[1724]: Created slice app.slice - User Application Slice. May 27 17:56:09.316815 systemd[1724]: Reached target paths.target - Paths. May 27 17:56:09.317210 systemd[1724]: Reached target timers.target - Timers. May 27 17:56:09.321664 systemd[1724]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 17:56:09.346021 systemd[1724]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 17:56:09.346222 systemd[1724]: Reached target sockets.target - Sockets. 
May 27 17:56:09.346314 systemd[1724]: Reached target basic.target - Basic System. May 27 17:56:09.346410 systemd[1724]: Reached target default.target - Main User Target. May 27 17:56:09.346474 systemd[1724]: Startup finished in 216ms. May 27 17:56:09.347078 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 17:56:09.359344 kubelet[1713]: E0527 17:56:09.359218 1713 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:56:09.361964 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 17:56:09.363538 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:56:09.363860 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:56:09.364832 systemd[1]: kubelet.service: Consumed 1.001s CPU time, 266M memory peak. May 27 17:56:10.002027 systemd[1]: Started sshd@1-10.230.41.6:22-139.178.68.195:45198.service - OpenSSH per-connection server daemon (139.178.68.195:45198). May 27 17:56:10.870747 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:56:10.875645 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:56:10.916774 sshd[1736]: Accepted publickey for core from 139.178.68.195 port 45198 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:10.919705 sshd-session[1736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:10.932466 systemd-logind[1561]: New session 2 of user core. May 27 17:56:10.937842 systemd[1]: Started session-2.scope - Session 2 of User core. 
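The kubelet exit above (`failed to load kubelet config file ... /var/lib/kubelet/config.yaml`) is the normal state of a node that has not yet run `kubeadm init` or `kubeadm join`, which are what write that file. A minimal sketch of the precondition being checked, with the path parameterized; the helper name is hypothetical:

```shell
#!/bin/sh
# Sketch: the precondition kubelet fails on above. Until kubeadm writes its
# config to the standard drop location, kubelet exits with status 1 and
# systemd records the unit as failed (as seen in the log).
check_kubelet_config() {
    if [ -f "$1" ]; then
        echo "present"
    else
        echo "missing"
    fi
}

check_kubelet_config /var/lib/kubelet/config.yaml
```

With a restart policy on kubelet.service, the unit keeps retrying until kubeadm creates the file, after which startup proceeds normally.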
May 27 17:56:11.534464 sshd[1740]: Connection closed by 139.178.68.195 port 45198 May 27 17:56:11.535406 sshd-session[1736]: pam_unix(sshd:session): session closed for user core May 27 17:56:11.540085 systemd-logind[1561]: Session 2 logged out. Waiting for processes to exit. May 27 17:56:11.541201 systemd[1]: sshd@1-10.230.41.6:22-139.178.68.195:45198.service: Deactivated successfully. May 27 17:56:11.543375 systemd[1]: session-2.scope: Deactivated successfully. May 27 17:56:11.545619 systemd-logind[1561]: Removed session 2. May 27 17:56:11.690127 systemd[1]: Started sshd@2-10.230.41.6:22-139.178.68.195:45206.service - OpenSSH per-connection server daemon (139.178.68.195:45206). May 27 17:56:12.603036 sshd[1746]: Accepted publickey for core from 139.178.68.195 port 45206 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:12.604866 sshd-session[1746]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:12.613484 systemd-logind[1561]: New session 3 of user core. May 27 17:56:12.622898 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 17:56:13.223906 sshd[1748]: Connection closed by 139.178.68.195 port 45206 May 27 17:56:13.222385 sshd-session[1746]: pam_unix(sshd:session): session closed for user core May 27 17:56:13.241333 systemd[1]: sshd@2-10.230.41.6:22-139.178.68.195:45206.service: Deactivated successfully. May 27 17:56:13.243628 systemd-logind[1561]: Session 3 logged out. Waiting for processes to exit. May 27 17:56:13.245899 login[1707]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 27 17:56:13.244600 systemd[1]: session-3.scope: Deactivated successfully. May 27 17:56:13.248569 systemd-logind[1561]: Removed session 3. May 27 17:56:13.253668 login[1708]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 27 17:56:13.255059 systemd-logind[1561]: New session 4 of user core. 
May 27 17:56:13.261823 systemd[1]: Started session-4.scope - Session 4 of User core. May 27 17:56:13.266358 systemd-logind[1561]: New session 5 of user core. May 27 17:56:13.275840 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 17:56:14.896590 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:56:14.901580 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev May 27 17:56:14.909447 coreos-metadata[1549]: May 27 17:56:14.909 WARN failed to locate config-drive, using the metadata service API instead May 27 17:56:14.913580 coreos-metadata[1616]: May 27 17:56:14.913 WARN failed to locate config-drive, using the metadata service API instead May 27 17:56:14.934931 coreos-metadata[1616]: May 27 17:56:14.934 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 27 17:56:14.935513 coreos-metadata[1549]: May 27 17:56:14.935 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 27 17:56:14.943010 coreos-metadata[1549]: May 27 17:56:14.942 INFO Fetch failed with 404: resource not found May 27 17:56:14.943010 coreos-metadata[1549]: May 27 17:56:14.942 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 27 17:56:14.944002 coreos-metadata[1549]: May 27 17:56:14.943 INFO Fetch successful May 27 17:56:14.944178 coreos-metadata[1549]: May 27 17:56:14.944 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 27 17:56:14.959819 coreos-metadata[1549]: May 27 17:56:14.959 INFO Fetch successful May 27 17:56:14.960134 coreos-metadata[1549]: May 27 17:56:14.960 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 27 17:56:14.960379 coreos-metadata[1616]: May 27 17:56:14.960 INFO Fetch successful May 27 17:56:14.960633 coreos-metadata[1616]: May 27 17:56:14.960 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 27 17:56:14.975018 coreos-metadata[1549]: May 
27 17:56:14.974 INFO Fetch successful May 27 17:56:14.975444 coreos-metadata[1549]: May 27 17:56:14.975 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 27 17:56:14.984496 coreos-metadata[1616]: May 27 17:56:14.984 INFO Fetch successful May 27 17:56:14.986717 unknown[1616]: wrote ssh authorized keys file for user: core May 27 17:56:14.994362 coreos-metadata[1549]: May 27 17:56:14.994 INFO Fetch successful May 27 17:56:14.994362 coreos-metadata[1549]: May 27 17:56:14.994 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 27 17:56:15.015027 coreos-metadata[1549]: May 27 17:56:15.014 INFO Fetch successful May 27 17:56:15.017275 update-ssh-keys[1784]: Updated "/home/core/.ssh/authorized_keys" May 27 17:56:15.020042 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 27 17:56:15.022110 systemd[1]: Finished sshkeys.service. May 27 17:56:15.046493 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 17:56:15.047511 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 27 17:56:15.047781 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 17:56:15.050742 systemd[1]: Startup finished in 3.421s (kernel) + 15.914s (initrd) + 12.542s (userspace) = 31.878s. May 27 17:56:19.575071 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 27 17:56:19.577505 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:56:19.887366 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 17:56:19.898207 (kubelet)[1801]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:56:19.963953 kubelet[1801]: E0527 17:56:19.963855 1801 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:56:19.968363 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:56:19.968617 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:56:19.969197 systemd[1]: kubelet.service: Consumed 243ms CPU time, 111.5M memory peak. May 27 17:56:23.380947 systemd[1]: Started sshd@3-10.230.41.6:22-139.178.68.195:53414.service - OpenSSH per-connection server daemon (139.178.68.195:53414). May 27 17:56:24.289643 sshd[1809]: Accepted publickey for core from 139.178.68.195 port 53414 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:24.291388 sshd-session[1809]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:24.298628 systemd-logind[1561]: New session 6 of user core. May 27 17:56:24.305754 systemd[1]: Started session-6.scope - Session 6 of User core. May 27 17:56:24.909645 sshd[1811]: Connection closed by 139.178.68.195 port 53414 May 27 17:56:24.910583 sshd-session[1809]: pam_unix(sshd:session): session closed for user core May 27 17:56:24.915642 systemd-logind[1561]: Session 6 logged out. Waiting for processes to exit. May 27 17:56:24.916100 systemd[1]: sshd@3-10.230.41.6:22-139.178.68.195:53414.service: Deactivated successfully. May 27 17:56:24.918642 systemd[1]: session-6.scope: Deactivated successfully. May 27 17:56:24.921039 systemd-logind[1561]: Removed session 6. 
May 27 17:56:25.067544 systemd[1]: Started sshd@4-10.230.41.6:22-139.178.68.195:53430.service - OpenSSH per-connection server daemon (139.178.68.195:53430). May 27 17:56:25.973266 sshd[1817]: Accepted publickey for core from 139.178.68.195 port 53430 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:25.975064 sshd-session[1817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:25.982589 systemd-logind[1561]: New session 7 of user core. May 27 17:56:25.987773 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 17:56:26.591965 sshd[1819]: Connection closed by 139.178.68.195 port 53430 May 27 17:56:26.591792 sshd-session[1817]: pam_unix(sshd:session): session closed for user core May 27 17:56:26.596993 systemd-logind[1561]: Session 7 logged out. Waiting for processes to exit. May 27 17:56:26.597412 systemd[1]: sshd@4-10.230.41.6:22-139.178.68.195:53430.service: Deactivated successfully. May 27 17:56:26.599746 systemd[1]: session-7.scope: Deactivated successfully. May 27 17:56:26.602035 systemd-logind[1561]: Removed session 7. May 27 17:56:26.748182 systemd[1]: Started sshd@5-10.230.41.6:22-139.178.68.195:53444.service - OpenSSH per-connection server daemon (139.178.68.195:53444). May 27 17:56:27.659461 sshd[1825]: Accepted publickey for core from 139.178.68.195 port 53444 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:27.661624 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:27.670627 systemd-logind[1561]: New session 8 of user core. May 27 17:56:27.676845 systemd[1]: Started session-8.scope - Session 8 of User core. 
May 27 17:56:28.284603 sshd[1827]: Connection closed by 139.178.68.195 port 53444 May 27 17:56:28.285805 sshd-session[1825]: pam_unix(sshd:session): session closed for user core May 27 17:56:28.292349 systemd[1]: sshd@5-10.230.41.6:22-139.178.68.195:53444.service: Deactivated successfully. May 27 17:56:28.294841 systemd[1]: session-8.scope: Deactivated successfully. May 27 17:56:28.296027 systemd-logind[1561]: Session 8 logged out. Waiting for processes to exit. May 27 17:56:28.297989 systemd-logind[1561]: Removed session 8. May 27 17:56:28.439659 systemd[1]: Started sshd@6-10.230.41.6:22-139.178.68.195:53458.service - OpenSSH per-connection server daemon (139.178.68.195:53458). May 27 17:56:29.347667 sshd[1833]: Accepted publickey for core from 139.178.68.195 port 53458 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:29.349927 sshd-session[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:29.357737 systemd-logind[1561]: New session 9 of user core. May 27 17:56:29.368974 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 17:56:29.833466 sudo[1836]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 17:56:29.834494 sudo[1836]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:56:29.852051 sudo[1836]: pam_unix(sudo:session): session closed for user root May 27 17:56:29.995635 sshd[1835]: Connection closed by 139.178.68.195 port 53458 May 27 17:56:29.996745 sshd-session[1833]: pam_unix(sshd:session): session closed for user core May 27 17:56:30.002549 systemd[1]: sshd@6-10.230.41.6:22-139.178.68.195:53458.service: Deactivated successfully. May 27 17:56:30.004640 systemd[1]: session-9.scope: Deactivated successfully. May 27 17:56:30.006165 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 17:56:30.007720 systemd-logind[1561]: Session 9 logged out. Waiting for processes to exit. 
May 27 17:56:30.010404 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:56:30.012193 systemd-logind[1561]: Removed session 9. May 27 17:56:30.151545 systemd[1]: Started sshd@7-10.230.41.6:22-139.178.68.195:53466.service - OpenSSH per-connection server daemon (139.178.68.195:53466). May 27 17:56:30.241731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:56:30.259130 (kubelet)[1852]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:56:30.310246 kubelet[1852]: E0527 17:56:30.310167 1852 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:56:30.313155 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:56:30.313385 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:56:30.314223 systemd[1]: kubelet.service: Consumed 189ms CPU time, 108.6M memory peak. May 27 17:56:31.072573 sshd[1845]: Accepted publickey for core from 139.178.68.195 port 53466 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:31.074491 sshd-session[1845]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:31.082195 systemd-logind[1561]: New session 10 of user core. May 27 17:56:31.087782 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 27 17:56:31.549654 sudo[1861]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 17:56:31.550099 sudo[1861]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:56:31.557079 sudo[1861]: pam_unix(sudo:session): session closed for user root May 27 17:56:31.565343 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 17:56:31.565826 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:56:31.578229 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 17:56:31.635028 augenrules[1883]: No rules May 27 17:56:31.636694 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:56:31.637067 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:56:31.638850 sudo[1860]: pam_unix(sudo:session): session closed for user root May 27 17:56:31.785593 sshd[1859]: Connection closed by 139.178.68.195 port 53466 May 27 17:56:31.785439 sshd-session[1845]: pam_unix(sshd:session): session closed for user core May 27 17:56:31.790616 systemd[1]: sshd@7-10.230.41.6:22-139.178.68.195:53466.service: Deactivated successfully. May 27 17:56:31.793055 systemd[1]: session-10.scope: Deactivated successfully. May 27 17:56:31.794673 systemd-logind[1561]: Session 10 logged out. Waiting for processes to exit. May 27 17:56:31.796987 systemd-logind[1561]: Removed session 10. May 27 17:56:31.942970 systemd[1]: Started sshd@8-10.230.41.6:22-139.178.68.195:53476.service - OpenSSH per-connection server daemon (139.178.68.195:53476). 
May 27 17:56:32.851645 sshd[1892]: Accepted publickey for core from 139.178.68.195 port 53476 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:56:32.853395 sshd-session[1892]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:56:32.859883 systemd-logind[1561]: New session 11 of user core. May 27 17:56:32.870771 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 17:56:33.326024 sudo[1895]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 17:56:33.326431 sudo[1895]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:56:33.808998 systemd[1]: Starting docker.service - Docker Application Container Engine... May 27 17:56:33.824136 (dockerd)[1913]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 17:56:34.128395 dockerd[1913]: time="2025-05-27T17:56:34.128269491Z" level=info msg="Starting up" May 27 17:56:34.129319 dockerd[1913]: time="2025-05-27T17:56:34.129280317Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 17:56:34.205977 dockerd[1913]: time="2025-05-27T17:56:34.205924719Z" level=info msg="Loading containers: start." May 27 17:56:34.219815 kernel: Initializing XFRM netlink socket May 27 17:56:34.467782 systemd-timesyncd[1481]: Network configuration changed, trying to establish connection. May 27 17:56:34.522142 systemd-networkd[1519]: docker0: Link UP May 27 17:56:34.526350 dockerd[1913]: time="2025-05-27T17:56:34.526304418Z" level=info msg="Loading containers: done." 
May 27 17:56:34.546696 dockerd[1913]: time="2025-05-27T17:56:34.545984443Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 17:56:34.546696 dockerd[1913]: time="2025-05-27T17:56:34.546094342Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 17:56:34.546696 dockerd[1913]: time="2025-05-27T17:56:34.546268194Z" level=info msg="Initializing buildkit" May 27 17:56:34.573031 dockerd[1913]: time="2025-05-27T17:56:34.572981594Z" level=info msg="Completed buildkit initialization" May 27 17:56:34.583085 dockerd[1913]: time="2025-05-27T17:56:34.582660194Z" level=info msg="Daemon has completed initialization" May 27 17:56:34.583085 dockerd[1913]: time="2025-05-27T17:56:34.582780978Z" level=info msg="API listen on /run/docker.sock" May 27 17:56:34.583645 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 17:56:35.321201 systemd-resolved[1466]: Clock change detected. Flushing caches. May 27 17:56:35.321520 systemd-timesyncd[1481]: Contacted time server [2a02:8010:d015::123]:123 (2.flatcar.pool.ntp.org). May 27 17:56:35.322967 systemd-timesyncd[1481]: Initial clock synchronization to Tue 2025-05-27 17:56:35.320713 UTC. May 27 17:56:35.897155 containerd[1585]: time="2025-05-27T17:56:35.897057798Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\"" May 27 17:56:36.739714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3504546956.mount: Deactivated successfully. 
May 27 17:56:38.256228 containerd[1585]: time="2025-05-27T17:56:38.256162478Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:38.257552 containerd[1585]: time="2025-05-27T17:56:38.257514578Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960995" May 27 17:56:38.258116 containerd[1585]: time="2025-05-27T17:56:38.258082547Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:38.261309 containerd[1585]: time="2025-05-27T17:56:38.261264937Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:38.263208 containerd[1585]: time="2025-05-27T17:56:38.262555763Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 2.365371871s" May 27 17:56:38.263208 containerd[1585]: time="2025-05-27T17:56:38.262605949Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\"" May 27 17:56:38.264234 containerd[1585]: time="2025-05-27T17:56:38.264206002Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\"" May 27 17:56:38.736876 systemd[1]: systemd-hostnamed.service: Deactivated successfully. 
May 27 17:56:40.888229 containerd[1585]: time="2025-05-27T17:56:40.887838454Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:40.889711 containerd[1585]: time="2025-05-27T17:56:40.889194086Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713784" May 27 17:56:40.890701 containerd[1585]: time="2025-05-27T17:56:40.890357085Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:40.893494 containerd[1585]: time="2025-05-27T17:56:40.893439035Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:40.896170 containerd[1585]: time="2025-05-27T17:56:40.895258574Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 2.630916729s" May 27 17:56:40.896170 containerd[1585]: time="2025-05-27T17:56:40.895337507Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\"" May 27 17:56:40.897250 containerd[1585]: time="2025-05-27T17:56:40.897221371Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\"" May 27 17:56:40.910194 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. 
May 27 17:56:40.913322 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:56:41.181934 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:56:41.196345 (kubelet)[2189]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:56:41.246855 kubelet[2189]: E0527 17:56:41.246793 2189 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:56:41.249180 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:56:41.249441 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:56:41.250444 systemd[1]: kubelet.service: Consumed 212ms CPU time, 108.5M memory peak. 
May 27 17:56:43.899804 containerd[1585]: time="2025-05-27T17:56:43.899731413Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:43.901034 containerd[1585]: time="2025-05-27T17:56:43.900874298Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780394" May 27 17:56:43.901871 containerd[1585]: time="2025-05-27T17:56:43.901790164Z" level=info msg="ImageCreate event name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:43.905018 containerd[1585]: time="2025-05-27T17:56:43.904984930Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:43.906824 containerd[1585]: time="2025-05-27T17:56:43.906775898Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 3.00940032s" May 27 17:56:43.907000 containerd[1585]: time="2025-05-27T17:56:43.906931495Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\"" May 27 17:56:43.908386 containerd[1585]: time="2025-05-27T17:56:43.908164215Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\"" May 27 17:56:46.077488 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3454053500.mount: Deactivated successfully. 
May 27 17:56:46.807704 containerd[1585]: time="2025-05-27T17:56:46.807084510Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:46.808633 containerd[1585]: time="2025-05-27T17:56:46.808603985Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354633" May 27 17:56:46.809751 containerd[1585]: time="2025-05-27T17:56:46.809693371Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:46.812640 containerd[1585]: time="2025-05-27T17:56:46.812298698Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:46.812845 containerd[1585]: time="2025-05-27T17:56:46.812807384Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 2.904595637s" May 27 17:56:46.812941 containerd[1585]: time="2025-05-27T17:56:46.812851353Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\"" May 27 17:56:46.813607 containerd[1585]: time="2025-05-27T17:56:46.813561027Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 27 17:56:47.990074 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3134918236.mount: Deactivated successfully. 
May 27 17:56:49.508214 containerd[1585]: time="2025-05-27T17:56:49.508102616Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:49.510114 containerd[1585]: time="2025-05-27T17:56:49.510072064Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" May 27 17:56:49.511763 containerd[1585]: time="2025-05-27T17:56:49.511656294Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:49.516205 containerd[1585]: time="2025-05-27T17:56:49.514284167Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:49.516205 containerd[1585]: time="2025-05-27T17:56:49.515686152Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 2.702059319s" May 27 17:56:49.516205 containerd[1585]: time="2025-05-27T17:56:49.515735079Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 27 17:56:49.516639 containerd[1585]: time="2025-05-27T17:56:49.516598453Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 17:56:50.757712 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2067867316.mount: Deactivated successfully. 
May 27 17:56:50.764291 containerd[1585]: time="2025-05-27T17:56:50.764225999Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:56:50.765448 containerd[1585]: time="2025-05-27T17:56:50.765310730Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 27 17:56:50.766279 containerd[1585]: time="2025-05-27T17:56:50.766235648Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:56:50.770019 containerd[1585]: time="2025-05-27T17:56:50.768954270Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:56:50.770019 containerd[1585]: time="2025-05-27T17:56:50.769848110Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.253065403s" May 27 17:56:50.770019 containerd[1585]: time="2025-05-27T17:56:50.769900984Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 27 17:56:50.771090 containerd[1585]: time="2025-05-27T17:56:50.771051471Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 27 17:56:51.410299 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
May 27 17:56:51.413059 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:56:51.605731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:56:51.620135 (kubelet)[2272]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:56:51.692926 kubelet[2272]: E0527 17:56:51.692475 2272 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:56:51.695136 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:56:51.695388 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:56:51.696291 systemd[1]: kubelet.service: Consumed 212ms CPU time, 110.4M memory peak. May 27 17:56:51.771043 update_engine[1563]: I20250527 17:56:51.770875 1563 update_attempter.cc:509] Updating boot flags... May 27 17:56:52.293474 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2393282624.mount: Deactivated successfully. 
May 27 17:56:54.973510 containerd[1585]: time="2025-05-27T17:56:54.973421168Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:54.975198 containerd[1585]: time="2025-05-27T17:56:54.974843609Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780021" May 27 17:56:54.976004 containerd[1585]: time="2025-05-27T17:56:54.975966447Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:54.979491 containerd[1585]: time="2025-05-27T17:56:54.979455265Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:56:54.981118 containerd[1585]: time="2025-05-27T17:56:54.981058243Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 4.209818396s" May 27 17:56:54.981256 containerd[1585]: time="2025-05-27T17:56:54.981231435Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 27 17:56:59.410458 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:56:59.410894 systemd[1]: kubelet.service: Consumed 212ms CPU time, 110.4M memory peak. May 27 17:56:59.414148 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:56:59.452135 systemd[1]: Reload requested from client PID 2378 ('systemctl') (unit session-11.scope)... 
May 27 17:56:59.452385 systemd[1]: Reloading... May 27 17:56:59.610703 zram_generator::config[2423]: No configuration found. May 27 17:56:59.770349 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:56:59.941817 systemd[1]: Reloading finished in 488 ms. May 27 17:57:00.022994 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:57:00.026531 systemd[1]: kubelet.service: Deactivated successfully. May 27 17:57:00.026946 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:57:00.027022 systemd[1]: kubelet.service: Consumed 142ms CPU time, 98.1M memory peak. May 27 17:57:00.029969 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:57:00.346915 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:57:00.358190 (kubelet)[2492]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:57:00.410115 kubelet[2492]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:57:00.410115 kubelet[2492]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 27 17:57:00.410115 kubelet[2492]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
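The systemd warning above concerns docker.socket's ListenStream= pointing below the legacy /var/run/ directory; systemd rewrites the path to /run/ at load time but asks that the unit file itself be updated. The substitution it performs can be sketched like this (illustrative only, not systemd's implementation):

```python
LEGACY_PREFIX = "ListenStream=/var/run/"

def modernize_listen_stream(unit_text: str) -> str:
    """Rewrite legacy /var/run/ socket paths to /run/ in ListenStream= lines,
    mirroring the substitution systemd logs at unit load time."""
    lines = []
    for line in unit_text.splitlines():
        if line.strip().startswith(LEGACY_PREFIX):
            line = line.replace("/var/run/", "/run/", 1)
        lines.append(line)
    return "\n".join(lines)
```

Applying the same change in the unit file (ListenStream=/run/docker.sock) silences the warning on subsequent reloads.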
May 27 17:57:00.411931 kubelet[2492]: I0527 17:57:00.411833 2492 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:57:01.138715 kubelet[2492]: I0527 17:57:01.137020 2492 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" May 27 17:57:01.138715 kubelet[2492]: I0527 17:57:01.137063 2492 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:57:01.138715 kubelet[2492]: I0527 17:57:01.137370 2492 server.go:934] "Client rotation is on, will bootstrap in background" May 27 17:57:01.168336 kubelet[2492]: I0527 17:57:01.168255 2492 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:57:01.171838 kubelet[2492]: E0527 17:57:01.171776 2492 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.230.41.6:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:01.186656 kubelet[2492]: I0527 17:57:01.186511 2492 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:57:01.194382 kubelet[2492]: I0527 17:57:01.194067 2492 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 17:57:01.197851 kubelet[2492]: I0527 17:57:01.197810 2492 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 27 17:57:01.198371 kubelet[2492]: I0527 17:57:01.198310 2492 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:57:01.198804 kubelet[2492]: I0527 17:57:01.198458 2492 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-kh28t.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topo
logyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:57:01.199581 kubelet[2492]: I0527 17:57:01.199174 2492 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:57:01.199581 kubelet[2492]: I0527 17:57:01.199201 2492 container_manager_linux.go:300] "Creating device plugin manager" May 27 17:57:01.199581 kubelet[2492]: I0527 17:57:01.199404 2492 state_mem.go:36] "Initialized new in-memory state store" May 27 17:57:01.202356 kubelet[2492]: I0527 17:57:01.202334 2492 kubelet.go:408] "Attempting to sync node with API server" May 27 17:57:01.202486 kubelet[2492]: I0527 17:57:01.202467 2492 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:57:01.202636 kubelet[2492]: I0527 17:57:01.202618 2492 kubelet.go:314] "Adding apiserver pod source" May 27 17:57:01.202826 kubelet[2492]: I0527 17:57:01.202795 2492 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:57:01.207967 kubelet[2492]: W0527 17:57:01.207855 2492 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.41.6:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-kh28t.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: connect: connection refused May 27 17:57:01.208104 kubelet[2492]: E0527 17:57:01.207970 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.41.6:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-kh28t.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:01.209390 kubelet[2492]: W0527 17:57:01.209287 2492 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.41.6:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: 
connect: connection refused May 27 17:57:01.209390 kubelet[2492]: E0527 17:57:01.209341 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.41.6:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:01.209509 kubelet[2492]: I0527 17:57:01.209433 2492 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:57:01.212696 kubelet[2492]: I0527 17:57:01.212508 2492 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 17:57:01.213223 kubelet[2492]: W0527 17:57:01.213196 2492 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 27 17:57:01.214517 kubelet[2492]: I0527 17:57:01.214301 2492 server.go:1274] "Started kubelet" May 27 17:57:01.215442 kubelet[2492]: I0527 17:57:01.215064 2492 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:57:01.216473 kubelet[2492]: I0527 17:57:01.216447 2492 server.go:449] "Adding debug handlers to kubelet server" May 27 17:57:01.217894 kubelet[2492]: I0527 17:57:01.217860 2492 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:57:01.218718 kubelet[2492]: I0527 17:57:01.218354 2492 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:57:01.222483 kubelet[2492]: E0527 17:57:01.219410 2492 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.230.41.6:6443/api/v1/namespaces/default/events\": dial tcp 10.230.41.6:6443: connect: connection refused" event="&Event{ObjectMeta:{srv-kh28t.gb1.brightbox.com.184373ff6bea8681 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:srv-kh28t.gb1.brightbox.com,UID:srv-kh28t.gb1.brightbox.com,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:srv-kh28t.gb1.brightbox.com,},FirstTimestamp:2025-05-27 17:57:01.214271105 +0000 UTC m=+0.851302886,LastTimestamp:2025-05-27 17:57:01.214271105 +0000 UTC m=+0.851302886,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:srv-kh28t.gb1.brightbox.com,}" May 27 17:57:01.223714 kubelet[2492]: I0527 17:57:01.223694 2492 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:57:01.225376 kubelet[2492]: I0527 17:57:01.225347 2492 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:57:01.234770 kubelet[2492]: I0527 17:57:01.234748 2492 volume_manager.go:289] "Starting Kubelet Volume Manager" May 27 17:57:01.235192 kubelet[2492]: E0527 17:57:01.235165 2492 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-kh28t.gb1.brightbox.com\" not found" May 27 17:57:01.237269 kubelet[2492]: E0527 17:57:01.237206 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.41.6:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-kh28t.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.41.6:6443: connect: connection refused" interval="200ms" May 27 17:57:01.239952 kubelet[2492]: I0527 17:57:01.239917 2492 reconciler.go:26] "Reconciler: start to sync state" May 27 17:57:01.240036 kubelet[2492]: I0527 17:57:01.239998 2492 desired_state_of_world_populator.go:147] "Desired state populator starts to run" May 27 17:57:01.241407 kubelet[2492]: W0527 17:57:01.240373 2492 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.41.6:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: connect: connection refused May 27 17:57:01.241407 kubelet[2492]: E0527 17:57:01.240430 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.41.6:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:01.241407 kubelet[2492]: I0527 17:57:01.240754 2492 factory.go:221] Registration of the systemd container factory successfully May 27 17:57:01.241407 kubelet[2492]: I0527 17:57:01.240876 2492 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:57:01.245216 kubelet[2492]: E0527 17:57:01.244853 2492 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 17:57:01.245686 kubelet[2492]: I0527 17:57:01.245494 2492 factory.go:221] Registration of the containerd container factory successfully May 27 17:57:01.268550 kubelet[2492]: I0527 17:57:01.268345 2492 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 17:57:01.269898 kubelet[2492]: I0527 17:57:01.269874 2492 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 27 17:57:01.269981 kubelet[2492]: I0527 17:57:01.269930 2492 status_manager.go:217] "Starting to sync pod status with apiserver" May 27 17:57:01.269981 kubelet[2492]: I0527 17:57:01.269965 2492 kubelet.go:2321] "Starting kubelet main sync loop" May 27 17:57:01.270092 kubelet[2492]: E0527 17:57:01.270023 2492 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:57:01.274025 kubelet[2492]: W0527 17:57:01.273958 2492 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.41.6:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: connect: connection refused May 27 17:57:01.274025 kubelet[2492]: E0527 17:57:01.274004 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.41.6:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:01.282493 kubelet[2492]: I0527 17:57:01.282468 2492 cpu_manager.go:214] "Starting CPU manager" policy="none" May 27 17:57:01.282493 kubelet[2492]: I0527 17:57:01.282489 2492 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 27 17:57:01.282621 kubelet[2492]: I0527 17:57:01.282513 2492 state_mem.go:36] "Initialized new in-memory state store" May 27 17:57:01.285074 kubelet[2492]: I0527 17:57:01.285045 2492 policy_none.go:49] "None policy: Start" May 27 17:57:01.285954 kubelet[2492]: I0527 17:57:01.285933 2492 memory_manager.go:170] "Starting memorymanager" policy="None" May 27 17:57:01.286032 kubelet[2492]: I0527 17:57:01.285962 2492 state_mem.go:35] "Initializing new in-memory state store" May 27 17:57:01.294302 systemd[1]: Created slice kubepods.slice - libcontainer 
container kubepods.slice. May 27 17:57:01.309044 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 17:57:01.313802 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 17:57:01.322907 kubelet[2492]: I0527 17:57:01.322867 2492 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 17:57:01.323914 kubelet[2492]: I0527 17:57:01.323893 2492 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:57:01.324084 kubelet[2492]: I0527 17:57:01.324034 2492 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:57:01.324559 kubelet[2492]: I0527 17:57:01.324539 2492 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:57:01.329186 kubelet[2492]: E0527 17:57:01.328553 2492 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"srv-kh28t.gb1.brightbox.com\" not found" May 27 17:57:01.387013 systemd[1]: Created slice kubepods-burstable-pod006a239276d977e267f4c686534eb3fb.slice - libcontainer container kubepods-burstable-pod006a239276d977e267f4c686534eb3fb.slice. May 27 17:57:01.414884 systemd[1]: Created slice kubepods-burstable-pod7a85232708f894e82331fdfc7abd4a9e.slice - libcontainer container kubepods-burstable-pod7a85232708f894e82331fdfc7abd4a9e.slice. May 27 17:57:01.421589 systemd[1]: Created slice kubepods-burstable-podef6c3f6f71226197cc822829b47f8188.slice - libcontainer container kubepods-burstable-podef6c3f6f71226197cc822829b47f8188.slice. 
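Every list/watch and CSR request above fails with `dial tcp 10.230.41.6:6443: connect: connection refused` because the kube-apiserver static pod has not started yet; client-go simply keeps retrying until the socket accepts. The same wait-until-reachable pattern, sketched generically (the host and port are the ones from the log; the helper itself is illustrative, not client-go code):

```python
import socket
import time

def wait_for_port(host: str, port: int, attempts: int = 5,
                  base_delay: float = 0.2) -> bool:
    """Probe a TCP endpoint, backing off exponentially between attempts;
    returns True as soon as a connection is accepted."""
    delay = base_delay
    for _ in range(attempts):
        try:
            with socket.create_connection((host, port), timeout=2.0):
                return True
        except OSError:
            time.sleep(delay)
            delay *= 2
    return False

# e.g. wait_for_port("10.230.41.6", 6443) before talking to the apiserver
```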
May 27 17:57:01.427173 kubelet[2492]: I0527 17:57:01.426709 2492 kubelet_node_status.go:72] "Attempting to register node" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:01.427173 kubelet[2492]: E0527 17:57:01.427139 2492 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.41.6:6443/api/v1/nodes\": dial tcp 10.230.41.6:6443: connect: connection refused" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:01.437770 kubelet[2492]: E0527 17:57:01.437729 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.41.6:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-kh28t.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.41.6:6443: connect: connection refused" interval="400ms" May 27 17:57:01.541493 kubelet[2492]: I0527 17:57:01.541432 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-ca-certs\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542204 kubelet[2492]: I0527 17:57:01.541837 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-k8s-certs\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542204 kubelet[2492]: I0527 17:57:01.541889 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-kubeconfig\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: 
\"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542204 kubelet[2492]: I0527 17:57:01.541917 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542204 kubelet[2492]: I0527 17:57:01.541946 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ef6c3f6f71226197cc822829b47f8188-kubeconfig\") pod \"kube-scheduler-srv-kh28t.gb1.brightbox.com\" (UID: \"ef6c3f6f71226197cc822829b47f8188\") " pod="kube-system/kube-scheduler-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542204 kubelet[2492]: I0527 17:57:01.542002 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/006a239276d977e267f4c686534eb3fb-ca-certs\") pod \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" (UID: \"006a239276d977e267f4c686534eb3fb\") " pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542457 kubelet[2492]: I0527 17:57:01.542035 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/006a239276d977e267f4c686534eb3fb-k8s-certs\") pod \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" (UID: \"006a239276d977e267f4c686534eb3fb\") " pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542457 kubelet[2492]: I0527 17:57:01.542059 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/006a239276d977e267f4c686534eb3fb-usr-share-ca-certificates\") pod \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" (UID: \"006a239276d977e267f4c686534eb3fb\") " pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.542457 kubelet[2492]: I0527 17:57:01.542096 2492 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-flexvolume-dir\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:01.630011 kubelet[2492]: I0527 17:57:01.629895 2492 kubelet_node_status.go:72] "Attempting to register node" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:01.630489 kubelet[2492]: E0527 17:57:01.630455 2492 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.41.6:6443/api/v1/nodes\": dial tcp 10.230.41.6:6443: connect: connection refused" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:01.710056 containerd[1585]: time="2025-05-27T17:57:01.709909847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-kh28t.gb1.brightbox.com,Uid:006a239276d977e267f4c686534eb3fb,Namespace:kube-system,Attempt:0,}" May 27 17:57:01.719772 containerd[1585]: time="2025-05-27T17:57:01.719721528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-kh28t.gb1.brightbox.com,Uid:7a85232708f894e82331fdfc7abd4a9e,Namespace:kube-system,Attempt:0,}" May 27 17:57:01.725217 containerd[1585]: time="2025-05-27T17:57:01.724902161Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-srv-kh28t.gb1.brightbox.com,Uid:ef6c3f6f71226197cc822829b47f8188,Namespace:kube-system,Attempt:0,}" May 27 17:57:01.839246 kubelet[2492]: E0527 17:57:01.839170 
2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.41.6:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-kh28t.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.41.6:6443: connect: connection refused" interval="800ms" May 27 17:57:01.861457 containerd[1585]: time="2025-05-27T17:57:01.861380725Z" level=info msg="connecting to shim efb6f49950d84cac22b07392755dd89535b366c29db98ea3dc03193eece34dd4" address="unix:///run/containerd/s/53514a7d0b95f2bd2a02b3a9aa3142ce4abd3576187d04fbeef8ae62bf191559" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:01.878170 containerd[1585]: time="2025-05-27T17:57:01.877963804Z" level=info msg="connecting to shim 675a06b1132357479d45a7d0a7731f83df8e1f5c7b4a47c2f3db320076a6e295" address="unix:///run/containerd/s/259ec5f3e8ecbf53275af72f944d393c7f555d4aeabd257520fb4517a0f7cc90" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:01.879039 containerd[1585]: time="2025-05-27T17:57:01.878753980Z" level=info msg="connecting to shim 0df31559bae1a7f2956fcca3ed86f7c0e1a0513aceb376265ace43badb01535f" address="unix:///run/containerd/s/5a3a375fbb48ff7e9e1713ef24ff7860dbbd1ded4b20dc941f789e2136ece7aa" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:01.998862 systemd[1]: Started cri-containerd-0df31559bae1a7f2956fcca3ed86f7c0e1a0513aceb376265ace43badb01535f.scope - libcontainer container 0df31559bae1a7f2956fcca3ed86f7c0e1a0513aceb376265ace43badb01535f. May 27 17:57:02.002632 systemd[1]: Started cri-containerd-675a06b1132357479d45a7d0a7731f83df8e1f5c7b4a47c2f3db320076a6e295.scope - libcontainer container 675a06b1132357479d45a7d0a7731f83df8e1f5c7b4a47c2f3db320076a6e295. May 27 17:57:02.005408 systemd[1]: Started cri-containerd-efb6f49950d84cac22b07392755dd89535b366c29db98ea3dc03193eece34dd4.scope - libcontainer container efb6f49950d84cac22b07392755dd89535b366c29db98ea3dc03193eece34dd4. 
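The "Failed to ensure lease exists, will retry" intervals in this log double on each failure: 200ms, then 400ms, 800ms, and later 1.6s. That observed exponential backoff can be described as follows (a description of the logged sequence, not kubelet's actual implementation):

```python
def lease_retry_intervals_ms(initial: int = 200, factor: int = 2, count: int = 4):
    """Yield the doubling retry intervals (in ms) observed in the lease
    controller log lines: 200, 400, 800, 1600 (logged as "1.6s")."""
    interval = initial
    for _ in range(count):
        yield interval
        interval *= factor
```

The backoff resets once the apiserver becomes reachable and the lease is created.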
May 27 17:57:02.036833 kubelet[2492]: I0527 17:57:02.036451 2492 kubelet_node_status.go:72] "Attempting to register node" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:02.038511 kubelet[2492]: E0527 17:57:02.038453 2492 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://10.230.41.6:6443/api/v1/nodes\": dial tcp 10.230.41.6:6443: connect: connection refused" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:02.121267 containerd[1585]: time="2025-05-27T17:57:02.121193205Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-srv-kh28t.gb1.brightbox.com,Uid:006a239276d977e267f4c686534eb3fb,Namespace:kube-system,Attempt:0,} returns sandbox id \"efb6f49950d84cac22b07392755dd89535b366c29db98ea3dc03193eece34dd4\"" May 27 17:57:02.121652 containerd[1585]: time="2025-05-27T17:57:02.121361445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-srv-kh28t.gb1.brightbox.com,Uid:7a85232708f894e82331fdfc7abd4a9e,Namespace:kube-system,Attempt:0,} returns sandbox id \"0df31559bae1a7f2956fcca3ed86f7c0e1a0513aceb376265ace43badb01535f\"" May 27 17:57:02.128710 containerd[1585]: time="2025-05-27T17:57:02.128431479Z" level=info msg="CreateContainer within sandbox \"0df31559bae1a7f2956fcca3ed86f7c0e1a0513aceb376265ace43badb01535f\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 17:57:02.129557 containerd[1585]: time="2025-05-27T17:57:02.129486116Z" level=info msg="CreateContainer within sandbox \"efb6f49950d84cac22b07392755dd89535b366c29db98ea3dc03193eece34dd4\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 17:57:02.139739 containerd[1585]: time="2025-05-27T17:57:02.139034145Z" level=info msg="Container 656558bdb997a0e314d70eb1602298801c1e271b28685883811782caf0665df6: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:02.159022 containerd[1585]: time="2025-05-27T17:57:02.158877762Z" level=info msg="CreateContainer within sandbox 
\"efb6f49950d84cac22b07392755dd89535b366c29db98ea3dc03193eece34dd4\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"656558bdb997a0e314d70eb1602298801c1e271b28685883811782caf0665df6\"" May 27 17:57:02.159794 containerd[1585]: time="2025-05-27T17:57:02.159748476Z" level=info msg="StartContainer for \"656558bdb997a0e314d70eb1602298801c1e271b28685883811782caf0665df6\"" May 27 17:57:02.163457 containerd[1585]: time="2025-05-27T17:57:02.163423271Z" level=info msg="Container 4aaaefb10b0c3d147f5b78a123e95fc161318fa199436ab67abf89a411b3237a: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:02.163993 containerd[1585]: time="2025-05-27T17:57:02.163928642Z" level=info msg="connecting to shim 656558bdb997a0e314d70eb1602298801c1e271b28685883811782caf0665df6" address="unix:///run/containerd/s/53514a7d0b95f2bd2a02b3a9aa3142ce4abd3576187d04fbeef8ae62bf191559" protocol=ttrpc version=3 May 27 17:57:02.170158 containerd[1585]: time="2025-05-27T17:57:02.170079750Z" level=info msg="CreateContainer within sandbox \"0df31559bae1a7f2956fcca3ed86f7c0e1a0513aceb376265ace43badb01535f\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4aaaefb10b0c3d147f5b78a123e95fc161318fa199436ab67abf89a411b3237a\"" May 27 17:57:02.170910 containerd[1585]: time="2025-05-27T17:57:02.170880180Z" level=info msg="StartContainer for \"4aaaefb10b0c3d147f5b78a123e95fc161318fa199436ab67abf89a411b3237a\"" May 27 17:57:02.172116 containerd[1585]: time="2025-05-27T17:57:02.172083388Z" level=info msg="connecting to shim 4aaaefb10b0c3d147f5b78a123e95fc161318fa199436ab67abf89a411b3237a" address="unix:///run/containerd/s/5a3a375fbb48ff7e9e1713ef24ff7860dbbd1ded4b20dc941f789e2136ece7aa" protocol=ttrpc version=3 May 27 17:57:02.179312 containerd[1585]: time="2025-05-27T17:57:02.179268666Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-srv-kh28t.gb1.brightbox.com,Uid:ef6c3f6f71226197cc822829b47f8188,Namespace:kube-system,Attempt:0,} returns sandbox id \"675a06b1132357479d45a7d0a7731f83df8e1f5c7b4a47c2f3db320076a6e295\"" May 27 17:57:02.183350 containerd[1585]: time="2025-05-27T17:57:02.183028098Z" level=info msg="CreateContainer within sandbox \"675a06b1132357479d45a7d0a7731f83df8e1f5c7b4a47c2f3db320076a6e295\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 17:57:02.199332 containerd[1585]: time="2025-05-27T17:57:02.199282378Z" level=info msg="Container 3438eca8c12ee567e725e31368d0dc88d9073044a7cb06a85f1791e0ae9b7acd: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:02.206870 systemd[1]: Started cri-containerd-4aaaefb10b0c3d147f5b78a123e95fc161318fa199436ab67abf89a411b3237a.scope - libcontainer container 4aaaefb10b0c3d147f5b78a123e95fc161318fa199436ab67abf89a411b3237a. May 27 17:57:02.209110 containerd[1585]: time="2025-05-27T17:57:02.209027519Z" level=info msg="CreateContainer within sandbox \"675a06b1132357479d45a7d0a7731f83df8e1f5c7b4a47c2f3db320076a6e295\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"3438eca8c12ee567e725e31368d0dc88d9073044a7cb06a85f1791e0ae9b7acd\"" May 27 17:57:02.211838 containerd[1585]: time="2025-05-27T17:57:02.211801535Z" level=info msg="StartContainer for \"3438eca8c12ee567e725e31368d0dc88d9073044a7cb06a85f1791e0ae9b7acd\"" May 27 17:57:02.216304 containerd[1585]: time="2025-05-27T17:57:02.216258486Z" level=info msg="connecting to shim 3438eca8c12ee567e725e31368d0dc88d9073044a7cb06a85f1791e0ae9b7acd" address="unix:///run/containerd/s/259ec5f3e8ecbf53275af72f944d393c7f555d4aeabd257520fb4517a0f7cc90" protocol=ttrpc version=3 May 27 17:57:02.218033 systemd[1]: Started cri-containerd-656558bdb997a0e314d70eb1602298801c1e271b28685883811782caf0665df6.scope - libcontainer container 656558bdb997a0e314d70eb1602298801c1e271b28685883811782caf0665df6. 
May 27 17:57:02.240824 kubelet[2492]: W0527 17:57:02.240778 2492 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.230.41.6:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: connect: connection refused May 27 17:57:02.242440 kubelet[2492]: E0527 17:57:02.240851 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.230.41.6:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:02.252796 kubelet[2492]: W0527 17:57:02.251826 2492 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.230.41.6:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: connect: connection refused May 27 17:57:02.252796 kubelet[2492]: E0527 17:57:02.252736 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.230.41.6:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:02.268936 systemd[1]: Started cri-containerd-3438eca8c12ee567e725e31368d0dc88d9073044a7cb06a85f1791e0ae9b7acd.scope - libcontainer container 3438eca8c12ee567e725e31368d0dc88d9073044a7cb06a85f1791e0ae9b7acd. 
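cAdvisor's crio factory registration failed earlier because /var/run/crio/crio.sock does not exist on this node, while the containerd shims above are reached over unix sockets under /run/containerd/s/. Distinguishing a present unix socket from an absent one is a simple stat check (hypothetical helper, for illustration):

```python
import stat
from pathlib import Path

def is_unix_socket(path: str) -> bool:
    """True if path exists and is a unix domain socket, e.g. a containerd
    shim address; False for /var/run/crio/crio.sock, which is absent here."""
    p = Path(path)
    return p.exists() and stat.S_ISSOCK(p.stat().st_mode)
```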
May 27 17:57:02.315035 kubelet[2492]: W0527 17:57:02.314882 2492 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.230.41.6:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: connect: connection refused May 27 17:57:02.315035 kubelet[2492]: E0527 17:57:02.314992 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.230.41.6:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:02.325915 containerd[1585]: time="2025-05-27T17:57:02.325721947Z" level=info msg="StartContainer for \"4aaaefb10b0c3d147f5b78a123e95fc161318fa199436ab67abf89a411b3237a\" returns successfully" May 27 17:57:02.382111 containerd[1585]: time="2025-05-27T17:57:02.381927057Z" level=info msg="StartContainer for \"656558bdb997a0e314d70eb1602298801c1e271b28685883811782caf0665df6\" returns successfully" May 27 17:57:02.423488 containerd[1585]: time="2025-05-27T17:57:02.423365911Z" level=info msg="StartContainer for \"3438eca8c12ee567e725e31368d0dc88d9073044a7cb06a85f1791e0ae9b7acd\" returns successfully" May 27 17:57:02.643530 kubelet[2492]: E0527 17:57:02.643086 2492 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.230.41.6:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/srv-kh28t.gb1.brightbox.com?timeout=10s\": dial tcp 10.230.41.6:6443: connect: connection refused" interval="1.6s" May 27 17:57:02.769818 kubelet[2492]: W0527 17:57:02.769657 2492 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.230.41.6:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-kh28t.gb1.brightbox.com&limit=500&resourceVersion=0": dial tcp 10.230.41.6:6443: 
connect: connection refused May 27 17:57:02.769818 kubelet[2492]: E0527 17:57:02.769771 2492 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.230.41.6:6443/api/v1/nodes?fieldSelector=metadata.name%3Dsrv-kh28t.gb1.brightbox.com&limit=500&resourceVersion=0\": dial tcp 10.230.41.6:6443: connect: connection refused" logger="UnhandledError" May 27 17:57:02.843396 kubelet[2492]: I0527 17:57:02.843349 2492 kubelet_node_status.go:72] "Attempting to register node" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:04.724250 kubelet[2492]: E0527 17:57:04.724202 2492 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"srv-kh28t.gb1.brightbox.com\" not found" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:04.875884 kubelet[2492]: I0527 17:57:04.875835 2492 kubelet_node_status.go:75] "Successfully registered node" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:05.210710 kubelet[2492]: I0527 17:57:05.210635 2492 apiserver.go:52] "Watching apiserver" May 27 17:57:05.240920 kubelet[2492]: I0527 17:57:05.240831 2492 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 27 17:57:05.333620 kubelet[2492]: E0527 17:57:05.333555 2492 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:05.334123 kubelet[2492]: E0527 17:57:05.334085 2492 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com"
May 27 17:57:06.820561 systemd[1]: Reload requested from client PID 2766 ('systemctl') (unit session-11.scope)... May 27 17:57:06.820598 systemd[1]: Reloading... May 27 17:57:06.951747 zram_generator::config[2820]: No configuration found. May 27 17:57:07.102213 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:57:07.303293 systemd[1]: Reloading finished in 482 ms. May 27 17:57:07.350274 kubelet[2492]: I0527 17:57:07.350175 2492 dynamic_cafile_content.go:174] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:57:07.350397 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:57:07.366483 systemd[1]: kubelet.service: Deactivated successfully. May 27 17:57:07.366982 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:57:07.367077 systemd[1]: kubelet.service: Consumed 1.326s CPU time, 128M memory peak. May 27 17:57:07.370106 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:57:07.656530 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:57:07.669434 (kubelet)[2875]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:57:07.771952 kubelet[2875]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:57:07.771952 kubelet[2875]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 27 17:57:07.771952 kubelet[2875]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:57:07.772548 kubelet[2875]: I0527 17:57:07.772056 2875 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:57:07.787172 kubelet[2875]: I0527 17:57:07.787044 2875 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" May 27 17:57:07.787172 kubelet[2875]: I0527 17:57:07.787103 2875 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:57:07.787975 kubelet[2875]: I0527 17:57:07.787718 2875 server.go:934] "Client rotation is on, will bootstrap in background" May 27 17:57:07.792313 kubelet[2875]: I0527 17:57:07.792212 2875 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 27 17:57:07.803900 kubelet[2875]: I0527 17:57:07.803302 2875 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:57:07.825340 kubelet[2875]: I0527 17:57:07.825307 2875 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:57:07.842072 kubelet[2875]: I0527 17:57:07.841921 2875 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 17:57:07.842529 kubelet[2875]: I0527 17:57:07.842483 2875 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 27 17:57:07.843389 kubelet[2875]: I0527 17:57:07.842862 2875 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 27 17:57:07.843389 kubelet[2875]: I0527 17:57:07.842908 2875 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"srv-kh28t.gb1.brightbox.com","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:57:07.843389 kubelet[2875]: I0527 17:57:07.843198 2875 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:57:07.843389 kubelet[2875]: I0527 17:57:07.843214 2875 container_manager_linux.go:300] "Creating device plugin manager" May 27 17:57:07.843803 kubelet[2875]: I0527 17:57:07.843248 2875 state_mem.go:36] "Initialized new in-memory state store" May 27 17:57:07.843904 kubelet[2875]: I0527 17:57:07.843886 2875 kubelet.go:408] "Attempting to sync node with API server" May 27 17:57:07.844645 kubelet[2875]: I0527 17:57:07.844607 2875 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:57:07.844815 kubelet[2875]: I0527 17:57:07.844798 2875 kubelet.go:314] "Adding apiserver pod source" May 27 17:57:07.844912 kubelet[2875]: I0527 17:57:07.844895 2875 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:57:07.848863 kubelet[2875]: I0527 17:57:07.848832 2875 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:57:07.849570 kubelet[2875]: I0527 17:57:07.849544 2875 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 17:57:07.853465 kubelet[2875]: I0527 17:57:07.853438 2875 server.go:1274] "Started kubelet" May 27 17:57:07.853643 kubelet[2875]: I0527 17:57:07.853614 2875 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:57:07.859908 kubelet[2875]: I0527 17:57:07.855660 2875 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:57:07.861574 kubelet[2875]: I0527 17:57:07.853955 2875 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:57:07.865721 kubelet[2875]: I0527 17:57:07.865671 2875 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 27 17:57:07.865882 kubelet[2875]: I0527 17:57:07.861767 2875 volume_manager.go:289] "Starting Kubelet Volume Manager" May 27 17:57:07.866047 kubelet[2875]: I0527 17:57:07.855867 2875 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:57:07.866231 kubelet[2875]: I0527 17:57:07.861792 2875 desired_state_of_world_populator.go:147] "Desired state populator starts to run" May 27 17:57:07.866495 kubelet[2875]: I0527 17:57:07.866475 2875 reconciler.go:26] "Reconciler: start to sync state" May 27 17:57:07.866627 kubelet[2875]: E0527 17:57:07.861978 2875 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"srv-kh28t.gb1.brightbox.com\" not found" May 27 17:57:07.869856 kubelet[2875]: I0527 17:57:07.869834 2875 factory.go:221] Registration of the systemd container factory successfully May 27 17:57:07.871050 kubelet[2875]: I0527 17:57:07.870821 2875 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:57:07.872115 kubelet[2875]: I0527 17:57:07.872094 2875 server.go:449] "Adding debug handlers to kubelet server" May 27 17:57:07.877719 kubelet[2875]: I0527 17:57:07.877206 2875 factory.go:221] Registration of the containerd container factory successfully May 27 17:57:07.971555 kubelet[2875]: I0527 17:57:07.971402 2875 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 17:57:07.986885 kubelet[2875]: I0527 17:57:07.986831 2875 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 27 17:57:07.987295 kubelet[2875]: I0527 17:57:07.987268 2875 status_manager.go:217] "Starting to sync pod status with apiserver" May 27 17:57:07.988960 kubelet[2875]: I0527 17:57:07.988918 2875 kubelet.go:2321] "Starting kubelet main sync loop" May 27 17:57:07.989690 kubelet[2875]: E0527 17:57:07.989564 2875 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:57:08.065691 kubelet[2875]: I0527 17:57:08.064774 2875 cpu_manager.go:214] "Starting CPU manager" policy="none" May 27 17:57:08.065691 kubelet[2875]: I0527 17:57:08.065668 2875 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 27 17:57:08.066511 kubelet[2875]: I0527 17:57:08.065731 2875 state_mem.go:36] "Initialized new in-memory state store" May 27 17:57:08.066511 kubelet[2875]: I0527 17:57:08.065962 2875 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 17:57:08.066511 kubelet[2875]: I0527 17:57:08.065988 2875 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 17:57:08.066511 kubelet[2875]: I0527 17:57:08.066014 2875 policy_none.go:49] "None policy: Start" May 27 17:57:08.069500 kubelet[2875]: I0527 17:57:08.069433 2875 memory_manager.go:170] "Starting memorymanager" policy="None" May 27 17:57:08.069711 kubelet[2875]: I0527 17:57:08.069471 2875 state_mem.go:35] "Initializing new in-memory state store" May 27 17:57:08.070074 kubelet[2875]: I0527 17:57:08.070023 2875 state_mem.go:75] "Updated machine memory state" May 27 17:57:08.090649 kubelet[2875]: I0527 17:57:08.090607 2875 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 17:57:08.091038 kubelet[2875]: E0527 17:57:08.090820 2875 kubelet.go:2345] "Skipping pod synchronization" err="container runtime status check may not have completed yet" May 27 17:57:08.091359 kubelet[2875]: I0527 17:57:08.091339 
2875 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:57:08.091705 kubelet[2875]: I0527 17:57:08.091405 2875 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:57:08.093077 kubelet[2875]: I0527 17:57:08.093054 2875 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:57:08.214809 kubelet[2875]: I0527 17:57:08.214660 2875 kubelet_node_status.go:72] "Attempting to register node" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:08.227349 kubelet[2875]: I0527 17:57:08.227236 2875 kubelet_node_status.go:111] "Node was previously registered" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:08.227480 kubelet[2875]: I0527 17:57:08.227354 2875 kubelet_node_status.go:75] "Successfully registered node" node="srv-kh28t.gb1.brightbox.com" May 27 17:57:08.309215 kubelet[2875]: W0527 17:57:08.307859 2875 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 17:57:08.309215 kubelet[2875]: W0527 17:57:08.308148 2875 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 17:57:08.309626 kubelet[2875]: W0527 17:57:08.309604 2875 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 17:57:08.368271 kubelet[2875]: I0527 17:57:08.368221 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/006a239276d977e267f4c686534eb3fb-ca-certs\") pod \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" (UID: \"006a239276d977e267f4c686534eb3fb\") " pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368434 kubelet[2875]: I0527 17:57:08.368276 2875 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/006a239276d977e267f4c686534eb3fb-usr-share-ca-certificates\") pod \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" (UID: \"006a239276d977e267f4c686534eb3fb\") " pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368434 kubelet[2875]: I0527 17:57:08.368311 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-ca-certs\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368434 kubelet[2875]: I0527 17:57:08.368336 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-flexvolume-dir\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368434 kubelet[2875]: I0527 17:57:08.368361 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-k8s-certs\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368434 kubelet[2875]: I0527 17:57:08.368386 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-kubeconfig\") pod 
\"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368713 kubelet[2875]: I0527 17:57:08.368409 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ef6c3f6f71226197cc822829b47f8188-kubeconfig\") pod \"kube-scheduler-srv-kh28t.gb1.brightbox.com\" (UID: \"ef6c3f6f71226197cc822829b47f8188\") " pod="kube-system/kube-scheduler-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368713 kubelet[2875]: I0527 17:57:08.368434 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/006a239276d977e267f4c686534eb3fb-k8s-certs\") pod \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" (UID: \"006a239276d977e267f4c686534eb3fb\") " pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.368713 kubelet[2875]: I0527 17:57:08.368485 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7a85232708f894e82331fdfc7abd4a9e-usr-share-ca-certificates\") pod \"kube-controller-manager-srv-kh28t.gb1.brightbox.com\" (UID: \"7a85232708f894e82331fdfc7abd4a9e\") " pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" May 27 17:57:08.851472 kubelet[2875]: I0527 17:57:08.851409 2875 apiserver.go:52] "Watching apiserver" May 27 17:57:08.866874 kubelet[2875]: I0527 17:57:08.866650 2875 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" May 27 17:57:09.045689 kubelet[2875]: W0527 17:57:09.045633 2875 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 17:57:09.045898 kubelet[2875]: 
E0527 17:57:09.045719 2875 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-srv-kh28t.gb1.brightbox.com\" already exists" pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" May 27 17:57:09.076214 kubelet[2875]: I0527 17:57:09.076085 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-srv-kh28t.gb1.brightbox.com" podStartSLOduration=1.076055872 podStartE2EDuration="1.076055872s" podCreationTimestamp="2025-05-27 17:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:57:09.066009882 +0000 UTC m=+1.385654782" watchObservedRunningTime="2025-05-27 17:57:09.076055872 +0000 UTC m=+1.395700759" May 27 17:57:09.087473 kubelet[2875]: I0527 17:57:09.087423 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-srv-kh28t.gb1.brightbox.com" podStartSLOduration=1.0870694969999999 podStartE2EDuration="1.087069497s" podCreationTimestamp="2025-05-27 17:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:57:09.076365411 +0000 UTC m=+1.396010291" watchObservedRunningTime="2025-05-27 17:57:09.087069497 +0000 UTC m=+1.406714381" May 27 17:57:09.097282 kubelet[2875]: I0527 17:57:09.097229 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-srv-kh28t.gb1.brightbox.com" podStartSLOduration=1.0972154729999999 podStartE2EDuration="1.097215473s" podCreationTimestamp="2025-05-27 17:57:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:57:09.088206339 +0000 UTC m=+1.407851255" watchObservedRunningTime="2025-05-27 17:57:09.097215473 +0000 UTC m=+1.416860355" May 27 17:57:12.321213 
kubelet[2875]: I0527 17:57:12.320967 2875 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 17:57:12.321988 containerd[1585]: time="2025-05-27T17:57:12.321929484Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 17:57:12.322520 kubelet[2875]: I0527 17:57:12.322170 2875 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 17:57:13.380097 systemd[1]: Created slice kubepods-besteffort-pod41916ecd_6849_4767_9856_3611d7416f5b.slice - libcontainer container kubepods-besteffort-pod41916ecd_6849_4767_9856_3611d7416f5b.slice. May 27 17:57:13.454774 systemd[1]: Created slice kubepods-besteffort-podd1de1320_e662_423b_a8ad_80de68bf4ce4.slice - libcontainer container kubepods-besteffort-podd1de1320_e662_423b_a8ad_80de68bf4ce4.slice. May 27 17:57:13.497792 kubelet[2875]: I0527 17:57:13.497739 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxr2c\" (UniqueName: \"kubernetes.io/projected/d1de1320-e662-423b-a8ad-80de68bf4ce4-kube-api-access-hxr2c\") pod \"tigera-operator-7c5755cdcb-dmdw8\" (UID: \"d1de1320-e662-423b-a8ad-80de68bf4ce4\") " pod="tigera-operator/tigera-operator-7c5755cdcb-dmdw8" May 27 17:57:13.497792 kubelet[2875]: I0527 17:57:13.497800 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/41916ecd-6849-4767-9856-3611d7416f5b-kube-proxy\") pod \"kube-proxy-856kg\" (UID: \"41916ecd-6849-4767-9856-3611d7416f5b\") " pod="kube-system/kube-proxy-856kg" May 27 17:57:13.498425 kubelet[2875]: I0527 17:57:13.497831 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/41916ecd-6849-4767-9856-3611d7416f5b-xtables-lock\") pod 
\"kube-proxy-856kg\" (UID: \"41916ecd-6849-4767-9856-3611d7416f5b\") " pod="kube-system/kube-proxy-856kg" May 27 17:57:13.498425 kubelet[2875]: I0527 17:57:13.497860 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d1de1320-e662-423b-a8ad-80de68bf4ce4-var-lib-calico\") pod \"tigera-operator-7c5755cdcb-dmdw8\" (UID: \"d1de1320-e662-423b-a8ad-80de68bf4ce4\") " pod="tigera-operator/tigera-operator-7c5755cdcb-dmdw8" May 27 17:57:13.498425 kubelet[2875]: I0527 17:57:13.497887 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/41916ecd-6849-4767-9856-3611d7416f5b-lib-modules\") pod \"kube-proxy-856kg\" (UID: \"41916ecd-6849-4767-9856-3611d7416f5b\") " pod="kube-system/kube-proxy-856kg" May 27 17:57:13.498425 kubelet[2875]: I0527 17:57:13.497915 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnv7w\" (UniqueName: \"kubernetes.io/projected/41916ecd-6849-4767-9856-3611d7416f5b-kube-api-access-qnv7w\") pod \"kube-proxy-856kg\" (UID: \"41916ecd-6849-4767-9856-3611d7416f5b\") " pod="kube-system/kube-proxy-856kg" May 27 17:57:13.690586 containerd[1585]: time="2025-05-27T17:57:13.690514029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-856kg,Uid:41916ecd-6849-4767-9856-3611d7416f5b,Namespace:kube-system,Attempt:0,}" May 27 17:57:13.716956 containerd[1585]: time="2025-05-27T17:57:13.716843791Z" level=info msg="connecting to shim 10348c28acf8da324894a42987056e8caddd369953010b80be3ea9c247dd9bc1" address="unix:///run/containerd/s/6e43946468df1a74eee5956163623997aac87870d21981cd2fa43181495f3170" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:13.759020 systemd[1]: Started cri-containerd-10348c28acf8da324894a42987056e8caddd369953010b80be3ea9c247dd9bc1.scope - libcontainer 
container 10348c28acf8da324894a42987056e8caddd369953010b80be3ea9c247dd9bc1. May 27 17:57:13.760749 containerd[1585]: time="2025-05-27T17:57:13.760699086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-dmdw8,Uid:d1de1320-e662-423b-a8ad-80de68bf4ce4,Namespace:tigera-operator,Attempt:0,}" May 27 17:57:13.810904 containerd[1585]: time="2025-05-27T17:57:13.810845787Z" level=info msg="connecting to shim 25402e914b2a673f1abb8ac71490a629b12dde516718344b0238fc4262441f50" address="unix:///run/containerd/s/2ea09119f07aa0c55c2afbcd71354f34f25ee3a2171631edf46a66cd8b2452de" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:13.819929 containerd[1585]: time="2025-05-27T17:57:13.819879332Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-856kg,Uid:41916ecd-6849-4767-9856-3611d7416f5b,Namespace:kube-system,Attempt:0,} returns sandbox id \"10348c28acf8da324894a42987056e8caddd369953010b80be3ea9c247dd9bc1\"" May 27 17:57:13.825802 containerd[1585]: time="2025-05-27T17:57:13.825758735Z" level=info msg="CreateContainer within sandbox \"10348c28acf8da324894a42987056e8caddd369953010b80be3ea9c247dd9bc1\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 17:57:13.847477 containerd[1585]: time="2025-05-27T17:57:13.847404207Z" level=info msg="Container 23955361b043d8b92d017c30671c3586a719cd119c82ee4d8e6340572a1ba62e: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:13.858857 systemd[1]: Started cri-containerd-25402e914b2a673f1abb8ac71490a629b12dde516718344b0238fc4262441f50.scope - libcontainer container 25402e914b2a673f1abb8ac71490a629b12dde516718344b0238fc4262441f50. 
May 27 17:57:13.865030 containerd[1585]: time="2025-05-27T17:57:13.864935384Z" level=info msg="CreateContainer within sandbox \"10348c28acf8da324894a42987056e8caddd369953010b80be3ea9c247dd9bc1\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"23955361b043d8b92d017c30671c3586a719cd119c82ee4d8e6340572a1ba62e\"" May 27 17:57:13.866930 containerd[1585]: time="2025-05-27T17:57:13.866726079Z" level=info msg="StartContainer for \"23955361b043d8b92d017c30671c3586a719cd119c82ee4d8e6340572a1ba62e\"" May 27 17:57:13.870616 containerd[1585]: time="2025-05-27T17:57:13.870524978Z" level=info msg="connecting to shim 23955361b043d8b92d017c30671c3586a719cd119c82ee4d8e6340572a1ba62e" address="unix:///run/containerd/s/6e43946468df1a74eee5956163623997aac87870d21981cd2fa43181495f3170" protocol=ttrpc version=3 May 27 17:57:13.914965 systemd[1]: Started cri-containerd-23955361b043d8b92d017c30671c3586a719cd119c82ee4d8e6340572a1ba62e.scope - libcontainer container 23955361b043d8b92d017c30671c3586a719cd119c82ee4d8e6340572a1ba62e. 
May 27 17:57:13.962783 containerd[1585]: time="2025-05-27T17:57:13.962469196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7c5755cdcb-dmdw8,Uid:d1de1320-e662-423b-a8ad-80de68bf4ce4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"25402e914b2a673f1abb8ac71490a629b12dde516718344b0238fc4262441f50\"" May 27 17:57:13.967748 containerd[1585]: time="2025-05-27T17:57:13.967386071Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 17:57:14.006791 containerd[1585]: time="2025-05-27T17:57:14.006729947Z" level=info msg="StartContainer for \"23955361b043d8b92d017c30671c3586a719cd119c82ee4d8e6340572a1ba62e\" returns successfully" May 27 17:57:14.075506 kubelet[2875]: I0527 17:57:14.075432 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-856kg" podStartSLOduration=1.075383605 podStartE2EDuration="1.075383605s" podCreationTimestamp="2025-05-27 17:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:57:14.075067222 +0000 UTC m=+6.394712142" watchObservedRunningTime="2025-05-27 17:57:14.075383605 +0000 UTC m=+6.395028494" May 27 17:57:16.468729 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount33449023.mount: Deactivated successfully. 
May 27 17:57:19.256952 containerd[1585]: time="2025-05-27T17:57:19.255998256Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:19.258357 containerd[1585]: time="2025-05-27T17:57:19.258298343Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 27 17:57:19.258833 containerd[1585]: time="2025-05-27T17:57:19.258752656Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:19.262754 containerd[1585]: time="2025-05-27T17:57:19.262705638Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:19.263697 containerd[1585]: time="2025-05-27T17:57:19.263637152Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 5.296191613s" May 27 17:57:19.263860 containerd[1585]: time="2025-05-27T17:57:19.263815758Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 27 17:57:19.267471 containerd[1585]: time="2025-05-27T17:57:19.267428282Z" level=info msg="CreateContainer within sandbox \"25402e914b2a673f1abb8ac71490a629b12dde516718344b0238fc4262441f50\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 17:57:19.279692 containerd[1585]: time="2025-05-27T17:57:19.277802799Z" level=info msg="Container 
44cea32a747b4b20629a27d9d20eddada3bd1af3ff85e533f78b86d70816c714: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:19.283115 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4111607706.mount: Deactivated successfully. May 27 17:57:19.292527 containerd[1585]: time="2025-05-27T17:57:19.292450710Z" level=info msg="CreateContainer within sandbox \"25402e914b2a673f1abb8ac71490a629b12dde516718344b0238fc4262441f50\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"44cea32a747b4b20629a27d9d20eddada3bd1af3ff85e533f78b86d70816c714\"" May 27 17:57:19.295184 containerd[1585]: time="2025-05-27T17:57:19.294997527Z" level=info msg="StartContainer for \"44cea32a747b4b20629a27d9d20eddada3bd1af3ff85e533f78b86d70816c714\"" May 27 17:57:19.298892 containerd[1585]: time="2025-05-27T17:57:19.298853857Z" level=info msg="connecting to shim 44cea32a747b4b20629a27d9d20eddada3bd1af3ff85e533f78b86d70816c714" address="unix:///run/containerd/s/2ea09119f07aa0c55c2afbcd71354f34f25ee3a2171631edf46a66cd8b2452de" protocol=ttrpc version=3 May 27 17:57:19.329937 systemd[1]: Started cri-containerd-44cea32a747b4b20629a27d9d20eddada3bd1af3ff85e533f78b86d70816c714.scope - libcontainer container 44cea32a747b4b20629a27d9d20eddada3bd1af3ff85e533f78b86d70816c714. 
May 27 17:57:19.373762 containerd[1585]: time="2025-05-27T17:57:19.373655221Z" level=info msg="StartContainer for \"44cea32a747b4b20629a27d9d20eddada3bd1af3ff85e533f78b86d70816c714\" returns successfully" May 27 17:57:20.098152 kubelet[2875]: I0527 17:57:20.097658 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7c5755cdcb-dmdw8" podStartSLOduration=1.798584451 podStartE2EDuration="7.097628681s" podCreationTimestamp="2025-05-27 17:57:13 +0000 UTC" firstStartedPulling="2025-05-27 17:57:13.965786152 +0000 UTC m=+6.285431035" lastFinishedPulling="2025-05-27 17:57:19.26483039 +0000 UTC m=+11.584475265" observedRunningTime="2025-05-27 17:57:20.096441747 +0000 UTC m=+12.416086641" watchObservedRunningTime="2025-05-27 17:57:20.097628681 +0000 UTC m=+12.417273570" May 27 17:57:24.638685 sudo[1895]: pam_unix(sudo:session): session closed for user root May 27 17:57:24.782475 sshd[1894]: Connection closed by 139.178.68.195 port 53476 May 27 17:57:24.784128 sshd-session[1892]: pam_unix(sshd:session): session closed for user core May 27 17:57:24.793586 systemd[1]: sshd@8-10.230.41.6:22-139.178.68.195:53476.service: Deactivated successfully. May 27 17:57:24.794736 systemd-logind[1561]: Session 11 logged out. Waiting for processes to exit. May 27 17:57:24.799131 systemd[1]: session-11.scope: Deactivated successfully. May 27 17:57:24.801012 systemd[1]: session-11.scope: Consumed 7.004s CPU time, 154.6M memory peak. May 27 17:57:24.808882 systemd-logind[1561]: Removed session 11. May 27 17:57:29.037158 systemd[1]: Created slice kubepods-besteffort-pod11d971d3_b497_4e9e_89fc_4bd9fe15c0d7.slice - libcontainer container kubepods-besteffort-pod11d971d3_b497_4e9e_89fc_4bd9fe15c0d7.slice. 
May 27 17:57:29.107897 kubelet[2875]: I0527 17:57:29.107833 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11d971d3-b497-4e9e-89fc-4bd9fe15c0d7-tigera-ca-bundle\") pod \"calico-typha-584667c759-5wrmz\" (UID: \"11d971d3-b497-4e9e-89fc-4bd9fe15c0d7\") " pod="calico-system/calico-typha-584667c759-5wrmz" May 27 17:57:29.107897 kubelet[2875]: I0527 17:57:29.107901 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/11d971d3-b497-4e9e-89fc-4bd9fe15c0d7-typha-certs\") pod \"calico-typha-584667c759-5wrmz\" (UID: \"11d971d3-b497-4e9e-89fc-4bd9fe15c0d7\") " pod="calico-system/calico-typha-584667c759-5wrmz" May 27 17:57:29.108583 kubelet[2875]: I0527 17:57:29.107936 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v8zh\" (UniqueName: \"kubernetes.io/projected/11d971d3-b497-4e9e-89fc-4bd9fe15c0d7-kube-api-access-8v8zh\") pod \"calico-typha-584667c759-5wrmz\" (UID: \"11d971d3-b497-4e9e-89fc-4bd9fe15c0d7\") " pod="calico-system/calico-typha-584667c759-5wrmz" May 27 17:57:29.345920 containerd[1585]: time="2025-05-27T17:57:29.345495522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-584667c759-5wrmz,Uid:11d971d3-b497-4e9e-89fc-4bd9fe15c0d7,Namespace:calico-system,Attempt:0,}" May 27 17:57:29.422796 containerd[1585]: time="2025-05-27T17:57:29.422590554Z" level=info msg="connecting to shim ed36c80ee570e005b9689b9a0533be7354011159d4a659e465518be4781632ee" address="unix:///run/containerd/s/e1251b8f0a1bb998e47807020e7e0a54cf5b7d0fdd750cb915711ccda8103edd" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:29.459332 systemd[1]: Created slice kubepods-besteffort-pod0d511a75_9fd3_4715_b75c_96f49c5ed7b0.slice - libcontainer container 
kubepods-besteffort-pod0d511a75_9fd3_4715_b75c_96f49c5ed7b0.slice. May 27 17:57:29.496866 systemd[1]: Started cri-containerd-ed36c80ee570e005b9689b9a0533be7354011159d4a659e465518be4781632ee.scope - libcontainer container ed36c80ee570e005b9689b9a0533be7354011159d4a659e465518be4781632ee. May 27 17:57:29.592010 containerd[1585]: time="2025-05-27T17:57:29.591948719Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-584667c759-5wrmz,Uid:11d971d3-b497-4e9e-89fc-4bd9fe15c0d7,Namespace:calico-system,Attempt:0,} returns sandbox id \"ed36c80ee570e005b9689b9a0533be7354011159d4a659e465518be4781632ee\"" May 27 17:57:29.594241 containerd[1585]: time="2025-05-27T17:57:29.594194503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 17:57:29.613705 kubelet[2875]: I0527 17:57:29.612144 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-var-lib-calico\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613705 kubelet[2875]: I0527 17:57:29.612246 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-cni-log-dir\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613705 kubelet[2875]: I0527 17:57:29.612326 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-node-certs\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613705 kubelet[2875]: I0527 17:57:29.612361 2875 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-xtables-lock\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613705 kubelet[2875]: I0527 17:57:29.612434 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-lib-modules\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613993 kubelet[2875]: I0527 17:57:29.612508 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-tigera-ca-bundle\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613993 kubelet[2875]: I0527 17:57:29.612587 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwbd8\" (UniqueName: \"kubernetes.io/projected/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-kube-api-access-cwbd8\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613993 kubelet[2875]: I0527 17:57:29.612719 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-cni-bin-dir\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613993 kubelet[2875]: I0527 17:57:29.612811 2875 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-var-run-calico\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.613993 kubelet[2875]: I0527 17:57:29.612876 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-flexvol-driver-host\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.614192 kubelet[2875]: I0527 17:57:29.612906 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-policysync\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.614192 kubelet[2875]: I0527 17:57:29.612979 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0d511a75-9fd3-4715-b75c-96f49c5ed7b0-cni-net-dir\") pod \"calico-node-8l4vr\" (UID: \"0d511a75-9fd3-4715-b75c-96f49c5ed7b0\") " pod="calico-system/calico-node-8l4vr" May 27 17:57:29.662098 kubelet[2875]: E0527 17:57:29.661602 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:29.715121 kubelet[2875]: E0527 17:57:29.715052 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end 
of JSON input May 27 17:57:29.715121 kubelet[2875]: W0527 17:57:29.715080 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.715414 kubelet[2875]: E0527 17:57:29.715133 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.715414 kubelet[2875]: E0527 17:57:29.715377 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.715414 kubelet[2875]: W0527 17:57:29.715390 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.716016 kubelet[2875]: E0527 17:57:29.715954 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.717841 kubelet[2875]: E0527 17:57:29.717819 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.718726 kubelet[2875]: W0527 17:57:29.718703 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.718918 kubelet[2875]: E0527 17:57:29.718867 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Error: unexpected end of JSON input" May 27 17:57:29.759545 kubelet[2875]: E0527 17:57:29.759481 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.759545 kubelet[2875]: W0527 17:57:29.759501 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.759824 kubelet[2875]: E0527 17:57:29.759705 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.759882 kubelet[2875]: E0527 17:57:29.759830 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.759882 kubelet[2875]: W0527 17:57:29.759846 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.759981 kubelet[2875]: E0527 17:57:29.759938 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.760250 kubelet[2875]: E0527 17:57:29.760227 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.760250 kubelet[2875]: W0527 17:57:29.760248 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.760355 kubelet[2875]: E0527 17:57:29.760270 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.761082 kubelet[2875]: E0527 17:57:29.760934 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.761082 kubelet[2875]: W0527 17:57:29.760955 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.761082 kubelet[2875]: E0527 17:57:29.760973 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.761372 kubelet[2875]: E0527 17:57:29.761353 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.761492 kubelet[2875]: W0527 17:57:29.761473 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.761606 kubelet[2875]: E0527 17:57:29.761589 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.761941 kubelet[2875]: E0527 17:57:29.761922 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.762040 kubelet[2875]: W0527 17:57:29.762021 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.762173 kubelet[2875]: E0527 17:57:29.762152 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.762573 kubelet[2875]: E0527 17:57:29.762429 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.762573 kubelet[2875]: W0527 17:57:29.762447 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.762573 kubelet[2875]: E0527 17:57:29.762474 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.763292 kubelet[2875]: E0527 17:57:29.763273 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.763547 kubelet[2875]: W0527 17:57:29.763381 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.763547 kubelet[2875]: E0527 17:57:29.763414 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.763909 kubelet[2875]: E0527 17:57:29.763736 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.763909 kubelet[2875]: W0527 17:57:29.763750 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.763909 kubelet[2875]: E0527 17:57:29.763781 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.765362 kubelet[2875]: E0527 17:57:29.764909 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.765362 kubelet[2875]: W0527 17:57:29.764927 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.765362 kubelet[2875]: E0527 17:57:29.764950 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.765708 kubelet[2875]: E0527 17:57:29.765662 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.765809 kubelet[2875]: W0527 17:57:29.765790 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.766077 kubelet[2875]: E0527 17:57:29.765887 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.766817 kubelet[2875]: E0527 17:57:29.766798 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.766928 kubelet[2875]: W0527 17:57:29.766907 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.767083 kubelet[2875]: E0527 17:57:29.767031 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.767495 kubelet[2875]: E0527 17:57:29.767476 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.768983 kubelet[2875]: W0527 17:57:29.768691 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.768983 kubelet[2875]: E0527 17:57:29.768717 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.769180 kubelet[2875]: E0527 17:57:29.769162 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.769278 kubelet[2875]: W0527 17:57:29.769258 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.769373 kubelet[2875]: E0527 17:57:29.769354 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.769865 kubelet[2875]: E0527 17:57:29.769688 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.769865 kubelet[2875]: W0527 17:57:29.769706 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.769865 kubelet[2875]: E0527 17:57:29.769720 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.770144 kubelet[2875]: E0527 17:57:29.770125 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.770229 kubelet[2875]: W0527 17:57:29.770210 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.770321 kubelet[2875]: E0527 17:57:29.770303 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.772091 kubelet[2875]: E0527 17:57:29.771809 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.772091 kubelet[2875]: W0527 17:57:29.771831 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.772091 kubelet[2875]: E0527 17:57:29.771849 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.772332 kubelet[2875]: E0527 17:57:29.772313 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.772447 kubelet[2875]: W0527 17:57:29.772418 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.772577 kubelet[2875]: E0527 17:57:29.772547 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.773088 kubelet[2875]: E0527 17:57:29.772981 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.773208 kubelet[2875]: W0527 17:57:29.773187 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.773354 kubelet[2875]: E0527 17:57:29.773291 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.774237 kubelet[2875]: E0527 17:57:29.774093 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.774237 kubelet[2875]: W0527 17:57:29.774112 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.774237 kubelet[2875]: E0527 17:57:29.774128 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.776801 kubelet[2875]: E0527 17:57:29.776780 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.778687 kubelet[2875]: W0527 17:57:29.776902 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.778687 kubelet[2875]: E0527 17:57:29.776928 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.794785 kubelet[2875]: E0527 17:57:29.794752 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.794966 kubelet[2875]: W0527 17:57:29.794944 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.795078 kubelet[2875]: E0527 17:57:29.795058 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.814852 kubelet[2875]: E0527 17:57:29.814814 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.814852 kubelet[2875]: W0527 17:57:29.814846 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.815744 kubelet[2875]: E0527 17:57:29.814874 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.815744 kubelet[2875]: I0527 17:57:29.814915 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c96b9b6-32e2-4ed2-8acd-c7b5982abac8-socket-dir\") pod \"csi-node-driver-9t7wk\" (UID: \"9c96b9b6-32e2-4ed2-8acd-c7b5982abac8\") " pod="calico-system/csi-node-driver-9t7wk" May 27 17:57:29.815744 kubelet[2875]: E0527 17:57:29.815199 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.815744 kubelet[2875]: W0527 17:57:29.815215 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.816736 kubelet[2875]: E0527 17:57:29.816707 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.816962 kubelet[2875]: I0527 17:57:29.816895 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c96b9b6-32e2-4ed2-8acd-c7b5982abac8-registration-dir\") pod \"csi-node-driver-9t7wk\" (UID: \"9c96b9b6-32e2-4ed2-8acd-c7b5982abac8\") " pod="calico-system/csi-node-driver-9t7wk" May 27 17:57:29.817044 kubelet[2875]: E0527 17:57:29.817024 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.817113 kubelet[2875]: W0527 17:57:29.817052 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.817113 kubelet[2875]: E0527 17:57:29.817073 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.817362 kubelet[2875]: E0527 17:57:29.817332 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.817362 kubelet[2875]: W0527 17:57:29.817352 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.817472 kubelet[2875]: E0527 17:57:29.817395 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.817704 kubelet[2875]: E0527 17:57:29.817684 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.817704 kubelet[2875]: W0527 17:57:29.817698 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.817808 kubelet[2875]: E0527 17:57:29.817729 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.817808 kubelet[2875]: I0527 17:57:29.817755 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/9c96b9b6-32e2-4ed2-8acd-c7b5982abac8-varrun\") pod \"csi-node-driver-9t7wk\" (UID: \"9c96b9b6-32e2-4ed2-8acd-c7b5982abac8\") " pod="calico-system/csi-node-driver-9t7wk" May 27 17:57:29.819856 kubelet[2875]: E0527 17:57:29.819828 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.819856 kubelet[2875]: W0527 17:57:29.819851 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.819989 kubelet[2875]: E0527 17:57:29.819888 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.820167 kubelet[2875]: E0527 17:57:29.820147 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.820167 kubelet[2875]: W0527 17:57:29.820165 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.821922 kubelet[2875]: E0527 17:57:29.821886 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.822895 kubelet[2875]: E0527 17:57:29.822824 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.822895 kubelet[2875]: W0527 17:57:29.822845 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.823713 kubelet[2875]: E0527 17:57:29.823688 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.823794 kubelet[2875]: I0527 17:57:29.823725 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c96b9b6-32e2-4ed2-8acd-c7b5982abac8-kubelet-dir\") pod \"csi-node-driver-9t7wk\" (UID: \"9c96b9b6-32e2-4ed2-8acd-c7b5982abac8\") " pod="calico-system/csi-node-driver-9t7wk" May 27 17:57:29.824970 kubelet[2875]: E0527 17:57:29.824761 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.824970 kubelet[2875]: W0527 17:57:29.824782 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.824970 kubelet[2875]: E0527 17:57:29.824807 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.825216 kubelet[2875]: E0527 17:57:29.825110 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.825216 kubelet[2875]: W0527 17:57:29.825123 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.825216 kubelet[2875]: E0527 17:57:29.825138 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.825376 kubelet[2875]: E0527 17:57:29.825347 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.825376 kubelet[2875]: W0527 17:57:29.825360 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.825459 kubelet[2875]: E0527 17:57:29.825379 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.825459 kubelet[2875]: I0527 17:57:29.825404 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6wcd\" (UniqueName: \"kubernetes.io/projected/9c96b9b6-32e2-4ed2-8acd-c7b5982abac8-kube-api-access-w6wcd\") pod \"csi-node-driver-9t7wk\" (UID: \"9c96b9b6-32e2-4ed2-8acd-c7b5982abac8\") " pod="calico-system/csi-node-driver-9t7wk" May 27 17:57:29.825798 kubelet[2875]: E0527 17:57:29.825743 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.825798 kubelet[2875]: W0527 17:57:29.825765 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.827338 kubelet[2875]: E0527 17:57:29.827313 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.827338 kubelet[2875]: W0527 17:57:29.827334 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found 
in $PATH, output: "" May 27 17:57:29.827460 kubelet[2875]: E0527 17:57:29.827350 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.827509 kubelet[2875]: E0527 17:57:29.827496 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.827767 kubelet[2875]: E0527 17:57:29.827734 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.827767 kubelet[2875]: W0527 17:57:29.827753 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.827767 kubelet[2875]: E0527 17:57:29.827769 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.828199 kubelet[2875]: E0527 17:57:29.828105 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.828199 kubelet[2875]: W0527 17:57:29.828126 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.828199 kubelet[2875]: E0527 17:57:29.828141 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.928195 kubelet[2875]: E0527 17:57:29.928156 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.928195 kubelet[2875]: W0527 17:57:29.928186 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.928689 kubelet[2875]: E0527 17:57:29.928211 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.929370 kubelet[2875]: E0527 17:57:29.928754 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.929370 kubelet[2875]: W0527 17:57:29.928768 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.929370 kubelet[2875]: E0527 17:57:29.928783 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.929527 kubelet[2875]: E0527 17:57:29.929389 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.929527 kubelet[2875]: W0527 17:57:29.929411 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.929527 kubelet[2875]: E0527 17:57:29.929428 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.931395 kubelet[2875]: E0527 17:57:29.929691 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.931395 kubelet[2875]: W0527 17:57:29.929710 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.931395 kubelet[2875]: E0527 17:57:29.929725 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.931395 kubelet[2875]: E0527 17:57:29.929944 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.931395 kubelet[2875]: W0527 17:57:29.929955 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.931395 kubelet[2875]: E0527 17:57:29.929967 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.931395 kubelet[2875]: E0527 17:57:29.930227 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.931395 kubelet[2875]: W0527 17:57:29.930240 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.932655 kubelet[2875]: E0527 17:57:29.932341 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.932655 kubelet[2875]: W0527 17:57:29.932373 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.934053 kubelet[2875]: E0527 17:57:29.933504 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.934053 kubelet[2875]: W0527 17:57:29.933518 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" May 27 17:57:29.934053 kubelet[2875]: E0527 17:57:29.933533 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.934842 kubelet[2875]: E0527 17:57:29.934803 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.934842 kubelet[2875]: W0527 17:57:29.934833 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.934967 kubelet[2875]: E0527 17:57:29.934849 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.935594 kubelet[2875]: E0527 17:57:29.935572 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.935794 kubelet[2875]: W0527 17:57:29.935604 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.935794 kubelet[2875]: E0527 17:57:29.935633 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.936237 kubelet[2875]: E0527 17:57:29.936111 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.936358 kubelet[2875]: E0527 17:57:29.936149 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.936421 kubelet[2875]: E0527 17:57:29.936292 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.936421 kubelet[2875]: W0527 17:57:29.936401 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.936421 kubelet[2875]: E0527 17:57:29.936417 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.937143 kubelet[2875]: E0527 17:57:29.937122 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.937143 kubelet[2875]: W0527 17:57:29.937140 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.937249 kubelet[2875]: E0527 17:57:29.937170 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.937703 kubelet[2875]: E0527 17:57:29.937541 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.937703 kubelet[2875]: W0527 17:57:29.937591 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.937703 kubelet[2875]: E0527 17:57:29.937699 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.938065 kubelet[2875]: E0527 17:57:29.937944 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.938065 kubelet[2875]: W0527 17:57:29.937957 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.938141 kubelet[2875]: E0527 17:57:29.938071 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.938553 kubelet[2875]: E0527 17:57:29.938272 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.938553 kubelet[2875]: W0527 17:57:29.938289 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.938553 kubelet[2875]: E0527 17:57:29.938381 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.939115 kubelet[2875]: E0527 17:57:29.938989 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.939115 kubelet[2875]: W0527 17:57:29.939003 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.939191 kubelet[2875]: E0527 17:57:29.939132 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.939898 kubelet[2875]: E0527 17:57:29.939854 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.939898 kubelet[2875]: W0527 17:57:29.939868 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.940011 kubelet[2875]: E0527 17:57:29.939959 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.940715 kubelet[2875]: E0527 17:57:29.940692 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.940715 kubelet[2875]: W0527 17:57:29.940712 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.940833 kubelet[2875]: E0527 17:57:29.940805 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.941742 kubelet[2875]: E0527 17:57:29.941716 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.941742 kubelet[2875]: W0527 17:57:29.941737 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.941843 kubelet[2875]: E0527 17:57:29.941799 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.942253 kubelet[2875]: E0527 17:57:29.942227 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.942253 kubelet[2875]: W0527 17:57:29.942247 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.942410 kubelet[2875]: E0527 17:57:29.942382 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.942908 kubelet[2875]: E0527 17:57:29.942881 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.942964 kubelet[2875]: W0527 17:57:29.942936 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.943243 kubelet[2875]: E0527 17:57:29.943024 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.943703 kubelet[2875]: E0527 17:57:29.943683 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.943703 kubelet[2875]: W0527 17:57:29.943701 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.943820 kubelet[2875]: E0527 17:57:29.943807 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.944394 kubelet[2875]: E0527 17:57:29.944372 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.944451 kubelet[2875]: W0527 17:57:29.944412 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.944729 kubelet[2875]: E0527 17:57:29.944505 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.944983 kubelet[2875]: E0527 17:57:29.944957 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.944983 kubelet[2875]: W0527 17:57:29.944977 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.945149 kubelet[2875]: E0527 17:57:29.945111 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:29.945542 kubelet[2875]: E0527 17:57:29.945506 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.945542 kubelet[2875]: W0527 17:57:29.945536 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.945662 kubelet[2875]: E0527 17:57:29.945551 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:29.959762 kubelet[2875]: E0527 17:57:29.959732 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:29.959876 kubelet[2875]: W0527 17:57:29.959755 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:29.959876 kubelet[2875]: E0527 17:57:29.959803 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:30.075496 containerd[1585]: time="2025-05-27T17:57:30.075444524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8l4vr,Uid:0d511a75-9fd3-4715-b75c-96f49c5ed7b0,Namespace:calico-system,Attempt:0,}" May 27 17:57:30.118651 containerd[1585]: time="2025-05-27T17:57:30.118536936Z" level=info msg="connecting to shim 0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73" address="unix:///run/containerd/s/805f2409e6c84d303a22ce7a7c38ae3fe8299dad14d19b7cb2216fc58a0f94c2" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:30.151843 systemd[1]: Started cri-containerd-0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73.scope - libcontainer container 0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73. May 27 17:57:30.218890 containerd[1585]: time="2025-05-27T17:57:30.218078495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8l4vr,Uid:0d511a75-9fd3-4715-b75c-96f49c5ed7b0,Namespace:calico-system,Attempt:0,} returns sandbox id \"0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73\"" May 27 17:57:30.990166 kubelet[2875]: E0527 17:57:30.990060 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:31.729599 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2954014042.mount: Deactivated successfully. 
May 27 17:57:32.777508 containerd[1585]: time="2025-05-27T17:57:32.777447905Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:32.778695 containerd[1585]: time="2025-05-27T17:57:32.778430572Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 27 17:57:32.779656 containerd[1585]: time="2025-05-27T17:57:32.779607936Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:32.782153 containerd[1585]: time="2025-05-27T17:57:32.782111251Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:32.782938 containerd[1585]: time="2025-05-27T17:57:32.782902236Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 3.18866526s" May 27 17:57:32.783009 containerd[1585]: time="2025-05-27T17:57:32.782943782Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 27 17:57:32.797783 containerd[1585]: time="2025-05-27T17:57:32.797131501Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 17:57:32.832311 containerd[1585]: time="2025-05-27T17:57:32.832264098Z" level=info msg="CreateContainer within sandbox \"ed36c80ee570e005b9689b9a0533be7354011159d4a659e465518be4781632ee\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 17:57:32.848962 containerd[1585]: time="2025-05-27T17:57:32.848910383Z" level=info msg="Container a47b0ea745993a18518e732a859542cfc7a0bea4b8b57743680f032b9ddd61b4: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:32.857417 containerd[1585]: time="2025-05-27T17:57:32.857275647Z" level=info msg="CreateContainer within sandbox \"ed36c80ee570e005b9689b9a0533be7354011159d4a659e465518be4781632ee\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a47b0ea745993a18518e732a859542cfc7a0bea4b8b57743680f032b9ddd61b4\"" May 27 17:57:32.861767 containerd[1585]: time="2025-05-27T17:57:32.861725686Z" level=info msg="StartContainer for \"a47b0ea745993a18518e732a859542cfc7a0bea4b8b57743680f032b9ddd61b4\"" May 27 17:57:32.863819 containerd[1585]: time="2025-05-27T17:57:32.863786235Z" level=info msg="connecting to shim a47b0ea745993a18518e732a859542cfc7a0bea4b8b57743680f032b9ddd61b4" address="unix:///run/containerd/s/e1251b8f0a1bb998e47807020e7e0a54cf5b7d0fdd750cb915711ccda8103edd" protocol=ttrpc version=3 May 27 17:57:32.902851 systemd[1]: Started cri-containerd-a47b0ea745993a18518e732a859542cfc7a0bea4b8b57743680f032b9ddd61b4.scope - libcontainer container a47b0ea745993a18518e732a859542cfc7a0bea4b8b57743680f032b9ddd61b4. 
May 27 17:57:32.981241 containerd[1585]: time="2025-05-27T17:57:32.981189160Z" level=info msg="StartContainer for \"a47b0ea745993a18518e732a859542cfc7a0bea4b8b57743680f032b9ddd61b4\" returns successfully" May 27 17:57:32.999687 kubelet[2875]: E0527 17:57:32.999591 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:33.213943 kubelet[2875]: E0527 17:57:33.213699 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.213943 kubelet[2875]: W0527 17:57:33.213736 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.213943 kubelet[2875]: E0527 17:57:33.213766 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:33.214309 kubelet[2875]: E0527 17:57:33.214280 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.214479 kubelet[2875]: W0527 17:57:33.214411 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.214479 kubelet[2875]: E0527 17:57:33.214437 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:33.215000 kubelet[2875]: E0527 17:57:33.214981 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.215166 kubelet[2875]: W0527 17:57:33.215095 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.215166 kubelet[2875]: E0527 17:57:33.215120 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:33.215607 kubelet[2875]: E0527 17:57:33.215589 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.215809 kubelet[2875]: W0527 17:57:33.215710 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.215809 kubelet[2875]: E0527 17:57:33.215730 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:33.216181 kubelet[2875]: E0527 17:57:33.216163 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.216712 kubelet[2875]: W0527 17:57:33.216270 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.216712 kubelet[2875]: E0527 17:57:33.216314 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:33.217161 kubelet[2875]: E0527 17:57:33.217142 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.217323 kubelet[2875]: W0527 17:57:33.217254 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.217323 kubelet[2875]: E0527 17:57:33.217279 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:33.218028 kubelet[2875]: E0527 17:57:33.217982 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.218292 kubelet[2875]: W0527 17:57:33.218001 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.218292 kubelet[2875]: E0527 17:57:33.218222 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:33.219090 kubelet[2875]: E0527 17:57:33.219021 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.219382 kubelet[2875]: W0527 17:57:33.219239 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.219382 kubelet[2875]: E0527 17:57:33.219270 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:33.220761 kubelet[2875]: E0527 17:57:33.220737 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.220761 kubelet[2875]: W0527 17:57:33.220759 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.220894 kubelet[2875]: E0527 17:57:33.220777 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:33.222606 kubelet[2875]: I0527 17:57:33.220445 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-584667c759-5wrmz" podStartSLOduration=1.017894408 podStartE2EDuration="4.220430217s" podCreationTimestamp="2025-05-27 17:57:29 +0000 UTC" firstStartedPulling="2025-05-27 17:57:29.593689205 +0000 UTC m=+21.913334091" lastFinishedPulling="2025-05-27 17:57:32.79622502 +0000 UTC m=+25.115869900" observedRunningTime="2025-05-27 17:57:33.218305184 +0000 UTC m=+25.537950097" watchObservedRunningTime="2025-05-27 17:57:33.220430217 +0000 UTC m=+25.540075104" May 27 17:57:33.222924 kubelet[2875]: E0527 17:57:33.222868 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.223147 kubelet[2875]: W0527 17:57:33.222991 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.223147 kubelet[2875]: E0527 17:57:33.223016 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:33.223512 kubelet[2875]: E0527 17:57:33.223484 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.223512 kubelet[2875]: W0527 17:57:33.223511 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.223655 kubelet[2875]: E0527 17:57:33.223528 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:33.224016 kubelet[2875]: E0527 17:57:33.223993 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.224016 kubelet[2875]: W0527 17:57:33.224013 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.224461 kubelet[2875]: E0527 17:57:33.224030 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:33.224723 kubelet[2875]: E0527 17:57:33.224697 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.224723 kubelet[2875]: W0527 17:57:33.224720 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.224877 kubelet[2875]: E0527 17:57:33.224737 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:33.225308 kubelet[2875]: E0527 17:57:33.225277 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:33.225308 kubelet[2875]: W0527 17:57:33.225298 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:33.225404 kubelet[2875]: E0527 17:57:33.225314 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:57:33.225825 kubelet[2875]: E0527 17:57:33.225660 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 17:57:33.225933 kubelet[2875]: W0527 17:57:33.225826 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 17:57:33.225933 kubelet[2875]: E0527 17:57:33.225843 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 17:57:34.153912 kubelet[2875]: I0527 17:57:34.153852 2875 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
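The repeated kubelet entries above are one FlexVolume probe failure logged over and over: kubelet executes the driver binary with the `init` argument and parses its stdout as JSON, so a `uds` binary that is missing from `nodeagent~uds` produces empty output and the "unexpected end of JSON input" unmarshal error. A minimal sketch of that contract, using a temporary stand-in for the plugin directory (the path and driver name are taken from the log; the JSON shape follows the FlexVolume driver convention, and the real binary is presumably installed by the flexvol-driver container created further down):

```shell
# Stand-in for /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds
plugin_dir="$(mktemp -d)/nodeagent~uds"
mkdir -p "$plugin_dir"

# An empty executable produces no output -- the failure mode kubelet logs:
: > "$plugin_dir/uds"
chmod +x "$plugin_dir/uds"
echo "empty driver says: [$("$plugin_dir/uds" init)]"

# A conforming driver answers "init" with a JSON status object on stdout:
cat > "$plugin_dir/uds" <<'EOF'
#!/bin/sh
case "$1" in
  init) echo '{"status":"Success","capabilities":{"attach":false}}' ;;
  *)    echo '{"status":"Not supported"}' ;;
esac
EOF
chmod +x "$plugin_dir/uds"
"$plugin_dir/uds" init
```

Once a driver like this exists at the probed path and emits valid JSON for `init`, the unmarshal errors stop.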
Error: unexpected end of JSON input" May 27 17:57:34.271869 kubelet[2875]: E0527 17:57:34.271771 2875 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:57:34.271869 kubelet[2875]: W0527 17:57:34.271786 2875 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:57:34.271869 kubelet[2875]: E0527 17:57:34.271800 2875 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:57:34.412099 containerd[1585]: time="2025-05-27T17:57:34.411940743Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:34.418068 containerd[1585]: time="2025-05-27T17:57:34.416123447Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 27 17:57:34.418068 containerd[1585]: time="2025-05-27T17:57:34.417125129Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:34.421719 containerd[1585]: time="2025-05-27T17:57:34.420853759Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:34.433796 containerd[1585]: time="2025-05-27T17:57:34.433723971Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 1.636532167s" May 27 17:57:34.433796 containerd[1585]: time="2025-05-27T17:57:34.433790957Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 27 17:57:34.445618 containerd[1585]: time="2025-05-27T17:57:34.445541232Z" level=info msg="CreateContainer within sandbox \"0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 17:57:34.495768 containerd[1585]: time="2025-05-27T17:57:34.494002370Z" level=info msg="Container b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:34.501231 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount197501850.mount: Deactivated successfully. 
May 27 17:57:34.506471 containerd[1585]: time="2025-05-27T17:57:34.506431014Z" level=info msg="CreateContainer within sandbox \"0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb\"" May 27 17:57:34.508298 containerd[1585]: time="2025-05-27T17:57:34.507379542Z" level=info msg="StartContainer for \"b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb\"" May 27 17:57:34.510055 containerd[1585]: time="2025-05-27T17:57:34.510011335Z" level=info msg="connecting to shim b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb" address="unix:///run/containerd/s/805f2409e6c84d303a22ce7a7c38ae3fe8299dad14d19b7cb2216fc58a0f94c2" protocol=ttrpc version=3 May 27 17:57:34.540871 systemd[1]: Started cri-containerd-b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb.scope - libcontainer container b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb. May 27 17:57:34.616562 containerd[1585]: time="2025-05-27T17:57:34.616432807Z" level=info msg="StartContainer for \"b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb\" returns successfully" May 27 17:57:34.634196 systemd[1]: cri-containerd-b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb.scope: Deactivated successfully. 
May 27 17:57:34.667371 containerd[1585]: time="2025-05-27T17:57:34.666458697Z" level=info msg="received exit event container_id:\"b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb\" id:\"b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb\" pid:3614 exited_at:{seconds:1748368654 nanos:637560866}" May 27 17:57:34.668760 containerd[1585]: time="2025-05-27T17:57:34.668716424Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb\" id:\"b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb\" pid:3614 exited_at:{seconds:1748368654 nanos:637560866}" May 27 17:57:34.702930 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b327134a7797f092ce83a202d1a90554162385153d6015eb8a79a1eb33ca42fb-rootfs.mount: Deactivated successfully. May 27 17:57:34.990686 kubelet[2875]: E0527 17:57:34.990475 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:35.161870 containerd[1585]: time="2025-05-27T17:57:35.161736901Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 17:57:36.990341 kubelet[2875]: E0527 17:57:36.990181 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:38.990092 kubelet[2875]: E0527 17:57:38.990018 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:39.859706 containerd[1585]: time="2025-05-27T17:57:39.859418702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:39.861778 containerd[1585]: time="2025-05-27T17:57:39.861738888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 27 17:57:39.862419 containerd[1585]: time="2025-05-27T17:57:39.862353031Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:39.865934 containerd[1585]: time="2025-05-27T17:57:39.865896053Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:39.866948 containerd[1585]: time="2025-05-27T17:57:39.866768659Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 4.70493453s" May 27 17:57:39.866948 containerd[1585]: time="2025-05-27T17:57:39.866808606Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 27 17:57:39.872511 containerd[1585]: time="2025-05-27T17:57:39.871812147Z" level=info msg="CreateContainer within sandbox \"0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73\" for container 
&ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 17:57:39.884937 containerd[1585]: time="2025-05-27T17:57:39.884104658Z" level=info msg="Container 2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:39.902903 containerd[1585]: time="2025-05-27T17:57:39.902855539Z" level=info msg="CreateContainer within sandbox \"0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b\"" May 27 17:57:39.904056 containerd[1585]: time="2025-05-27T17:57:39.904021992Z" level=info msg="StartContainer for \"2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b\"" May 27 17:57:39.906162 containerd[1585]: time="2025-05-27T17:57:39.906128335Z" level=info msg="connecting to shim 2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b" address="unix:///run/containerd/s/805f2409e6c84d303a22ce7a7c38ae3fe8299dad14d19b7cb2216fc58a0f94c2" protocol=ttrpc version=3 May 27 17:57:39.939901 systemd[1]: Started cri-containerd-2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b.scope - libcontainer container 2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b. May 27 17:57:40.006960 containerd[1585]: time="2025-05-27T17:57:40.006901378Z" level=info msg="StartContainer for \"2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b\" returns successfully" May 27 17:57:40.938143 systemd[1]: cri-containerd-2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b.scope: Deactivated successfully. May 27 17:57:40.938809 systemd[1]: cri-containerd-2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b.scope: Consumed 671ms CPU time, 166M memory peak, 12.3M read from disk, 170.9M written to disk. 
May 27 17:57:40.990478 kubelet[2875]: E0527 17:57:40.990176 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:41.029226 kubelet[2875]: I0527 17:57:41.029185 2875 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 27 17:57:41.041495 containerd[1585]: time="2025-05-27T17:57:41.041304240Z" level=info msg="received exit event container_id:\"2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b\" id:\"2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b\" pid:3673 exited_at:{seconds:1748368661 nanos:40908085}" May 27 17:57:41.043357 containerd[1585]: time="2025-05-27T17:57:41.043128725Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b\" id:\"2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b\" pid:3673 exited_at:{seconds:1748368661 nanos:40908085}" May 27 17:57:41.115533 systemd[1]: Created slice kubepods-besteffort-pod04782360_5124_401b_b0fa_cb2a35043c5d.slice - libcontainer container kubepods-besteffort-pod04782360_5124_401b_b0fa_cb2a35043c5d.slice. May 27 17:57:41.170392 systemd[1]: Created slice kubepods-besteffort-podcfb3cccd_61b8_4d5c_bde3_b6277323070c.slice - libcontainer container kubepods-besteffort-podcfb3cccd_61b8_4d5c_bde3_b6277323070c.slice. May 27 17:57:41.195748 systemd[1]: Created slice kubepods-besteffort-podfcd5b768_af40_448c_b34c_b9c1a53c8b71.slice - libcontainer container kubepods-besteffort-podfcd5b768_af40_448c_b34c_b9c1a53c8b71.slice. 
May 27 17:57:41.206092 systemd[1]: Created slice kubepods-burstable-pod58d25a41_fd53_47e5_aec2_9bbdbb20fba6.slice - libcontainer container kubepods-burstable-pod58d25a41_fd53_47e5_aec2_9bbdbb20fba6.slice. May 27 17:57:41.222699 kubelet[2875]: I0527 17:57:41.222126 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a7c34d5-6339-473f-9bc2-4a5f5fd961b4-config\") pod \"goldmane-8f77d7b6c-lv5hg\" (UID: \"3a7c34d5-6339-473f-9bc2-4a5f5fd961b4\") " pod="calico-system/goldmane-8f77d7b6c-lv5hg" May 27 17:57:41.225942 kubelet[2875]: I0527 17:57:41.225781 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jbr\" (UniqueName: \"kubernetes.io/projected/58d25a41-fd53-47e5-aec2-9bbdbb20fba6-kube-api-access-m2jbr\") pod \"coredns-7c65d6cfc9-cc6hv\" (UID: \"58d25a41-fd53-47e5-aec2-9bbdbb20fba6\") " pod="kube-system/coredns-7c65d6cfc9-cc6hv" May 27 17:57:41.228047 kubelet[2875]: I0527 17:57:41.228022 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fd4283a9-ffc2-4b86-98a9-81f0eb99355f-config-volume\") pod \"coredns-7c65d6cfc9-5qgtj\" (UID: \"fd4283a9-ffc2-4b86-98a9-81f0eb99355f\") " pod="kube-system/coredns-7c65d6cfc9-5qgtj" May 27 17:57:41.229237 kubelet[2875]: I0527 17:57:41.228747 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtjv\" (UniqueName: \"kubernetes.io/projected/cfb3cccd-61b8-4d5c-bde3-b6277323070c-kube-api-access-krtjv\") pod \"calico-kube-controllers-78b7d7669b-pvkq7\" (UID: \"cfb3cccd-61b8-4d5c-bde3-b6277323070c\") " pod="calico-system/calico-kube-controllers-78b7d7669b-pvkq7" May 27 17:57:41.229742 kubelet[2875]: I0527 17:57:41.229453 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-vxlqx\" (UniqueName: \"kubernetes.io/projected/04782360-5124-401b-b0fa-cb2a35043c5d-kube-api-access-vxlqx\") pod \"calico-apiserver-5dcf54bdf4-6rv2d\" (UID: \"04782360-5124-401b-b0fa-cb2a35043c5d\") " pod="calico-apiserver/calico-apiserver-5dcf54bdf4-6rv2d" May 27 17:57:41.230829 kubelet[2875]: I0527 17:57:41.230248 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96shr\" (UniqueName: \"kubernetes.io/projected/3a7c34d5-6339-473f-9bc2-4a5f5fd961b4-kube-api-access-96shr\") pod \"goldmane-8f77d7b6c-lv5hg\" (UID: \"3a7c34d5-6339-473f-9bc2-4a5f5fd961b4\") " pod="calico-system/goldmane-8f77d7b6c-lv5hg" May 27 17:57:41.230829 kubelet[2875]: I0527 17:57:41.230295 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnzlm\" (UniqueName: \"kubernetes.io/projected/fd4283a9-ffc2-4b86-98a9-81f0eb99355f-kube-api-access-rnzlm\") pod \"coredns-7c65d6cfc9-5qgtj\" (UID: \"fd4283a9-ffc2-4b86-98a9-81f0eb99355f\") " pod="kube-system/coredns-7c65d6cfc9-5qgtj" May 27 17:57:41.230829 kubelet[2875]: I0527 17:57:41.230329 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3a7c34d5-6339-473f-9bc2-4a5f5fd961b4-goldmane-ca-bundle\") pod \"goldmane-8f77d7b6c-lv5hg\" (UID: \"3a7c34d5-6339-473f-9bc2-4a5f5fd961b4\") " pod="calico-system/goldmane-8f77d7b6c-lv5hg" May 27 17:57:41.230829 kubelet[2875]: I0527 17:57:41.230360 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/04782360-5124-401b-b0fa-cb2a35043c5d-calico-apiserver-certs\") pod \"calico-apiserver-5dcf54bdf4-6rv2d\" (UID: \"04782360-5124-401b-b0fa-cb2a35043c5d\") " pod="calico-apiserver/calico-apiserver-5dcf54bdf4-6rv2d" May 27 17:57:41.230829 kubelet[2875]: I0527 
17:57:41.230391 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fcd5b768-af40-448c-b34c-b9c1a53c8b71-calico-apiserver-certs\") pod \"calico-apiserver-5dcf54bdf4-b8qsn\" (UID: \"fcd5b768-af40-448c-b34c-b9c1a53c8b71\") " pod="calico-apiserver/calico-apiserver-5dcf54bdf4-b8qsn" May 27 17:57:41.231077 kubelet[2875]: I0527 17:57:41.230422 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wk69\" (UniqueName: \"kubernetes.io/projected/fcd5b768-af40-448c-b34c-b9c1a53c8b71-kube-api-access-7wk69\") pod \"calico-apiserver-5dcf54bdf4-b8qsn\" (UID: \"fcd5b768-af40-448c-b34c-b9c1a53c8b71\") " pod="calico-apiserver/calico-apiserver-5dcf54bdf4-b8qsn" May 27 17:57:41.231077 kubelet[2875]: I0527 17:57:41.230448 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/58d25a41-fd53-47e5-aec2-9bbdbb20fba6-config-volume\") pod \"coredns-7c65d6cfc9-cc6hv\" (UID: \"58d25a41-fd53-47e5-aec2-9bbdbb20fba6\") " pod="kube-system/coredns-7c65d6cfc9-cc6hv" May 27 17:57:41.231077 kubelet[2875]: I0527 17:57:41.230477 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-ca-bundle\") pod \"whisker-59c5c8998-2pl48\" (UID: \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\") " pod="calico-system/whisker-59c5c8998-2pl48" May 27 17:57:41.231077 kubelet[2875]: I0527 17:57:41.230503 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt9md\" (UniqueName: \"kubernetes.io/projected/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-kube-api-access-mt9md\") pod \"whisker-59c5c8998-2pl48\" (UID: \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\") " 
pod="calico-system/whisker-59c5c8998-2pl48" May 27 17:57:41.231077 kubelet[2875]: I0527 17:57:41.230527 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3a7c34d5-6339-473f-9bc2-4a5f5fd961b4-goldmane-key-pair\") pod \"goldmane-8f77d7b6c-lv5hg\" (UID: \"3a7c34d5-6339-473f-9bc2-4a5f5fd961b4\") " pod="calico-system/goldmane-8f77d7b6c-lv5hg" May 27 17:57:41.231275 kubelet[2875]: I0527 17:57:41.230571 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-backend-key-pair\") pod \"whisker-59c5c8998-2pl48\" (UID: \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\") " pod="calico-system/whisker-59c5c8998-2pl48" May 27 17:57:41.231275 kubelet[2875]: I0527 17:57:41.230621 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cfb3cccd-61b8-4d5c-bde3-b6277323070c-tigera-ca-bundle\") pod \"calico-kube-controllers-78b7d7669b-pvkq7\" (UID: \"cfb3cccd-61b8-4d5c-bde3-b6277323070c\") " pod="calico-system/calico-kube-controllers-78b7d7669b-pvkq7" May 27 17:57:41.243831 systemd[1]: Created slice kubepods-besteffort-pod3a7c34d5_6339_473f_9bc2_4a5f5fd961b4.slice - libcontainer container kubepods-besteffort-pod3a7c34d5_6339_473f_9bc2_4a5f5fd961b4.slice. May 27 17:57:41.259486 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2af8d5b6758f3447306d17c111142f426966a43f7e813e3a77a558c8ff25ac5b-rootfs.mount: Deactivated successfully. May 27 17:57:41.267350 systemd[1]: Created slice kubepods-besteffort-podcf21f10b_a619_4842_a9ab_7c3141c0e1e4.slice - libcontainer container kubepods-besteffort-podcf21f10b_a619_4842_a9ab_7c3141c0e1e4.slice. 
May 27 17:57:41.280098 systemd[1]: Created slice kubepods-burstable-podfd4283a9_ffc2_4b86_98a9_81f0eb99355f.slice - libcontainer container kubepods-burstable-podfd4283a9_ffc2_4b86_98a9_81f0eb99355f.slice. May 27 17:57:41.480218 containerd[1585]: time="2025-05-27T17:57:41.480065475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78b7d7669b-pvkq7,Uid:cfb3cccd-61b8-4d5c-bde3-b6277323070c,Namespace:calico-system,Attempt:0,}" May 27 17:57:41.538065 containerd[1585]: time="2025-05-27T17:57:41.537477403Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-b8qsn,Uid:fcd5b768-af40-448c-b34c-b9c1a53c8b71,Namespace:calico-apiserver,Attempt:0,}" May 27 17:57:41.545260 containerd[1585]: time="2025-05-27T17:57:41.545223479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cc6hv,Uid:58d25a41-fd53-47e5-aec2-9bbdbb20fba6,Namespace:kube-system,Attempt:0,}" May 27 17:57:41.556628 containerd[1585]: time="2025-05-27T17:57:41.556585890Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-lv5hg,Uid:3a7c34d5-6339-473f-9bc2-4a5f5fd961b4,Namespace:calico-system,Attempt:0,}" May 27 17:57:41.575022 containerd[1585]: time="2025-05-27T17:57:41.574937610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59c5c8998-2pl48,Uid:cf21f10b-a619-4842-a9ab-7c3141c0e1e4,Namespace:calico-system,Attempt:0,}" May 27 17:57:41.650607 containerd[1585]: time="2025-05-27T17:57:41.650551322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5qgtj,Uid:fd4283a9-ffc2-4b86-98a9-81f0eb99355f,Namespace:kube-system,Attempt:0,}" May 27 17:57:41.721831 containerd[1585]: time="2025-05-27T17:57:41.721784896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-6rv2d,Uid:04782360-5124-401b-b0fa-cb2a35043c5d,Namespace:calico-apiserver,Attempt:0,}" May 27 17:57:41.842770 containerd[1585]: 
time="2025-05-27T17:57:41.842015480Z" level=error msg="Failed to destroy network for sandbox \"6b39aabe4c8a52dd7e0fb13d351c723463d0a3ff7c08ecea2da2d5d137c9df69\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.845052 containerd[1585]: time="2025-05-27T17:57:41.845018536Z" level=error msg="Failed to destroy network for sandbox \"0463ce55d2cc2abfa981ef891fbd25b0496c09bb3de2005add62275ffb5a4299\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.847687 containerd[1585]: time="2025-05-27T17:57:41.847611169Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-lv5hg,Uid:3a7c34d5-6339-473f-9bc2-4a5f5fd961b4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b39aabe4c8a52dd7e0fb13d351c723463d0a3ff7c08ecea2da2d5d137c9df69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.849282 kubelet[2875]: E0527 17:57:41.849223 2875 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b39aabe4c8a52dd7e0fb13d351c723463d0a3ff7c08ecea2da2d5d137c9df69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.849655 kubelet[2875]: E0527 17:57:41.849432 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6b39aabe4c8a52dd7e0fb13d351c723463d0a3ff7c08ecea2da2d5d137c9df69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-lv5hg" May 27 17:57:41.849655 kubelet[2875]: E0527 17:57:41.849481 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6b39aabe4c8a52dd7e0fb13d351c723463d0a3ff7c08ecea2da2d5d137c9df69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-8f77d7b6c-lv5hg" May 27 17:57:41.850304 kubelet[2875]: E0527 17:57:41.850261 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-8f77d7b6c-lv5hg_calico-system(3a7c34d5-6339-473f-9bc2-4a5f5fd961b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-8f77d7b6c-lv5hg_calico-system(3a7c34d5-6339-473f-9bc2-4a5f5fd961b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6b39aabe4c8a52dd7e0fb13d351c723463d0a3ff7c08ecea2da2d5d137c9df69\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4" May 27 17:57:41.851309 containerd[1585]: time="2025-05-27T17:57:41.851266554Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-b8qsn,Uid:fcd5b768-af40-448c-b34c-b9c1a53c8b71,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"0463ce55d2cc2abfa981ef891fbd25b0496c09bb3de2005add62275ffb5a4299\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.852407 kubelet[2875]: E0527 17:57:41.852375 2875 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0463ce55d2cc2abfa981ef891fbd25b0496c09bb3de2005add62275ffb5a4299\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.852618 kubelet[2875]: E0527 17:57:41.852587 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0463ce55d2cc2abfa981ef891fbd25b0496c09bb3de2005add62275ffb5a4299\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dcf54bdf4-b8qsn" May 27 17:57:41.852802 kubelet[2875]: E0527 17:57:41.852744 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0463ce55d2cc2abfa981ef891fbd25b0496c09bb3de2005add62275ffb5a4299\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dcf54bdf4-b8qsn" May 27 17:57:41.853244 kubelet[2875]: E0527 17:57:41.853157 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5dcf54bdf4-b8qsn_calico-apiserver(fcd5b768-af40-448c-b34c-b9c1a53c8b71)\" with CreatePodSandboxError: \"Failed to create 
sandbox for pod \\\"calico-apiserver-5dcf54bdf4-b8qsn_calico-apiserver(fcd5b768-af40-448c-b34c-b9c1a53c8b71)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0463ce55d2cc2abfa981ef891fbd25b0496c09bb3de2005add62275ffb5a4299\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5dcf54bdf4-b8qsn" podUID="fcd5b768-af40-448c-b34c-b9c1a53c8b71" May 27 17:57:41.868928 containerd[1585]: time="2025-05-27T17:57:41.868868219Z" level=error msg="Failed to destroy network for sandbox \"862d0639bd4e7b2308548b67c99b0a5cb7e915f7690e5adaee056489a415e54e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.873401 containerd[1585]: time="2025-05-27T17:57:41.873134222Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78b7d7669b-pvkq7,Uid:cfb3cccd-61b8-4d5c-bde3-b6277323070c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"862d0639bd4e7b2308548b67c99b0a5cb7e915f7690e5adaee056489a415e54e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.873401 containerd[1585]: time="2025-05-27T17:57:41.873300870Z" level=error msg="Failed to destroy network for sandbox \"492a20a1b5e13a497b6d3723b61711fd1b553869b226bc9dcc3d86f5917fcf8c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.874773 kubelet[2875]: E0527 17:57:41.874143 2875 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"862d0639bd4e7b2308548b67c99b0a5cb7e915f7690e5adaee056489a415e54e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.874773 kubelet[2875]: E0527 17:57:41.874221 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"862d0639bd4e7b2308548b67c99b0a5cb7e915f7690e5adaee056489a415e54e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-78b7d7669b-pvkq7" May 27 17:57:41.874773 kubelet[2875]: E0527 17:57:41.874248 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"862d0639bd4e7b2308548b67c99b0a5cb7e915f7690e5adaee056489a415e54e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-78b7d7669b-pvkq7" May 27 17:57:41.874951 kubelet[2875]: E0527 17:57:41.874301 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-78b7d7669b-pvkq7_calico-system(cfb3cccd-61b8-4d5c-bde3-b6277323070c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-78b7d7669b-pvkq7_calico-system(cfb3cccd-61b8-4d5c-bde3-b6277323070c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"862d0639bd4e7b2308548b67c99b0a5cb7e915f7690e5adaee056489a415e54e\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-78b7d7669b-pvkq7" podUID="cfb3cccd-61b8-4d5c-bde3-b6277323070c" May 27 17:57:41.876696 containerd[1585]: time="2025-05-27T17:57:41.876635096Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cc6hv,Uid:58d25a41-fd53-47e5-aec2-9bbdbb20fba6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"492a20a1b5e13a497b6d3723b61711fd1b553869b226bc9dcc3d86f5917fcf8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.877384 kubelet[2875]: E0527 17:57:41.877203 2875 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492a20a1b5e13a497b6d3723b61711fd1b553869b226bc9dcc3d86f5917fcf8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.877384 kubelet[2875]: E0527 17:57:41.877243 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492a20a1b5e13a497b6d3723b61711fd1b553869b226bc9dcc3d86f5917fcf8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cc6hv" May 27 17:57:41.877384 kubelet[2875]: E0527 17:57:41.877271 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"492a20a1b5e13a497b6d3723b61711fd1b553869b226bc9dcc3d86f5917fcf8c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-cc6hv" May 27 17:57:41.877572 kubelet[2875]: E0527 17:57:41.877305 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-cc6hv_kube-system(58d25a41-fd53-47e5-aec2-9bbdbb20fba6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-cc6hv_kube-system(58d25a41-fd53-47e5-aec2-9bbdbb20fba6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"492a20a1b5e13a497b6d3723b61711fd1b553869b226bc9dcc3d86f5917fcf8c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-cc6hv" podUID="58d25a41-fd53-47e5-aec2-9bbdbb20fba6" May 27 17:57:41.893241 containerd[1585]: time="2025-05-27T17:57:41.893175635Z" level=error msg="Failed to destroy network for sandbox \"60e23872913cbaacf02b297d691b3f6f74b4a1c97239421c18076e29ebda25d0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.896895 containerd[1585]: time="2025-05-27T17:57:41.896186613Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59c5c8998-2pl48,Uid:cf21f10b-a619-4842-a9ab-7c3141c0e1e4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e23872913cbaacf02b297d691b3f6f74b4a1c97239421c18076e29ebda25d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.897977 kubelet[2875]: E0527 17:57:41.897462 2875 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e23872913cbaacf02b297d691b3f6f74b4a1c97239421c18076e29ebda25d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.897977 kubelet[2875]: E0527 17:57:41.897586 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e23872913cbaacf02b297d691b3f6f74b4a1c97239421c18076e29ebda25d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59c5c8998-2pl48" May 27 17:57:41.897977 kubelet[2875]: E0527 17:57:41.897614 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60e23872913cbaacf02b297d691b3f6f74b4a1c97239421c18076e29ebda25d0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59c5c8998-2pl48" May 27 17:57:41.898287 kubelet[2875]: E0527 17:57:41.897688 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-59c5c8998-2pl48_calico-system(cf21f10b-a619-4842-a9ab-7c3141c0e1e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-59c5c8998-2pl48_calico-system(cf21f10b-a619-4842-a9ab-7c3141c0e1e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"60e23872913cbaacf02b297d691b3f6f74b4a1c97239421c18076e29ebda25d0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-59c5c8998-2pl48" podUID="cf21f10b-a619-4842-a9ab-7c3141c0e1e4" May 27 17:57:41.916833 containerd[1585]: time="2025-05-27T17:57:41.916757721Z" level=error msg="Failed to destroy network for sandbox \"9891a79c4a5bc210c625c8ce6059bae914b4a634cf96395eaae040594748d251\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.917909 containerd[1585]: time="2025-05-27T17:57:41.917875042Z" level=error msg="Failed to destroy network for sandbox \"781118e927152bbfc87d8d255d5655d3a024c07fd9ca83f2c2a9faa9f0204486\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.918336 containerd[1585]: time="2025-05-27T17:57:41.918294252Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5qgtj,Uid:fd4283a9-ffc2-4b86-98a9-81f0eb99355f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9891a79c4a5bc210c625c8ce6059bae914b4a634cf96395eaae040594748d251\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.918596 kubelet[2875]: E0527 17:57:41.918550 2875 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9891a79c4a5bc210c625c8ce6059bae914b4a634cf96395eaae040594748d251\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.918734 kubelet[2875]: E0527 17:57:41.918621 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9891a79c4a5bc210c625c8ce6059bae914b4a634cf96395eaae040594748d251\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-5qgtj" May 27 17:57:41.918734 kubelet[2875]: E0527 17:57:41.918647 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9891a79c4a5bc210c625c8ce6059bae914b4a634cf96395eaae040594748d251\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-5qgtj" May 27 17:57:41.919021 kubelet[2875]: E0527 17:57:41.918878 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-5qgtj_kube-system(fd4283a9-ffc2-4b86-98a9-81f0eb99355f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-5qgtj_kube-system(fd4283a9-ffc2-4b86-98a9-81f0eb99355f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9891a79c4a5bc210c625c8ce6059bae914b4a634cf96395eaae040594748d251\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-5qgtj" podUID="fd4283a9-ffc2-4b86-98a9-81f0eb99355f" May 27 17:57:41.919866 containerd[1585]: 
time="2025-05-27T17:57:41.919746971Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-6rv2d,Uid:04782360-5124-401b-b0fa-cb2a35043c5d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"781118e927152bbfc87d8d255d5655d3a024c07fd9ca83f2c2a9faa9f0204486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.920370 kubelet[2875]: E0527 17:57:41.920280 2875 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"781118e927152bbfc87d8d255d5655d3a024c07fd9ca83f2c2a9faa9f0204486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:41.921581 kubelet[2875]: E0527 17:57:41.921494 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"781118e927152bbfc87d8d255d5655d3a024c07fd9ca83f2c2a9faa9f0204486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5dcf54bdf4-6rv2d" May 27 17:57:41.922195 kubelet[2875]: E0527 17:57:41.922136 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"781118e927152bbfc87d8d255d5655d3a024c07fd9ca83f2c2a9faa9f0204486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-5dcf54bdf4-6rv2d" May 27 17:57:41.922370 kubelet[2875]: E0527 17:57:41.922318 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5dcf54bdf4-6rv2d_calico-apiserver(04782360-5124-401b-b0fa-cb2a35043c5d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5dcf54bdf4-6rv2d_calico-apiserver(04782360-5124-401b-b0fa-cb2a35043c5d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"781118e927152bbfc87d8d255d5655d3a024c07fd9ca83f2c2a9faa9f0204486\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5dcf54bdf4-6rv2d" podUID="04782360-5124-401b-b0fa-cb2a35043c5d" May 27 17:57:42.244542 containerd[1585]: time="2025-05-27T17:57:42.244443792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 17:57:43.009505 systemd[1]: Started sshd@9-10.230.41.6:22-80.94.95.117:65105.service - OpenSSH per-connection server daemon (80.94.95.117:65105). May 27 17:57:43.030290 systemd[1]: Created slice kubepods-besteffort-pod9c96b9b6_32e2_4ed2_8acd_c7b5982abac8.slice - libcontainer container kubepods-besteffort-pod9c96b9b6_32e2_4ed2_8acd_c7b5982abac8.slice. May 27 17:57:43.043412 containerd[1585]: time="2025-05-27T17:57:43.042924033Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9t7wk,Uid:9c96b9b6-32e2-4ed2-8acd-c7b5982abac8,Namespace:calico-system,Attempt:0,}" May 27 17:57:43.136691 sshd[3905]: Connection closed by 80.94.95.117 port 65105 May 27 17:57:43.135349 systemd[1]: sshd@9-10.230.41.6:22-80.94.95.117:65105.service: Deactivated successfully. 
May 27 17:57:43.144495 containerd[1585]: time="2025-05-27T17:57:43.144450894Z" level=error msg="Failed to destroy network for sandbox \"6971692430cb5fe43dd7c0d46cdebae0aecc0a6cb9ba88fee169da697bd5d6cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:43.147172 containerd[1585]: time="2025-05-27T17:57:43.147094113Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9t7wk,Uid:9c96b9b6-32e2-4ed2-8acd-c7b5982abac8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6971692430cb5fe43dd7c0d46cdebae0aecc0a6cb9ba88fee169da697bd5d6cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:43.148146 kubelet[2875]: E0527 17:57:43.148106 2875 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6971692430cb5fe43dd7c0d46cdebae0aecc0a6cb9ba88fee169da697bd5d6cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:57:43.149064 systemd[1]: run-netns-cni\x2dddd978e6\x2d1059\x2d25f7\x2d91ae\x2d71b5b7479eb6.mount: Deactivated successfully. 
May 27 17:57:43.150766 kubelet[2875]: E0527 17:57:43.149173 2875 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6971692430cb5fe43dd7c0d46cdebae0aecc0a6cb9ba88fee169da697bd5d6cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9t7wk" May 27 17:57:43.150766 kubelet[2875]: E0527 17:57:43.149209 2875 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6971692430cb5fe43dd7c0d46cdebae0aecc0a6cb9ba88fee169da697bd5d6cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9t7wk" May 27 17:57:43.150766 kubelet[2875]: E0527 17:57:43.149269 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9t7wk_calico-system(9c96b9b6-32e2-4ed2-8acd-c7b5982abac8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9t7wk_calico-system(9c96b9b6-32e2-4ed2-8acd-c7b5982abac8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6971692430cb5fe43dd7c0d46cdebae0aecc0a6cb9ba88fee169da697bd5d6cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9t7wk" podUID="9c96b9b6-32e2-4ed2-8acd-c7b5982abac8" May 27 17:57:51.117570 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2502565598.mount: Deactivated successfully. 
May 27 17:57:51.271368 containerd[1585]: time="2025-05-27T17:57:51.234199155Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 27 17:57:51.337447 containerd[1585]: time="2025-05-27T17:57:51.337306306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:51.358883 containerd[1585]: time="2025-05-27T17:57:51.358761720Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:51.360946 containerd[1585]: time="2025-05-27T17:57:51.360254467Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:51.360946 containerd[1585]: time="2025-05-27T17:57:51.360792402Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 9.115314437s" May 27 17:57:51.360946 containerd[1585]: time="2025-05-27T17:57:51.360856826Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 27 17:57:51.414828 containerd[1585]: time="2025-05-27T17:57:51.414554725Z" level=info msg="CreateContainer within sandbox \"0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 17:57:51.477700 containerd[1585]: time="2025-05-27T17:57:51.474161811Z" level=info msg="Container 
53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:51.478613 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3716949509.mount: Deactivated successfully. May 27 17:57:51.501701 containerd[1585]: time="2025-05-27T17:57:51.501617633Z" level=info msg="CreateContainer within sandbox \"0564141284b0331cd5beca5aafd049f00dae41bfad8550c44d29d9c0651b1e73\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\"" May 27 17:57:51.502966 containerd[1585]: time="2025-05-27T17:57:51.502939894Z" level=info msg="StartContainer for \"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\"" May 27 17:57:51.510543 containerd[1585]: time="2025-05-27T17:57:51.509784519Z" level=info msg="connecting to shim 53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07" address="unix:///run/containerd/s/805f2409e6c84d303a22ce7a7c38ae3fe8299dad14d19b7cb2216fc58a0f94c2" protocol=ttrpc version=3 May 27 17:57:51.655968 systemd[1]: Started cri-containerd-53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07.scope - libcontainer container 53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07. May 27 17:57:51.736692 containerd[1585]: time="2025-05-27T17:57:51.735856841Z" level=info msg="StartContainer for \"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\" returns successfully" May 27 17:57:52.053687 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 17:57:52.056311 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 27 17:57:52.413504 kubelet[2875]: I0527 17:57:52.413418 2875 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "cf21f10b-a619-4842-a9ab-7c3141c0e1e4" (UID: "cf21f10b-a619-4842-a9ab-7c3141c0e1e4"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 27 17:57:52.415624 kubelet[2875]: I0527 17:57:52.414389 2875 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-ca-bundle\") pod \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\" (UID: \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\") " May 27 17:57:52.415624 kubelet[2875]: I0527 17:57:52.414448 2875 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-backend-key-pair\") pod \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\" (UID: \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\") " May 27 17:57:52.415624 kubelet[2875]: I0527 17:57:52.414490 2875 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt9md\" (UniqueName: \"kubernetes.io/projected/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-kube-api-access-mt9md\") pod \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\" (UID: \"cf21f10b-a619-4842-a9ab-7c3141c0e1e4\") " May 27 17:57:52.415624 kubelet[2875]: I0527 17:57:52.414614 2875 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-ca-bundle\") on node \"srv-kh28t.gb1.brightbox.com\" DevicePath \"\"" May 27 17:57:52.438059 kubelet[2875]: I0527 17:57:52.437959 2875 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "cf21f10b-a619-4842-a9ab-7c3141c0e1e4" (UID: "cf21f10b-a619-4842-a9ab-7c3141c0e1e4"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" May 27 17:57:52.438944 systemd[1]: var-lib-kubelet-pods-cf21f10b\x2da619\x2d4842\x2da9ab\x2d7c3141c0e1e4-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 17:57:52.445695 kubelet[2875]: I0527 17:57:52.444885 2875 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-kube-api-access-mt9md" (OuterVolumeSpecName: "kube-api-access-mt9md") pod "cf21f10b-a619-4842-a9ab-7c3141c0e1e4" (UID: "cf21f10b-a619-4842-a9ab-7c3141c0e1e4"). InnerVolumeSpecName "kube-api-access-mt9md". PluginName "kubernetes.io/projected", VolumeGidValue "" May 27 17:57:52.445156 systemd[1]: var-lib-kubelet-pods-cf21f10b\x2da619\x2d4842\x2da9ab\x2d7c3141c0e1e4-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dmt9md.mount: Deactivated successfully. 
May 27 17:57:52.515940 kubelet[2875]: I0527 17:57:52.515885 2875 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt9md\" (UniqueName: \"kubernetes.io/projected/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-kube-api-access-mt9md\") on node \"srv-kh28t.gb1.brightbox.com\" DevicePath \"\"" May 27 17:57:52.516176 kubelet[2875]: I0527 17:57:52.516137 2875 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/cf21f10b-a619-4842-a9ab-7c3141c0e1e4-whisker-backend-key-pair\") on node \"srv-kh28t.gb1.brightbox.com\" DevicePath \"\"" May 27 17:57:52.702166 systemd[1]: Removed slice kubepods-besteffort-podcf21f10b_a619_4842_a9ab_7c3141c0e1e4.slice - libcontainer container kubepods-besteffort-podcf21f10b_a619_4842_a9ab_7c3141c0e1e4.slice. May 27 17:57:52.720885 kubelet[2875]: I0527 17:57:52.720752 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-8l4vr" podStartSLOduration=2.549966978 podStartE2EDuration="23.720251708s" podCreationTimestamp="2025-05-27 17:57:29 +0000 UTC" firstStartedPulling="2025-05-27 17:57:30.223040439 +0000 UTC m=+22.542685326" lastFinishedPulling="2025-05-27 17:57:51.393325176 +0000 UTC m=+43.712970056" observedRunningTime="2025-05-27 17:57:52.418476224 +0000 UTC m=+44.738121129" watchObservedRunningTime="2025-05-27 17:57:52.720251708 +0000 UTC m=+45.039896606" May 27 17:57:52.790424 systemd[1]: Created slice kubepods-besteffort-pod75023f81_9f7f_4b1f_aa44_b3e234778686.slice - libcontainer container kubepods-besteffort-pod75023f81_9f7f_4b1f_aa44_b3e234778686.slice. 
May 27 17:57:52.818542 kubelet[2875]: I0527 17:57:52.818489 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75023f81-9f7f-4b1f-aa44-b3e234778686-whisker-ca-bundle\") pod \"whisker-5775587fcb-8p4tb\" (UID: \"75023f81-9f7f-4b1f-aa44-b3e234778686\") " pod="calico-system/whisker-5775587fcb-8p4tb" May 27 17:57:52.818542 kubelet[2875]: I0527 17:57:52.818550 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/75023f81-9f7f-4b1f-aa44-b3e234778686-whisker-backend-key-pair\") pod \"whisker-5775587fcb-8p4tb\" (UID: \"75023f81-9f7f-4b1f-aa44-b3e234778686\") " pod="calico-system/whisker-5775587fcb-8p4tb" May 27 17:57:52.818904 kubelet[2875]: I0527 17:57:52.818585 2875 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xgcg\" (UniqueName: \"kubernetes.io/projected/75023f81-9f7f-4b1f-aa44-b3e234778686-kube-api-access-9xgcg\") pod \"whisker-5775587fcb-8p4tb\" (UID: \"75023f81-9f7f-4b1f-aa44-b3e234778686\") " pod="calico-system/whisker-5775587fcb-8p4tb" May 27 17:57:52.991426 containerd[1585]: time="2025-05-27T17:57:52.991237888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5qgtj,Uid:fd4283a9-ffc2-4b86-98a9-81f0eb99355f,Namespace:kube-system,Attempt:0,}" May 27 17:57:53.099482 containerd[1585]: time="2025-05-27T17:57:53.099408090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5775587fcb-8p4tb,Uid:75023f81-9f7f-4b1f-aa44-b3e234778686,Namespace:calico-system,Attempt:0,}" May 27 17:57:53.390152 systemd-networkd[1519]: cali52e5295baaa: Link UP May 27 17:57:53.391457 systemd-networkd[1519]: cali52e5295baaa: Gained carrier May 27 17:57:53.430185 containerd[1585]: 2025-05-27 17:57:53.144 [INFO][4025] cni-plugin/utils.go 100: File /var/lib/calico/mtu 
does not exist May 27 17:57:53.430185 containerd[1585]: 2025-05-27 17:57:53.159 [INFO][4025] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0 whisker-5775587fcb- calico-system 75023f81-9f7f-4b1f-aa44-b3e234778686 876 0 2025-05-27 17:57:52 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5775587fcb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com whisker-5775587fcb-8p4tb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali52e5295baaa [] [] }} ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-" May 27 17:57:53.430185 containerd[1585]: 2025-05-27 17:57:53.159 [INFO][4025] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" May 27 17:57:53.430185 containerd[1585]: 2025-05-27 17:57:53.279 [INFO][4039] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" HandleID="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Workload="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.280 [INFO][4039] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" HandleID="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" 
Workload="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004e670), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-kh28t.gb1.brightbox.com", "pod":"whisker-5775587fcb-8p4tb", "timestamp":"2025-05-27 17:57:53.279148778 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.280 [INFO][4039] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.280 [INFO][4039] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.280 [INFO][4039] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.300 [INFO][4039] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.310 [INFO][4039] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.320 [INFO][4039] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.323 [INFO][4039] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.430592 containerd[1585]: 2025-05-27 17:57:53.328 [INFO][4039] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" 
May 27 17:57:53.431511 containerd[1585]: 2025-05-27 17:57:53.328 [INFO][4039] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.9.128/26 handle="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.431511 containerd[1585]: 2025-05-27 17:57:53.331 [INFO][4039] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed May 27 17:57:53.431511 containerd[1585]: 2025-05-27 17:57:53.342 [INFO][4039] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.9.128/26 handle="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.431511 containerd[1585]: 2025-05-27 17:57:53.354 [INFO][4039] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.129/26] block=192.168.9.128/26 handle="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.431511 containerd[1585]: 2025-05-27 17:57:53.355 [INFO][4039] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.129/26] handle="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.431511 containerd[1585]: 2025-05-27 17:57:53.355 [INFO][4039] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:57:53.431511 containerd[1585]: 2025-05-27 17:57:53.355 [INFO][4039] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.129/26] IPv6=[] ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" HandleID="k8s-pod-network.71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Workload="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" May 27 17:57:53.431806 containerd[1585]: 2025-05-27 17:57:53.361 [INFO][4025] cni-plugin/k8s.go 418: Populated endpoint ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0", GenerateName:"whisker-5775587fcb-", Namespace:"calico-system", SelfLink:"", UID:"75023f81-9f7f-4b1f-aa44-b3e234778686", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5775587fcb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"whisker-5775587fcb-8p4tb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.9.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali52e5295baaa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:53.431806 containerd[1585]: 2025-05-27 17:57:53.361 [INFO][4025] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.129/32] ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" May 27 17:57:53.431951 containerd[1585]: 2025-05-27 17:57:53.361 [INFO][4025] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali52e5295baaa ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" May 27 17:57:53.431951 containerd[1585]: 2025-05-27 17:57:53.390 [INFO][4025] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" May 27 17:57:53.432038 containerd[1585]: 2025-05-27 17:57:53.392 [INFO][4025] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0", GenerateName:"whisker-5775587fcb-", Namespace:"calico-system", SelfLink:"", 
UID:"75023f81-9f7f-4b1f-aa44-b3e234778686", ResourceVersion:"876", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5775587fcb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed", Pod:"whisker-5775587fcb-8p4tb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.9.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali52e5295baaa", MAC:"ce:8c:82:7b:25:d6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:53.432146 containerd[1585]: 2025-05-27 17:57:53.415 [INFO][4025] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" Namespace="calico-system" Pod="whisker-5775587fcb-8p4tb" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-whisker--5775587fcb--8p4tb-eth0" May 27 17:57:53.526960 systemd-networkd[1519]: cali2624f810eba: Link UP May 27 17:57:53.528961 systemd-networkd[1519]: cali2624f810eba: Gained carrier May 27 17:57:53.556946 containerd[1585]: 2025-05-27 17:57:53.042 [INFO][4014] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:57:53.556946 containerd[1585]: 2025-05-27 17:57:53.076 [INFO][4014] cni-plugin/plugin.go 340: Calico CNI found existing 
endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0 coredns-7c65d6cfc9- kube-system fd4283a9-ffc2-4b86-98a9-81f0eb99355f 807 0 2025-05-27 17:57:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com coredns-7c65d6cfc9-5qgtj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2624f810eba [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-" May 27 17:57:53.556946 containerd[1585]: 2025-05-27 17:57:53.077 [INFO][4014] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" May 27 17:57:53.556946 containerd[1585]: 2025-05-27 17:57:53.279 [INFO][4023] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" HandleID="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Workload="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.280 [INFO][4023] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" HandleID="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Workload="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0xc0003964e0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-kh28t.gb1.brightbox.com", "pod":"coredns-7c65d6cfc9-5qgtj", "timestamp":"2025-05-27 17:57:53.279180698 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.280 [INFO][4023] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.355 [INFO][4023] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.355 [INFO][4023] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.404 [INFO][4023] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.437 [INFO][4023] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.448 [INFO][4023] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.452 [INFO][4023] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.557376 containerd[1585]: 2025-05-27 17:57:53.457 [INFO][4023] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.560773 containerd[1585]: 2025-05-27 17:57:53.457 [INFO][4023] ipam/ipam.go 1220: Attempting to assign 1 
addresses from block block=192.168.9.128/26 handle="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.560773 containerd[1585]: 2025-05-27 17:57:53.462 [INFO][4023] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1 May 27 17:57:53.560773 containerd[1585]: 2025-05-27 17:57:53.481 [INFO][4023] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.9.128/26 handle="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.560773 containerd[1585]: 2025-05-27 17:57:53.507 [INFO][4023] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.130/26] block=192.168.9.128/26 handle="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.560773 containerd[1585]: 2025-05-27 17:57:53.507 [INFO][4023] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.130/26] handle="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:53.560773 containerd[1585]: 2025-05-27 17:57:53.508 [INFO][4023] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:57:53.560773 containerd[1585]: 2025-05-27 17:57:53.508 [INFO][4023] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.130/26] IPv6=[] ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" HandleID="k8s-pod-network.fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Workload="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" May 27 17:57:53.561086 containerd[1585]: 2025-05-27 17:57:53.517 [INFO][4014] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"fd4283a9-ffc2-4b86-98a9-81f0eb99355f", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-5qgtj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.9.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali2624f810eba", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:53.561086 containerd[1585]: 2025-05-27 17:57:53.518 [INFO][4014] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.130/32] ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" May 27 17:57:53.561086 containerd[1585]: 2025-05-27 17:57:53.518 [INFO][4014] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2624f810eba ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" May 27 17:57:53.561086 containerd[1585]: 2025-05-27 17:57:53.528 [INFO][4014] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" May 27 17:57:53.561086 containerd[1585]: 2025-05-27 17:57:53.530 [INFO][4014] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" 
WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"fd4283a9-ffc2-4b86-98a9-81f0eb99355f", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1", Pod:"coredns-7c65d6cfc9-5qgtj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.9.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2624f810eba", MAC:"0a:99:f1:82:62:e5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:53.561086 
containerd[1585]: 2025-05-27 17:57:53.551 [INFO][4014] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" Namespace="kube-system" Pod="coredns-7c65d6cfc9-5qgtj" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--5qgtj-eth0" May 27 17:57:53.755706 containerd[1585]: time="2025-05-27T17:57:53.752999699Z" level=info msg="connecting to shim 71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed" address="unix:///run/containerd/s/0626cba019f7f71d7753aef10e3eb2b29cc80cf94547b66d696a84a6d6d4dc00" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:53.768848 containerd[1585]: time="2025-05-27T17:57:53.767623040Z" level=info msg="connecting to shim fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1" address="unix:///run/containerd/s/67332b903e614cc048c520374610beaee5da12702a94b5cce47073e8ad82ab58" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:53.867846 systemd[1]: Started cri-containerd-71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed.scope - libcontainer container 71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed. May 27 17:57:53.892358 systemd[1]: Started cri-containerd-fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1.scope - libcontainer container fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1. 
May 27 17:57:53.999287 containerd[1585]: time="2025-05-27T17:57:53.998058137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9t7wk,Uid:9c96b9b6-32e2-4ed2-8acd-c7b5982abac8,Namespace:calico-system,Attempt:0,}" May 27 17:57:54.004536 kubelet[2875]: I0527 17:57:54.004479 2875 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf21f10b-a619-4842-a9ab-7c3141c0e1e4" path="/var/lib/kubelet/pods/cf21f10b-a619-4842-a9ab-7c3141c0e1e4/volumes" May 27 17:57:54.017945 containerd[1585]: time="2025-05-27T17:57:54.017259437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-5qgtj,Uid:fd4283a9-ffc2-4b86-98a9-81f0eb99355f,Namespace:kube-system,Attempt:0,} returns sandbox id \"fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1\"" May 27 17:57:54.050286 containerd[1585]: time="2025-05-27T17:57:54.050210597Z" level=info msg="CreateContainer within sandbox \"fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:57:54.096153 containerd[1585]: time="2025-05-27T17:57:54.096097846Z" level=info msg="Container c76b5d7a24201f2af2d18dfa2d737915b571bca20bfa1bed39d902f1c55bb908: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:54.110787 containerd[1585]: time="2025-05-27T17:57:54.110465351Z" level=info msg="CreateContainer within sandbox \"fbbd19dc62f3b2ec3be1e5d8dc5b748e022314ab53a9060fcbb9bfab39fb8fd1\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c76b5d7a24201f2af2d18dfa2d737915b571bca20bfa1bed39d902f1c55bb908\"" May 27 17:57:54.115695 containerd[1585]: time="2025-05-27T17:57:54.113285082Z" level=info msg="StartContainer for \"c76b5d7a24201f2af2d18dfa2d737915b571bca20bfa1bed39d902f1c55bb908\"" May 27 17:57:54.130490 containerd[1585]: time="2025-05-27T17:57:54.130422930Z" level=info msg="connecting to shim c76b5d7a24201f2af2d18dfa2d737915b571bca20bfa1bed39d902f1c55bb908" 
address="unix:///run/containerd/s/67332b903e614cc048c520374610beaee5da12702a94b5cce47073e8ad82ab58" protocol=ttrpc version=3 May 27 17:57:54.170335 containerd[1585]: time="2025-05-27T17:57:54.169905294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5775587fcb-8p4tb,Uid:75023f81-9f7f-4b1f-aa44-b3e234778686,Namespace:calico-system,Attempt:0,} returns sandbox id \"71efdc4219a5d7f0bf15e430b67c4656182e0ea60314038c432b4acde196d5ed\"" May 27 17:57:54.181911 containerd[1585]: time="2025-05-27T17:57:54.181855449Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:57:54.228879 systemd[1]: Started cri-containerd-c76b5d7a24201f2af2d18dfa2d737915b571bca20bfa1bed39d902f1c55bb908.scope - libcontainer container c76b5d7a24201f2af2d18dfa2d737915b571bca20bfa1bed39d902f1c55bb908. May 27 17:57:54.336730 containerd[1585]: time="2025-05-27T17:57:54.336132048Z" level=info msg="StartContainer for \"c76b5d7a24201f2af2d18dfa2d737915b571bca20bfa1bed39d902f1c55bb908\" returns successfully" May 27 17:57:54.380751 systemd-networkd[1519]: calid64ca27ac2e: Link UP May 27 17:57:54.385909 systemd-networkd[1519]: calid64ca27ac2e: Gained carrier May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.111 [INFO][4228] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.158 [INFO][4228] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0 csi-node-driver- calico-system 9c96b9b6-32e2-4ed2-8acd-c7b5982abac8 684 0 2025-05-27 17:57:29 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:68bf44dd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com 
csi-node-driver-9t7wk eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid64ca27ac2e [] [] }} ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.159 [INFO][4228] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.272 [INFO][4256] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" HandleID="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Workload="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.273 [INFO][4256] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" HandleID="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Workload="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fb40), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-kh28t.gb1.brightbox.com", "pod":"csi-node-driver-9t7wk", "timestamp":"2025-05-27 17:57:54.272638851 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:54.422411 containerd[1585]: 2025-05-27 
17:57:54.273 [INFO][4256] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.273 [INFO][4256] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.273 [INFO][4256] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.305 [INFO][4256] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.320 [INFO][4256] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.331 [INFO][4256] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.336 [INFO][4256] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.341 [INFO][4256] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.341 [INFO][4256] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.9.128/26 handle="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.344 [INFO][4256] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.354 [INFO][4256] ipam/ipam.go 1243: Writing block in order to claim IPs 
block=192.168.9.128/26 handle="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.364 [INFO][4256] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.131/26] block=192.168.9.128/26 handle="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.364 [INFO][4256] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.131/26] handle="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.365 [INFO][4256] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:57:54.422411 containerd[1585]: 2025-05-27 17:57:54.365 [INFO][4256] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.131/26] IPv6=[] ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" HandleID="k8s-pod-network.d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Workload="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" May 27 17:57:54.424315 containerd[1585]: 2025-05-27 17:57:54.369 [INFO][4228] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9c96b9b6-32e2-4ed2-8acd-c7b5982abac8", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.May, 
27, 17, 57, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"csi-node-driver-9t7wk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.9.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid64ca27ac2e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:54.424315 containerd[1585]: 2025-05-27 17:57:54.370 [INFO][4228] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.131/32] ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" May 27 17:57:54.424315 containerd[1585]: 2025-05-27 17:57:54.370 [INFO][4228] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid64ca27ac2e ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" May 27 17:57:54.424315 containerd[1585]: 2025-05-27 17:57:54.385 [INFO][4228] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" May 27 17:57:54.424315 containerd[1585]: 2025-05-27 17:57:54.387 [INFO][4228] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9c96b9b6-32e2-4ed2-8acd-c7b5982abac8", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"68bf44dd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab", Pod:"csi-node-driver-9t7wk", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.9.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"calid64ca27ac2e", MAC:"46:6f:cb:a1:05:8d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:54.424315 containerd[1585]: 2025-05-27 17:57:54.417 [INFO][4228] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" Namespace="calico-system" Pod="csi-node-driver-9t7wk" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-csi--node--driver--9t7wk-eth0" May 27 17:57:54.477812 systemd-networkd[1519]: cali52e5295baaa: Gained IPv6LL May 27 17:57:54.502244 containerd[1585]: time="2025-05-27T17:57:54.502176705Z" level=info msg="connecting to shim d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab" address="unix:///run/containerd/s/0523d5d2ef243e39797ea5af89f8cf0667ed8b6e2c0ea0255005df460cfe76de" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:54.504517 containerd[1585]: time="2025-05-27T17:57:54.504472901Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:57:54.506766 containerd[1585]: time="2025-05-27T17:57:54.506719562Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:57:54.522844 containerd[1585]: time="2025-05-27T17:57:54.506968863Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:57:54.534598 kubelet[2875]: E0527 17:57:54.534119 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:57:54.535244 kubelet[2875]: E0527 17:57:54.535213 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:57:54.555791 kubelet[2875]: E0527 17:57:54.555696 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:2fa70a3d9d1f41939d7bb3cd196d2e7b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:57:54.563504 containerd[1585]: 
time="2025-05-27T17:57:54.563294083Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:57:54.565384 systemd[1]: Started cri-containerd-d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab.scope - libcontainer container d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab. May 27 17:57:54.708645 containerd[1585]: time="2025-05-27T17:57:54.708411251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9t7wk,Uid:9c96b9b6-32e2-4ed2-8acd-c7b5982abac8,Namespace:calico-system,Attempt:0,} returns sandbox id \"d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab\"" May 27 17:57:54.797992 containerd[1585]: time="2025-05-27T17:57:54.797820499Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:57:54.799703 containerd[1585]: time="2025-05-27T17:57:54.798778375Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:57:54.799703 containerd[1585]: time="2025-05-27T17:57:54.798821546Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:57:54.800205 kubelet[2875]: E0527 17:57:54.800146 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:57:54.800327 kubelet[2875]: E0527 17:57:54.800222 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:57:54.800707 containerd[1585]: time="2025-05-27T17:57:54.800563244Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 17:57:54.800780 kubelet[2875]: E0527 17:57:54.800543 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:57:54.815352 kubelet[2875]: E0527 17:57:54.815271 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686" May 27 17:57:54.991614 containerd[1585]: time="2025-05-27T17:57:54.990698698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78b7d7669b-pvkq7,Uid:cfb3cccd-61b8-4d5c-bde3-b6277323070c,Namespace:calico-system,Attempt:0,}" May 27 17:57:55.108837 systemd-networkd[1519]: cali2624f810eba: Gained IPv6LL May 27 17:57:55.166596 systemd-networkd[1519]: cali0dcc67e3ae8: Link UP May 27 17:57:55.167320 systemd-networkd[1519]: cali0dcc67e3ae8: Gained carrier May 27 17:57:55.189324 kubelet[2875]: I0527 17:57:55.187131 2875 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kube-system/coredns-7c65d6cfc9-5qgtj" podStartSLOduration=42.186157353 podStartE2EDuration="42.186157353s" podCreationTimestamp="2025-05-27 17:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:57:54.44477419 +0000 UTC m=+46.764419112" watchObservedRunningTime="2025-05-27 17:57:55.186157353 +0000 UTC m=+47.505802245" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.053 [INFO][4378] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0 calico-kube-controllers-78b7d7669b- calico-system cfb3cccd-61b8-4d5c-bde3-b6277323070c 802 0 2025-05-27 17:57:29 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:78b7d7669b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com calico-kube-controllers-78b7d7669b-pvkq7 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0dcc67e3ae8 [] [] }} ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.053 [INFO][4378] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 
17:57:55.105 [INFO][4390] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" HandleID="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.105 [INFO][4390] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" HandleID="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d3900), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-kh28t.gb1.brightbox.com", "pod":"calico-kube-controllers-78b7d7669b-pvkq7", "timestamp":"2025-05-27 17:57:55.105420923 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.105 [INFO][4390] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.106 [INFO][4390] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.106 [INFO][4390] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.117 [INFO][4390] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.124 [INFO][4390] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.135 [INFO][4390] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.138 [INFO][4390] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.140 [INFO][4390] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.140 [INFO][4390] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.9.128/26 handle="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.144 [INFO][4390] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5 May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.150 [INFO][4390] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.9.128/26 handle="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.159 [INFO][4390] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.132/26] block=192.168.9.128/26 handle="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.159 [INFO][4390] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.132/26] handle="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.159 [INFO][4390] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:57:55.199007 containerd[1585]: 2025-05-27 17:57:55.160 [INFO][4390] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.132/26] IPv6=[] ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" HandleID="k8s-pod-network.3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" May 27 17:57:55.204455 containerd[1585]: 2025-05-27 17:57:55.162 [INFO][4378] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0", GenerateName:"calico-kube-controllers-78b7d7669b-", Namespace:"calico-system", SelfLink:"", UID:"cfb3cccd-61b8-4d5c-bde3-b6277323070c", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"78b7d7669b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"calico-kube-controllers-78b7d7669b-pvkq7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.9.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0dcc67e3ae8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:55.204455 containerd[1585]: 2025-05-27 17:57:55.163 [INFO][4378] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.132/32] ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" May 27 17:57:55.204455 containerd[1585]: 2025-05-27 17:57:55.163 [INFO][4378] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0dcc67e3ae8 ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" May 27 17:57:55.204455 containerd[1585]: 2025-05-27 17:57:55.167 [INFO][4378] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" May 27 17:57:55.204455 containerd[1585]: 2025-05-27 17:57:55.168 [INFO][4378] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0", GenerateName:"calico-kube-controllers-78b7d7669b-", Namespace:"calico-system", SelfLink:"", UID:"cfb3cccd-61b8-4d5c-bde3-b6277323070c", ResourceVersion:"802", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"78b7d7669b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5", Pod:"calico-kube-controllers-78b7d7669b-pvkq7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.9.132/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0dcc67e3ae8", MAC:"fe:c1:00:1f:08:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:55.204455 containerd[1585]: 2025-05-27 17:57:55.192 [INFO][4378] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" Namespace="calico-system" Pod="calico-kube-controllers-78b7d7669b-pvkq7" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--kube--controllers--78b7d7669b--pvkq7-eth0" May 27 17:57:55.252756 containerd[1585]: time="2025-05-27T17:57:55.252517575Z" level=info msg="connecting to shim 3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5" address="unix:///run/containerd/s/1f4eaf1c5b8b91675d9af97ff3a58d35e079e407f8288517d7565f19bf34f0a5" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:55.305114 systemd[1]: Started cri-containerd-3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5.scope - libcontainer container 3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5. 
May 27 17:57:55.373854 systemd-networkd[1519]: vxlan.calico: Link UP May 27 17:57:55.373866 systemd-networkd[1519]: vxlan.calico: Gained carrier May 27 17:57:55.413986 containerd[1585]: time="2025-05-27T17:57:55.413886677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78b7d7669b-pvkq7,Uid:cfb3cccd-61b8-4d5c-bde3-b6277323070c,Namespace:calico-system,Attempt:0,} returns sandbox id \"3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5\"" May 27 17:57:55.421795 kubelet[2875]: E0527 17:57:55.421744 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686" May 27 17:57:56.155200 kubelet[2875]: I0527 17:57:56.154135 2875 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:57:56.261242 systemd-networkd[1519]: calid64ca27ac2e: Gained IPv6LL May 27 17:57:56.587569 systemd-networkd[1519]: vxlan.calico: Gained IPv6LL May 27 17:57:56.654780 containerd[1585]: time="2025-05-27T17:57:56.654695259Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\" id:\"0e607e98b86af0795f0140dc8affbbcb2a714cd3897f04189e35fdc707403205\" pid:4544 exit_status:1 exited_at:{seconds:1748368676 nanos:653359955}" May 27 17:57:56.767074 containerd[1585]: time="2025-05-27T17:57:56.765756133Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:56.767074 containerd[1585]: time="2025-05-27T17:57:56.766591935Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390" May 27 17:57:56.767459 containerd[1585]: time="2025-05-27T17:57:56.767427770Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:56.770367 containerd[1585]: time="2025-05-27T17:57:56.769457090Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:57:56.770575 containerd[1585]: time="2025-05-27T17:57:56.770164616Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 1.969569316s" May 27 17:57:56.770652 containerd[1585]: time="2025-05-27T17:57:56.770582946Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\"" May 27 17:57:56.777103 containerd[1585]: time="2025-05-27T17:57:56.776874491Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 17:57:56.782313 containerd[1585]: time="2025-05-27T17:57:56.782269010Z" level=info msg="CreateContainer within sandbox \"d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 17:57:56.803771 containerd[1585]: time="2025-05-27T17:57:56.802903373Z" level=info msg="Container ce3ad4551c2c2a46fc5924970c2c3ce60f42a94ea366d61b770a596cacdc8ab7: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:56.827646 containerd[1585]: 
time="2025-05-27T17:57:56.827265392Z" level=info msg="CreateContainer within sandbox \"d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"ce3ad4551c2c2a46fc5924970c2c3ce60f42a94ea366d61b770a596cacdc8ab7\"" May 27 17:57:56.829245 containerd[1585]: time="2025-05-27T17:57:56.828771562Z" level=info msg="StartContainer for \"ce3ad4551c2c2a46fc5924970c2c3ce60f42a94ea366d61b770a596cacdc8ab7\"" May 27 17:57:56.832464 containerd[1585]: time="2025-05-27T17:57:56.831828838Z" level=info msg="connecting to shim ce3ad4551c2c2a46fc5924970c2c3ce60f42a94ea366d61b770a596cacdc8ab7" address="unix:///run/containerd/s/0523d5d2ef243e39797ea5af89f8cf0667ed8b6e2c0ea0255005df460cfe76de" protocol=ttrpc version=3 May 27 17:57:56.838126 systemd-networkd[1519]: cali0dcc67e3ae8: Gained IPv6LL May 27 17:57:56.884978 systemd[1]: Started cri-containerd-ce3ad4551c2c2a46fc5924970c2c3ce60f42a94ea366d61b770a596cacdc8ab7.scope - libcontainer container ce3ad4551c2c2a46fc5924970c2c3ce60f42a94ea366d61b770a596cacdc8ab7. 
May 27 17:57:56.924210 containerd[1585]: time="2025-05-27T17:57:56.924159688Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\" id:\"36bac2d1ff513aa37346f3a497042a839706c9deac6093691c633f55eeeaa930\" pid:4574 exited_at:{seconds:1748368676 nanos:923412715}" May 27 17:57:56.972948 containerd[1585]: time="2025-05-27T17:57:56.972891821Z" level=info msg="StartContainer for \"ce3ad4551c2c2a46fc5924970c2c3ce60f42a94ea366d61b770a596cacdc8ab7\" returns successfully" May 27 17:57:56.991810 containerd[1585]: time="2025-05-27T17:57:56.991347892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cc6hv,Uid:58d25a41-fd53-47e5-aec2-9bbdbb20fba6,Namespace:kube-system,Attempt:0,}" May 27 17:57:56.992844 containerd[1585]: time="2025-05-27T17:57:56.992398379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-6rv2d,Uid:04782360-5124-401b-b0fa-cb2a35043c5d,Namespace:calico-apiserver,Attempt:0,}" May 27 17:57:56.992844 containerd[1585]: time="2025-05-27T17:57:56.992593820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-b8qsn,Uid:fcd5b768-af40-448c-b34c-b9c1a53c8b71,Namespace:calico-apiserver,Attempt:0,}" May 27 17:57:56.993157 containerd[1585]: time="2025-05-27T17:57:56.993085876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-lv5hg,Uid:3a7c34d5-6339-473f-9bc2-4a5f5fd961b4,Namespace:calico-system,Attempt:0,}" May 27 17:57:57.354030 systemd-networkd[1519]: cali7440622ebd2: Link UP May 27 17:57:57.360496 systemd-networkd[1519]: cali7440622ebd2: Gained carrier May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.150 [INFO][4634] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0 goldmane-8f77d7b6c- calico-system 
3a7c34d5-6339-473f-9bc2-4a5f5fd961b4 804 0 2025-05-27 17:57:29 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:8f77d7b6c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com goldmane-8f77d7b6c-lv5hg eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7440622ebd2 [] [] }} ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.151 [INFO][4634] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.214 [INFO][4676] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" HandleID="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Workload="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.215 [INFO][4676] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" HandleID="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Workload="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000366d30), Attrs:map[string]string{"namespace":"calico-system", "node":"srv-kh28t.gb1.brightbox.com", "pod":"goldmane-8f77d7b6c-lv5hg", 
"timestamp":"2025-05-27 17:57:57.214734271 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.215 [INFO][4676] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.215 [INFO][4676] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.215 [INFO][4676] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.237 [INFO][4676] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.250 [INFO][4676] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.263 [INFO][4676] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.269 [INFO][4676] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.274 [INFO][4676] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.274 [INFO][4676] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.9.128/26 handle="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" 
host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.278 [INFO][4676] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.293 [INFO][4676] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.9.128/26 handle="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.311 [INFO][4676] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.133/26] block=192.168.9.128/26 handle="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.311 [INFO][4676] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.133/26] handle="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.311 [INFO][4676] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:57:57.442559 containerd[1585]: 2025-05-27 17:57:57.311 [INFO][4676] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.133/26] IPv6=[] ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" HandleID="k8s-pod-network.542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Workload="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" May 27 17:57:57.446340 containerd[1585]: 2025-05-27 17:57:57.317 [INFO][4634] cni-plugin/k8s.go 418: Populated endpoint ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", UID:"3a7c34d5-6339-473f-9bc2-4a5f5fd961b4", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"goldmane-8f77d7b6c-lv5hg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.9.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.goldmane"}, InterfaceName:"cali7440622ebd2", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.446340 containerd[1585]: 2025-05-27 17:57:57.318 [INFO][4634] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.133/32] ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" May 27 17:57:57.446340 containerd[1585]: 2025-05-27 17:57:57.319 [INFO][4634] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7440622ebd2 ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" May 27 17:57:57.446340 containerd[1585]: 2025-05-27 17:57:57.365 [INFO][4634] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" May 27 17:57:57.446340 containerd[1585]: 2025-05-27 17:57:57.373 [INFO][4634] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0", GenerateName:"goldmane-8f77d7b6c-", Namespace:"calico-system", SelfLink:"", 
UID:"3a7c34d5-6339-473f-9bc2-4a5f5fd961b4", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"8f77d7b6c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f", Pod:"goldmane-8f77d7b6c-lv5hg", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.9.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7440622ebd2", MAC:"9e:f2:ed:66:34:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.446340 containerd[1585]: 2025-05-27 17:57:57.416 [INFO][4634] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" Namespace="calico-system" Pod="goldmane-8f77d7b6c-lv5hg" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-goldmane--8f77d7b6c--lv5hg-eth0" May 27 17:57:57.504627 systemd-networkd[1519]: calib0c40ff5d76: Link UP May 27 17:57:57.511315 systemd-networkd[1519]: calib0c40ff5d76: Gained carrier May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.098 [INFO][4619] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0 coredns-7c65d6cfc9- 
kube-system 58d25a41-fd53-47e5-aec2-9bbdbb20fba6 803 0 2025-05-27 17:57:13 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com coredns-7c65d6cfc9-cc6hv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib0c40ff5d76 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.099 [INFO][4619] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.246 [INFO][4668] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" HandleID="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Workload="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.246 [INFO][4668] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" HandleID="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Workload="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf9b0), Attrs:map[string]string{"namespace":"kube-system", "node":"srv-kh28t.gb1.brightbox.com", 
"pod":"coredns-7c65d6cfc9-cc6hv", "timestamp":"2025-05-27 17:57:57.246617272 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.246 [INFO][4668] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.312 [INFO][4668] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.313 [INFO][4668] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.351 [INFO][4668] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.381 [INFO][4668] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.400 [INFO][4668] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.406 [INFO][4668] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.422 [INFO][4668] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.423 [INFO][4668] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.9.128/26 
handle="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.431 [INFO][4668] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.448 [INFO][4668] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.9.128/26 handle="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.472 [INFO][4668] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.134/26] block=192.168.9.128/26 handle="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.473 [INFO][4668] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.134/26] handle="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.474 [INFO][4668] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:57:57.563488 containerd[1585]: 2025-05-27 17:57:57.475 [INFO][4668] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.134/26] IPv6=[] ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" HandleID="k8s-pod-network.1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Workload="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" May 27 17:57:57.567008 containerd[1585]: 2025-05-27 17:57:57.484 [INFO][4619] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"58d25a41-fd53-47e5-aec2-9bbdbb20fba6", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"coredns-7c65d6cfc9-cc6hv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.9.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"calib0c40ff5d76", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.567008 containerd[1585]: 2025-05-27 17:57:57.485 [INFO][4619] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.134/32] ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" May 27 17:57:57.567008 containerd[1585]: 2025-05-27 17:57:57.485 [INFO][4619] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib0c40ff5d76 ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" May 27 17:57:57.567008 containerd[1585]: 2025-05-27 17:57:57.520 [INFO][4619] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" May 27 17:57:57.567008 containerd[1585]: 2025-05-27 17:57:57.527 [INFO][4619] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" 
WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"58d25a41-fd53-47e5-aec2-9bbdbb20fba6", ResourceVersion:"803", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d", Pod:"coredns-7c65d6cfc9-cc6hv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.9.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib0c40ff5d76", MAC:"42:f0:d8:bb:37:0a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.567008 
containerd[1585]: 2025-05-27 17:57:57.552 [INFO][4619] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" Namespace="kube-system" Pod="coredns-7c65d6cfc9-cc6hv" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-coredns--7c65d6cfc9--cc6hv-eth0" May 27 17:57:57.577942 containerd[1585]: time="2025-05-27T17:57:57.577437098Z" level=info msg="connecting to shim 542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f" address="unix:///run/containerd/s/98914f35dc0a230c4e02705e5297a7ce24e5e180fe77ed86d3e52e2e618620af" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:57.631001 systemd[1]: Started cri-containerd-542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f.scope - libcontainer container 542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f. May 27 17:57:57.660362 systemd-networkd[1519]: calia5743fa3c73: Link UP May 27 17:57:57.663858 systemd-networkd[1519]: calia5743fa3c73: Gained carrier May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.170 [INFO][4625] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0 calico-apiserver-5dcf54bdf4- calico-apiserver 04782360-5124-401b-b0fa-cb2a35043c5d 796 0 2025-05-27 17:57:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5dcf54bdf4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com calico-apiserver-5dcf54bdf4-6rv2d eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia5743fa3c73 [] [] }} ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" 
WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.170 [INFO][4625] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.302 [INFO][4685] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" HandleID="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.303 [INFO][4685] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" HandleID="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00048daf0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-kh28t.gb1.brightbox.com", "pod":"calico-apiserver-5dcf54bdf4-6rv2d", "timestamp":"2025-05-27 17:57:57.302248015 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.303 [INFO][4685] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.474 [INFO][4685] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.476 [INFO][4685] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.498 [INFO][4685] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.536 [INFO][4685] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.562 [INFO][4685] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.568 [INFO][4685] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.576 [INFO][4685] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.577 [INFO][4685] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.9.128/26 handle="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.584 [INFO][4685] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.609 [INFO][4685] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.9.128/26 
handle="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.629 [INFO][4685] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.135/26] block=192.168.9.128/26 handle="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.630 [INFO][4685] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.135/26] handle="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.630 [INFO][4685] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:57:57.698298 containerd[1585]: 2025-05-27 17:57:57.630 [INFO][4685] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.135/26] IPv6=[] ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" HandleID="k8s-pod-network.adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" May 27 17:57:57.701655 containerd[1585]: 2025-05-27 17:57:57.642 [INFO][4625] cni-plugin/k8s.go 418: Populated endpoint ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0", GenerateName:"calico-apiserver-5dcf54bdf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"04782360-5124-401b-b0fa-cb2a35043c5d", ResourceVersion:"796", Generation:0, 
CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcf54bdf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-5dcf54bdf4-6rv2d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.9.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia5743fa3c73", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.701655 containerd[1585]: 2025-05-27 17:57:57.642 [INFO][4625] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.135/32] ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" May 27 17:57:57.701655 containerd[1585]: 2025-05-27 17:57:57.642 [INFO][4625] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia5743fa3c73 ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" May 27 17:57:57.701655 containerd[1585]: 2025-05-27 17:57:57.673 
[INFO][4625] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" May 27 17:57:57.701655 containerd[1585]: 2025-05-27 17:57:57.674 [INFO][4625] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0", GenerateName:"calico-apiserver-5dcf54bdf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"04782360-5124-401b-b0fa-cb2a35043c5d", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcf54bdf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c", Pod:"calico-apiserver-5dcf54bdf4-6rv2d", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.9.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia5743fa3c73", MAC:"f2:d7:ea:88:a3:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.701655 containerd[1585]: 2025-05-27 17:57:57.690 [INFO][4625] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-6rv2d" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--6rv2d-eth0" May 27 17:57:57.720059 containerd[1585]: time="2025-05-27T17:57:57.719859207Z" level=info msg="connecting to shim 1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d" address="unix:///run/containerd/s/46ff73d31320ceb3a8b35d29aa097d69afc2bb4713a318c701cf3f02407bba3d" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:57.782925 containerd[1585]: time="2025-05-27T17:57:57.782869446Z" level=info msg="connecting to shim adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c" address="unix:///run/containerd/s/259f362130955d703d919b473b247ecc2704b2a73205a2312363df1dda23678c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:57.808835 systemd-networkd[1519]: calibfcda6aa3b4: Link UP May 27 17:57:57.815068 systemd-networkd[1519]: calibfcda6aa3b4: Gained carrier May 27 17:57:57.864186 systemd[1]: Started cri-containerd-1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d.scope - libcontainer container 1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d. 
May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.152 [INFO][4656] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0 calico-apiserver-5dcf54bdf4- calico-apiserver fcd5b768-af40-448c-b34c-b9c1a53c8b71 806 0 2025-05-27 17:57:25 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5dcf54bdf4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s srv-kh28t.gb1.brightbox.com calico-apiserver-5dcf54bdf4-b8qsn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibfcda6aa3b4 [] [] }} ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.152 [INFO][4656] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.306 [INFO][4677] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" HandleID="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.306 [INFO][4677] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" HandleID="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004fac0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"srv-kh28t.gb1.brightbox.com", "pod":"calico-apiserver-5dcf54bdf4-b8qsn", "timestamp":"2025-05-27 17:57:57.306070332 +0000 UTC"}, Hostname:"srv-kh28t.gb1.brightbox.com", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.306 [INFO][4677] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.634 [INFO][4677] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.634 [INFO][4677] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'srv-kh28t.gb1.brightbox.com' May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.680 [INFO][4677] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.697 [INFO][4677] ipam/ipam.go 394: Looking up existing affinities for host host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.718 [INFO][4677] ipam/ipam.go 511: Trying affinity for 192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.728 [INFO][4677] ipam/ipam.go 158: Attempting to load block cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.740 [INFO][4677] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.9.128/26 host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.741 [INFO][4677] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.9.128/26 handle="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.745 [INFO][4677] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.756 [INFO][4677] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.9.128/26 handle="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.772 [INFO][4677] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.9.136/26] block=192.168.9.128/26 handle="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.773 [INFO][4677] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.9.136/26] handle="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" host="srv-kh28t.gb1.brightbox.com" May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.773 [INFO][4677] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:57:57.894761 containerd[1585]: 2025-05-27 17:57:57.774 [INFO][4677] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.9.136/26] IPv6=[] ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" HandleID="k8s-pod-network.d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Workload="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" May 27 17:57:57.895703 containerd[1585]: 2025-05-27 17:57:57.784 [INFO][4656] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0", GenerateName:"calico-apiserver-5dcf54bdf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"fcd5b768-af40-448c-b34c-b9c1a53c8b71", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcf54bdf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"", Pod:"calico-apiserver-5dcf54bdf4-b8qsn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.9.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibfcda6aa3b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.895703 containerd[1585]: 2025-05-27 17:57:57.784 [INFO][4656] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.9.136/32] ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" May 27 17:57:57.895703 containerd[1585]: 2025-05-27 17:57:57.784 [INFO][4656] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibfcda6aa3b4 ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" May 27 17:57:57.895703 containerd[1585]: 2025-05-27 17:57:57.820 [INFO][4656] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" 
Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" May 27 17:57:57.895703 containerd[1585]: 2025-05-27 17:57:57.831 [INFO][4656] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0", GenerateName:"calico-apiserver-5dcf54bdf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"fcd5b768-af40-448c-b34c-b9c1a53c8b71", ResourceVersion:"806", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 57, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5dcf54bdf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"srv-kh28t.gb1.brightbox.com", ContainerID:"d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee", Pod:"calico-apiserver-5dcf54bdf4-b8qsn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.9.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"calibfcda6aa3b4", MAC:"3e:19:2c:cf:e4:f4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:57:57.895703 containerd[1585]: 2025-05-27 17:57:57.870 [INFO][4656] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" Namespace="calico-apiserver" Pod="calico-apiserver-5dcf54bdf4-b8qsn" WorkloadEndpoint="srv--kh28t.gb1.brightbox.com-k8s-calico--apiserver--5dcf54bdf4--b8qsn-eth0" May 27 17:57:57.906401 systemd[1]: Started cri-containerd-adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c.scope - libcontainer container adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c. May 27 17:57:57.966182 containerd[1585]: time="2025-05-27T17:57:57.965857816Z" level=info msg="connecting to shim d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee" address="unix:///run/containerd/s/16bf5b011a8170683c0e84d9330988895dc97872b5cd1cd1e169acc827df8508" namespace=k8s.io protocol=ttrpc version=3 May 27 17:57:58.015861 systemd[1]: Started cri-containerd-d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee.scope - libcontainer container d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee. 
May 27 17:57:58.039525 containerd[1585]: time="2025-05-27T17:57:58.039449572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-cc6hv,Uid:58d25a41-fd53-47e5-aec2-9bbdbb20fba6,Namespace:kube-system,Attempt:0,} returns sandbox id \"1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d\"" May 27 17:57:58.056924 containerd[1585]: time="2025-05-27T17:57:58.056862918Z" level=info msg="CreateContainer within sandbox \"1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:57:58.110077 containerd[1585]: time="2025-05-27T17:57:58.109572996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-8f77d7b6c-lv5hg,Uid:3a7c34d5-6339-473f-9bc2-4a5f5fd961b4,Namespace:calico-system,Attempt:0,} returns sandbox id \"542a402f60b4e98bd2bdd6c305f2084f6575fbbf02c398347a42a1b17565f63f\"" May 27 17:57:58.118565 containerd[1585]: time="2025-05-27T17:57:58.118032207Z" level=info msg="Container 38b627d589ff581389151148a24793783ab27d358d434aba8434c6f08b8ff693: CDI devices from CRI Config.CDIDevices: []" May 27 17:57:58.137842 containerd[1585]: time="2025-05-27T17:57:58.137286697Z" level=info msg="CreateContainer within sandbox \"1cfc28e4c2019d12e5f30ba70416d722960e4ac88c16f86d74b5665a571ad68d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"38b627d589ff581389151148a24793783ab27d358d434aba8434c6f08b8ff693\"" May 27 17:57:58.141533 containerd[1585]: time="2025-05-27T17:57:58.141007654Z" level=info msg="StartContainer for \"38b627d589ff581389151148a24793783ab27d358d434aba8434c6f08b8ff693\"" May 27 17:57:58.145396 containerd[1585]: time="2025-05-27T17:57:58.145003339Z" level=info msg="connecting to shim 38b627d589ff581389151148a24793783ab27d358d434aba8434c6f08b8ff693" address="unix:///run/containerd/s/46ff73d31320ceb3a8b35d29aa097d69afc2bb4713a318c701cf3f02407bba3d" protocol=ttrpc version=3 May 27 17:57:58.214011 systemd[1]: Started 
cri-containerd-38b627d589ff581389151148a24793783ab27d358d434aba8434c6f08b8ff693.scope - libcontainer container 38b627d589ff581389151148a24793783ab27d358d434aba8434c6f08b8ff693. May 27 17:57:58.221656 containerd[1585]: time="2025-05-27T17:57:58.221323682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-6rv2d,Uid:04782360-5124-401b-b0fa-cb2a35043c5d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c\"" May 27 17:57:58.226100 containerd[1585]: time="2025-05-27T17:57:58.225100411Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5dcf54bdf4-b8qsn,Uid:fcd5b768-af40-448c-b34c-b9c1a53c8b71,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee\"" May 27 17:57:58.269068 containerd[1585]: time="2025-05-27T17:57:58.269021890Z" level=info msg="StartContainer for \"38b627d589ff581389151148a24793783ab27d358d434aba8434c6f08b8ff693\" returns successfully" May 27 17:57:58.495786 kubelet[2875]: I0527 17:57:58.495700 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-cc6hv" podStartSLOduration=45.482940803 podStartE2EDuration="45.482940803s" podCreationTimestamp="2025-05-27 17:57:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:57:58.478489927 +0000 UTC m=+50.798134826" watchObservedRunningTime="2025-05-27 17:57:58.482940803 +0000 UTC m=+50.802585685" May 27 17:57:58.503070 systemd-networkd[1519]: cali7440622ebd2: Gained IPv6LL May 27 17:57:58.806612 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4280256903.mount: Deactivated successfully. 
May 27 17:57:58.949019 systemd-networkd[1519]: calia5743fa3c73: Gained IPv6LL May 27 17:57:59.269039 systemd-networkd[1519]: calib0c40ff5d76: Gained IPv6LL May 27 17:57:59.460868 systemd-networkd[1519]: calibfcda6aa3b4: Gained IPv6LL May 27 17:58:01.268024 containerd[1585]: time="2025-05-27T17:58:01.267468648Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:01.270633 containerd[1585]: time="2025-05-27T17:58:01.270260045Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512" May 27 17:58:01.270633 containerd[1585]: time="2025-05-27T17:58:01.270516876Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:01.280700 containerd[1585]: time="2025-05-27T17:58:01.280135942Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:01.281373 containerd[1585]: time="2025-05-27T17:58:01.281216069Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 4.504298554s" May 27 17:58:01.281373 containerd[1585]: time="2025-05-27T17:58:01.281257259Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\"" May 27 17:58:01.284374 containerd[1585]: 
time="2025-05-27T17:58:01.284346252Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 17:58:01.317881 containerd[1585]: time="2025-05-27T17:58:01.317582434Z" level=info msg="CreateContainer within sandbox \"3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 17:58:01.337899 containerd[1585]: time="2025-05-27T17:58:01.337851531Z" level=info msg="Container 9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086: CDI devices from CRI Config.CDIDevices: []" May 27 17:58:01.351205 containerd[1585]: time="2025-05-27T17:58:01.351098423Z" level=info msg="CreateContainer within sandbox \"3d25358bf9567d843992d137d68e00d6893722eb048abfe8eed34b8240f317a5\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\"" May 27 17:58:01.352037 containerd[1585]: time="2025-05-27T17:58:01.351999312Z" level=info msg="StartContainer for \"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\"" May 27 17:58:01.355189 containerd[1585]: time="2025-05-27T17:58:01.355147277Z" level=info msg="connecting to shim 9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086" address="unix:///run/containerd/s/1f4eaf1c5b8b91675d9af97ff3a58d35e079e407f8288517d7565f19bf34f0a5" protocol=ttrpc version=3 May 27 17:58:01.406886 systemd[1]: Started cri-containerd-9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086.scope - libcontainer container 9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086. 
May 27 17:58:01.591246 containerd[1585]: time="2025-05-27T17:58:01.590612571Z" level=info msg="StartContainer for \"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\" returns successfully" May 27 17:58:02.685582 containerd[1585]: time="2025-05-27T17:58:02.685516762Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\" id:\"35167f1a8ae4f8df7ad9296495cf2e34b9333c8165d0f23025a62456365547c2\" pid:5033 exited_at:{seconds:1748368682 nanos:684515936}" May 27 17:58:02.726438 kubelet[2875]: I0527 17:58:02.724576 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-78b7d7669b-pvkq7" podStartSLOduration=27.859104666 podStartE2EDuration="33.724516124s" podCreationTimestamp="2025-05-27 17:57:29 +0000 UTC" firstStartedPulling="2025-05-27 17:57:55.418410636 +0000 UTC m=+47.738055517" lastFinishedPulling="2025-05-27 17:58:01.283822093 +0000 UTC m=+53.603466975" observedRunningTime="2025-05-27 17:58:02.554341056 +0000 UTC m=+54.873985949" watchObservedRunningTime="2025-05-27 17:58:02.724516124 +0000 UTC m=+55.044161012" May 27 17:58:03.501498 containerd[1585]: time="2025-05-27T17:58:03.501261139Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:03.505319 containerd[1585]: time="2025-05-27T17:58:03.505287967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639" May 27 17:58:03.506937 containerd[1585]: time="2025-05-27T17:58:03.506253673Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:03.533396 containerd[1585]: time="2025-05-27T17:58:03.532699351Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:03.533935 containerd[1585]: time="2025-05-27T17:58:03.533693166Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 2.248870541s" May 27 17:58:03.533935 containerd[1585]: time="2025-05-27T17:58:03.533743483Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\"" May 27 17:58:03.538372 containerd[1585]: time="2025-05-27T17:58:03.538310193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:58:03.542426 containerd[1585]: time="2025-05-27T17:58:03.542382991Z" level=info msg="CreateContainer within sandbox \"d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 17:58:03.559685 containerd[1585]: time="2025-05-27T17:58:03.558889238Z" level=info msg="Container 65f8c2900ee1c10deeb77d282f7e5fa7c9b924359044f1b8c3921de43ad746b7: CDI devices from CRI Config.CDIDevices: []" May 27 17:58:03.570656 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount189190256.mount: Deactivated successfully. 
May 27 17:58:03.648784 containerd[1585]: time="2025-05-27T17:58:03.648700647Z" level=info msg="CreateContainer within sandbox \"d52e03e753ef86d8c4475750f227d954df8b1708a6da560b33fbd5edb3a264ab\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"65f8c2900ee1c10deeb77d282f7e5fa7c9b924359044f1b8c3921de43ad746b7\"" May 27 17:58:03.650527 containerd[1585]: time="2025-05-27T17:58:03.650461541Z" level=info msg="StartContainer for \"65f8c2900ee1c10deeb77d282f7e5fa7c9b924359044f1b8c3921de43ad746b7\"" May 27 17:58:03.654586 containerd[1585]: time="2025-05-27T17:58:03.654539270Z" level=info msg="connecting to shim 65f8c2900ee1c10deeb77d282f7e5fa7c9b924359044f1b8c3921de43ad746b7" address="unix:///run/containerd/s/0523d5d2ef243e39797ea5af89f8cf0667ed8b6e2c0ea0255005df460cfe76de" protocol=ttrpc version=3 May 27 17:58:03.697195 systemd[1]: Started cri-containerd-65f8c2900ee1c10deeb77d282f7e5fa7c9b924359044f1b8c3921de43ad746b7.scope - libcontainer container 65f8c2900ee1c10deeb77d282f7e5fa7c9b924359044f1b8c3921de43ad746b7. 
May 27 17:58:03.766806 containerd[1585]: time="2025-05-27T17:58:03.766002578Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:58:03.772645 containerd[1585]: time="2025-05-27T17:58:03.772607249Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:58:03.784416 containerd[1585]: time="2025-05-27T17:58:03.784332673Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:58:03.790370 kubelet[2875]: E0527 17:58:03.790191 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:58:03.793433 kubelet[2875]: E0527 17:58:03.793267 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:58:03.795866 containerd[1585]: time="2025-05-27T17:58:03.794760553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:58:03.811125 kubelet[2875]: E0527 17:58:03.811001 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96shr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-lv5hg_calico-system(3a7c34d5-6339-473f-9bc2-4a5f5fd961b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:58:03.815476 kubelet[2875]: E0527 17:58:03.815430 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4" May 27 17:58:03.877589 containerd[1585]: time="2025-05-27T17:58:03.877534501Z" level=info msg="StartContainer for \"65f8c2900ee1c10deeb77d282f7e5fa7c9b924359044f1b8c3921de43ad746b7\" returns successfully" May 27 17:58:04.266392 kubelet[2875]: I0527 17:58:04.266331 2875 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 17:58:04.266624 kubelet[2875]: I0527 17:58:04.266429 2875 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 17:58:04.534010 kubelet[2875]: E0527 17:58:04.533388 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4" May 27 17:58:04.582564 kubelet[2875]: I0527 17:58:04.582486 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9t7wk" podStartSLOduration=26.755347737 podStartE2EDuration="35.582464749s" podCreationTimestamp="2025-05-27 17:57:29 +0000 UTC" firstStartedPulling="2025-05-27 17:57:54.710840268 +0000 UTC m=+47.030485143" lastFinishedPulling="2025-05-27 17:58:03.537957255 +0000 UTC m=+55.857602155" observedRunningTime="2025-05-27 17:58:04.57959099 +0000 UTC m=+56.899235903" watchObservedRunningTime="2025-05-27 17:58:04.582464749 +0000 UTC m=+56.902109637" May 27 17:58:08.850801 containerd[1585]: time="2025-05-27T17:58:08.850739242Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:08.858374 containerd[1585]: time="2025-05-27T17:58:08.858334692Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431" May 27 17:58:08.865689 containerd[1585]: time="2025-05-27T17:58:08.864838220Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:08.867222 containerd[1585]: time="2025-05-27T17:58:08.867185976Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:08.868181 containerd[1585]: time="2025-05-27T17:58:08.868143240Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 5.07331454s" May 27 17:58:08.868269 containerd[1585]: time="2025-05-27T17:58:08.868184232Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 17:58:08.871065 containerd[1585]: time="2025-05-27T17:58:08.871034561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:58:08.885474 containerd[1585]: time="2025-05-27T17:58:08.885396823Z" level=info msg="CreateContainer within sandbox \"adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:58:08.911703 containerd[1585]: time="2025-05-27T17:58:08.910859809Z" 
level=info msg="Container dffa57846978d3b891df7258257654d4bad260294e0920bea6f19b0cdcfe0f50: CDI devices from CRI Config.CDIDevices: []" May 27 17:58:08.922220 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2378477048.mount: Deactivated successfully. May 27 17:58:08.926836 containerd[1585]: time="2025-05-27T17:58:08.923454057Z" level=info msg="CreateContainer within sandbox \"adef821da72e26503399fb51407f765ab4fd61e43a9b437eda027ae9ea49c00c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"dffa57846978d3b891df7258257654d4bad260294e0920bea6f19b0cdcfe0f50\"" May 27 17:58:08.926836 containerd[1585]: time="2025-05-27T17:58:08.924425055Z" level=info msg="StartContainer for \"dffa57846978d3b891df7258257654d4bad260294e0920bea6f19b0cdcfe0f50\"" May 27 17:58:08.934054 containerd[1585]: time="2025-05-27T17:58:08.933989430Z" level=info msg="connecting to shim dffa57846978d3b891df7258257654d4bad260294e0920bea6f19b0cdcfe0f50" address="unix:///run/containerd/s/259f362130955d703d919b473b247ecc2704b2a73205a2312363df1dda23678c" protocol=ttrpc version=3 May 27 17:58:08.972972 systemd[1]: Started cri-containerd-dffa57846978d3b891df7258257654d4bad260294e0920bea6f19b0cdcfe0f50.scope - libcontainer container dffa57846978d3b891df7258257654d4bad260294e0920bea6f19b0cdcfe0f50. 
May 27 17:58:09.113847 containerd[1585]: time="2025-05-27T17:58:09.113120210Z" level=info msg="StartContainer for \"dffa57846978d3b891df7258257654d4bad260294e0920bea6f19b0cdcfe0f50\" returns successfully" May 27 17:58:09.275864 containerd[1585]: time="2025-05-27T17:58:09.275799641Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:58:09.277990 containerd[1585]: time="2025-05-27T17:58:09.277955949Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 17:58:09.279975 containerd[1585]: time="2025-05-27T17:58:09.279923980Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 408.81451ms" May 27 17:58:09.280193 containerd[1585]: time="2025-05-27T17:58:09.279981455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 17:58:09.282525 containerd[1585]: time="2025-05-27T17:58:09.282294819Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:58:09.284469 containerd[1585]: time="2025-05-27T17:58:09.284434374Z" level=info msg="CreateContainer within sandbox \"d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:58:09.294871 containerd[1585]: time="2025-05-27T17:58:09.294804185Z" level=info msg="Container a39ede36249dbc3ade6b9debb77ee2d4420f8188033b8d89b7f606ee8185a90e: CDI devices from CRI Config.CDIDevices: []" May 27 17:58:09.315836 containerd[1585]: 
time="2025-05-27T17:58:09.315791047Z" level=info msg="CreateContainer within sandbox \"d2ecf47493f047741a4cb24044fb608532c3dcd201253ab6e1ffd7d1dbeacaee\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a39ede36249dbc3ade6b9debb77ee2d4420f8188033b8d89b7f606ee8185a90e\"" May 27 17:58:09.318865 containerd[1585]: time="2025-05-27T17:58:09.318836853Z" level=info msg="StartContainer for \"a39ede36249dbc3ade6b9debb77ee2d4420f8188033b8d89b7f606ee8185a90e\"" May 27 17:58:09.320304 containerd[1585]: time="2025-05-27T17:58:09.320271632Z" level=info msg="connecting to shim a39ede36249dbc3ade6b9debb77ee2d4420f8188033b8d89b7f606ee8185a90e" address="unix:///run/containerd/s/16bf5b011a8170683c0e84d9330988895dc97872b5cd1cd1e169acc827df8508" protocol=ttrpc version=3 May 27 17:58:09.373330 systemd[1]: Started cri-containerd-a39ede36249dbc3ade6b9debb77ee2d4420f8188033b8d89b7f606ee8185a90e.scope - libcontainer container a39ede36249dbc3ade6b9debb77ee2d4420f8188033b8d89b7f606ee8185a90e. 
May 27 17:58:09.505914 containerd[1585]: time="2025-05-27T17:58:09.505642591Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:58:09.508058 containerd[1585]: time="2025-05-27T17:58:09.507650081Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:58:09.508058 containerd[1585]: time="2025-05-27T17:58:09.507661829Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:58:09.508633 kubelet[2875]: E0527 17:58:09.508116 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:58:09.508633 kubelet[2875]: E0527 17:58:09.508188 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:58:09.510431 kubelet[2875]: E0527 17:58:09.508903 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:2fa70a3d9d1f41939d7bb3cd196d2e7b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:58:09.514106 containerd[1585]: time="2025-05-27T17:58:09.513754413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:58:09.570445 containerd[1585]: time="2025-05-27T17:58:09.568084293Z" level=info msg="StartContainer for \"a39ede36249dbc3ade6b9debb77ee2d4420f8188033b8d89b7f606ee8185a90e\" returns successfully" May 27 17:58:09.630188 kubelet[2875]: I0527 17:58:09.628654 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5dcf54bdf4-6rv2d" podStartSLOduration=33.983837169 podStartE2EDuration="44.628631011s" podCreationTimestamp="2025-05-27 17:57:25 +0000 UTC" firstStartedPulling="2025-05-27 17:57:58.225775568 +0000 UTC m=+50.545420448" lastFinishedPulling="2025-05-27 17:58:08.87056941 +0000 UTC m=+61.190214290" observedRunningTime="2025-05-27 17:58:09.60793927 +0000 UTC m=+61.927584184" watchObservedRunningTime="2025-05-27 17:58:09.628631011 +0000 UTC m=+61.948275892" May 27 17:58:09.745870 containerd[1585]: time="2025-05-27T17:58:09.745449078Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:58:09.754761 containerd[1585]: time="2025-05-27T17:58:09.754705168Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:58:09.754921 containerd[1585]: time="2025-05-27T17:58:09.754705198Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:58:09.755904 kubelet[2875]: E0527 17:58:09.755841 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:58:09.756051 kubelet[2875]: E0527 17:58:09.755908 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:58:09.756235 kubelet[2875]: E0527 17:58:09.756102 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:58:09.758249 kubelet[2875]: E0527 17:58:09.758066 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686" May 27 17:58:10.606128 kubelet[2875]: I0527 17:58:10.606082 2875 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:58:11.610896 kubelet[2875]: I0527 17:58:11.610857 2875 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:58:11.783046 containerd[1585]: time="2025-05-27T17:58:11.782188418Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\" id:\"ac7f7ff7ea8d94168ee9ce821a5749ed3f869cd34bf4c8e5b0e8f46c4e433061\" pid:5184 
exited_at:{seconds:1748368691 nanos:739081025}" May 27 17:58:12.432340 kubelet[2875]: I0527 17:58:12.432194 2875 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5dcf54bdf4-b8qsn" podStartSLOduration=36.37853496 podStartE2EDuration="47.431221427s" podCreationTimestamp="2025-05-27 17:57:25 +0000 UTC" firstStartedPulling="2025-05-27 17:57:58.229216689 +0000 UTC m=+50.548861563" lastFinishedPulling="2025-05-27 17:58:09.281903136 +0000 UTC m=+61.601548030" observedRunningTime="2025-05-27 17:58:09.629280268 +0000 UTC m=+61.948925174" watchObservedRunningTime="2025-05-27 17:58:12.431221427 +0000 UTC m=+64.750866324" May 27 17:58:19.025860 containerd[1585]: time="2025-05-27T17:58:19.025797651Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:58:19.249519 containerd[1585]: time="2025-05-27T17:58:19.249411753Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:58:19.252819 containerd[1585]: time="2025-05-27T17:58:19.252749460Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:58:19.252968 containerd[1585]: time="2025-05-27T17:58:19.252847278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:58:19.263455 kubelet[2875]: E0527 17:58:19.263367 2875 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:58:19.264811 kubelet[2875]: E0527 17:58:19.263486 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:58:19.267149 kubelet[2875]: E0527 17:58:19.267057 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96shr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-lv5hg_calico-system(3a7c34d5-6339-473f-9bc2-4a5f5fd961b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:58:19.268345 kubelet[2875]: E0527 17:58:19.268303 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4" May 27 17:58:20.993286 kubelet[2875]: E0527 
17:58:20.993117 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686" May 27 17:58:26.368581 containerd[1585]: time="2025-05-27T17:58:26.368501173Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\" id:\"362d162ccfc23c18a41cbef10a54521d9ca16ba57f083cc69a78ec70b39dd575\" pid:5222 exited_at:{seconds:1748368706 nanos:367821974}" May 27 17:58:32.001865 kubelet[2875]: E0527 17:58:32.001430 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4" May 27 17:58:33.981260 kubelet[2875]: I0527 17:58:33.981158 2875 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:58:36.001092 containerd[1585]: time="2025-05-27T17:58:36.001034072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:58:36.256006 containerd[1585]: time="2025-05-27T17:58:36.255366188Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:58:36.257849 containerd[1585]: time="2025-05-27T17:58:36.257808508Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 27 17:58:36.258585 containerd[1585]: time="2025-05-27T17:58:36.257912048Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 27 17:58:36.258728 kubelet[2875]: E0527 17:58:36.258378 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 17:58:36.259952 kubelet[2875]: E0527 17:58:36.258996 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 17:58:36.259952 kubelet[2875]: E0527 17:58:36.259130 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:2fa70a3d9d1f41939d7bb3cd196d2e7b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 17:58:36.261285 containerd[1585]: time="2025-05-27T17:58:36.261258140Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 27 17:58:36.313228 systemd[1]: Started sshd@10-10.230.41.6:22-122.154.116.98:60963.service - OpenSSH per-connection server daemon (122.154.116.98:60963).
May 27 17:58:36.361754 systemd[1]: Started sshd@11-10.230.41.6:22-122.154.116.98:63846.service - OpenSSH per-connection server daemon (122.154.116.98:63846).
May 27 17:58:36.492405 sshd[5256]: Connection closed by 122.154.116.98 port 63846
May 27 17:58:36.494883 systemd[1]: sshd@11-10.230.41.6:22-122.154.116.98:63846.service: Deactivated successfully.
May 27 17:58:36.503134 containerd[1585]: time="2025-05-27T17:58:36.503055157Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 17:58:36.505171 containerd[1585]: time="2025-05-27T17:58:36.505108925Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 27 17:58:36.505308 containerd[1585]: time="2025-05-27T17:58:36.505232735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 27 17:58:36.505923 kubelet[2875]: E0527 17:58:36.505859 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 17:58:36.506481 kubelet[2875]: E0527 17:58:36.505949 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 17:58:36.506481 kubelet[2875]: E0527 17:58:36.506139 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 17:58:36.508473 kubelet[2875]: E0527 17:58:36.508419 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686"
May 27 17:58:36.508950 sshd[5254]: Connection closed by 122.154.116.98 port 60963
May 27 17:58:36.512002 systemd[1]: sshd@10-10.230.41.6:22-122.154.116.98:60963.service: Deactivated successfully.
May 27 17:58:38.052103 systemd[1]: Started sshd@12-10.230.41.6:22-122.154.116.98:58039.service - OpenSSH per-connection server daemon (122.154.116.98:58039).
May 27 17:58:38.123241 systemd[1]: Started sshd@13-10.230.41.6:22-122.154.116.98:56143.service - OpenSSH per-connection server daemon (122.154.116.98:56143).
May 27 17:58:39.209733 sshd[5262]: Connection closed by authenticating user root 122.154.116.98 port 58039 [preauth]
May 27 17:58:39.214480 systemd[1]: sshd@12-10.230.41.6:22-122.154.116.98:58039.service: Deactivated successfully.
May 27 17:58:39.294819 sshd[5264]: Connection closed by authenticating user root 122.154.116.98 port 56143 [preauth]
May 27 17:58:39.297807 systemd[1]: sshd@13-10.230.41.6:22-122.154.116.98:56143.service: Deactivated successfully.
May 27 17:58:41.108041 systemd[1]: Started sshd@14-10.230.41.6:22-122.154.116.98:51127.service - OpenSSH per-connection server daemon (122.154.116.98:51127).
May 27 17:58:41.223947 systemd[1]: Started sshd@15-10.230.41.6:22-122.154.116.98:61214.service - OpenSSH per-connection server daemon (122.154.116.98:61214).
May 27 17:58:41.705281 containerd[1585]: time="2025-05-27T17:58:41.705212782Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\" id:\"d22b3b6d49e252a94f622d3e42302faf534958540881fb7c3388c8894a8d8716\" pid:5295 exited_at:{seconds:1748368721 nanos:704799266}"
May 27 17:58:41.904135 systemd[1]: Started sshd@16-10.230.41.6:22-139.178.68.195:58406.service - OpenSSH per-connection server daemon (139.178.68.195:58406).
May 27 17:58:41.964058 sshd[5274]: Invalid user admin from 122.154.116.98 port 51127
May 27 17:58:42.122232 sshd[5278]: Invalid user admin from 122.154.116.98 port 61214
May 27 17:58:42.157065 sshd[5274]: Connection closed by invalid user admin 122.154.116.98 port 51127 [preauth]
May 27 17:58:42.162629 systemd[1]: sshd@14-10.230.41.6:22-122.154.116.98:51127.service: Deactivated successfully.
May 27 17:58:42.331719 sshd[5278]: Connection closed by invalid user admin 122.154.116.98 port 61214 [preauth]
May 27 17:58:42.334251 systemd[1]: sshd@15-10.230.41.6:22-122.154.116.98:61214.service: Deactivated successfully.
May 27 17:58:42.886035 sshd[5305]: Accepted publickey for core from 139.178.68.195 port 58406 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:58:42.889449 sshd-session[5305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:58:42.907751 systemd-logind[1561]: New session 12 of user core.
May 27 17:58:42.915211 systemd[1]: Started session-12.scope - Session 12 of User core.
May 27 17:58:44.115861 systemd[1]: Started sshd@17-10.230.41.6:22-122.154.116.98:55576.service - OpenSSH per-connection server daemon (122.154.116.98:55576).
May 27 17:58:44.236282 systemd[1]: Started sshd@18-10.230.41.6:22-122.154.116.98:51808.service - OpenSSH per-connection server daemon (122.154.116.98:51808).
May 27 17:58:44.310807 sshd[5311]: Connection closed by 139.178.68.195 port 58406
May 27 17:58:44.310800 sshd-session[5305]: pam_unix(sshd:session): session closed for user core
May 27 17:58:44.324415 systemd-logind[1561]: Session 12 logged out. Waiting for processes to exit.
May 27 17:58:44.326268 systemd[1]: sshd@16-10.230.41.6:22-139.178.68.195:58406.service: Deactivated successfully.
May 27 17:58:44.330897 systemd[1]: session-12.scope: Deactivated successfully.
May 27 17:58:44.339996 systemd-logind[1561]: Removed session 12.
May 27 17:58:45.275852 sshd[5322]: Connection closed by authenticating user root 122.154.116.98 port 55576 [preauth]
May 27 17:58:45.277960 systemd[1]: sshd@17-10.230.41.6:22-122.154.116.98:55576.service: Deactivated successfully.
May 27 17:58:45.356260 sshd[5325]: Connection closed by authenticating user root 122.154.116.98 port 51808 [preauth]
May 27 17:58:45.359833 systemd[1]: sshd@18-10.230.41.6:22-122.154.116.98:51808.service: Deactivated successfully.
May 27 17:58:46.002555 containerd[1585]: time="2025-05-27T17:58:46.002510292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 17:58:46.227266 containerd[1585]: time="2025-05-27T17:58:46.227010479Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 17:58:46.228758 containerd[1585]: time="2025-05-27T17:58:46.228207636Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 17:58:46.229152 containerd[1585]: time="2025-05-27T17:58:46.228949170Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 17:58:46.229373 kubelet[2875]: E0527 17:58:46.229279 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 17:58:46.229373 kubelet[2875]: E0527 17:58:46.229360 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 17:58:46.230360 kubelet[2875]: E0527 17:58:46.229538 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96shr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-lv5hg_calico-system(3a7c34d5-6339-473f-9bc2-4a5f5fd961b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 17:58:46.231618 kubelet[2875]: E0527 17:58:46.231575 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4"
May 27 17:58:47.079976 systemd[1]: Started sshd@19-10.230.41.6:22-122.154.116.98:57786.service - OpenSSH per-connection server daemon (122.154.116.98:57786).
May 27 17:58:47.169742 systemd[1]: Started sshd@20-10.230.41.6:22-122.154.116.98:65015.service - OpenSSH per-connection server daemon (122.154.116.98:65015).
May 27 17:58:47.902549 sshd[5337]: Invalid user admin from 122.154.116.98 port 57786
May 27 17:58:48.098717 sshd[5337]: Connection closed by invalid user admin 122.154.116.98 port 57786 [preauth]
May 27 17:58:48.101944 systemd[1]: sshd@19-10.230.41.6:22-122.154.116.98:57786.service: Deactivated successfully.
May 27 17:58:48.112756 sshd[5340]: Invalid user admin from 122.154.116.98 port 65015
May 27 17:58:48.335150 sshd[5340]: Connection closed by invalid user admin 122.154.116.98 port 65015 [preauth]
May 27 17:58:48.339794 systemd[1]: sshd@20-10.230.41.6:22-122.154.116.98:65015.service: Deactivated successfully.
May 27 17:58:49.464062 systemd[1]: Started sshd@21-10.230.41.6:22-139.178.68.195:34122.service - OpenSSH per-connection server daemon (139.178.68.195:34122).
May 27 17:58:49.881037 systemd[1]: Started sshd@22-10.230.41.6:22-122.154.116.98:57805.service - OpenSSH per-connection server daemon (122.154.116.98:57805).
May 27 17:58:50.083962 systemd[1]: Started sshd@23-10.230.41.6:22-122.154.116.98:54686.service - OpenSSH per-connection server daemon (122.154.116.98:54686).
May 27 17:58:50.381579 sshd[5348]: Accepted publickey for core from 139.178.68.195 port 34122 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:58:50.385006 sshd-session[5348]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:58:50.394191 systemd-logind[1561]: New session 13 of user core.
May 27 17:58:50.403858 systemd[1]: Started session-13.scope - Session 13 of User core.
May 27 17:58:50.709074 sshd[5351]: Invalid user postgres from 122.154.116.98 port 57805
May 27 17:58:50.904080 sshd[5351]: Connection closed by invalid user postgres 122.154.116.98 port 57805 [preauth]
May 27 17:58:50.906829 systemd[1]: sshd@22-10.230.41.6:22-122.154.116.98:57805.service: Deactivated successfully.
May 27 17:58:50.914591 sshd[5354]: Invalid user postgres from 122.154.116.98 port 54686
May 27 17:58:50.993563 kubelet[2875]: E0527 17:58:50.993398 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686"
May 27 17:58:51.109656 sshd[5354]: Connection closed by invalid user postgres 122.154.116.98 port 54686 [preauth]
May 27 17:58:51.113179 systemd[1]: sshd@23-10.230.41.6:22-122.154.116.98:54686.service: Deactivated successfully.
May 27 17:58:51.131387 sshd[5356]: Connection closed by 139.178.68.195 port 34122
May 27 17:58:51.132243 sshd-session[5348]: pam_unix(sshd:session): session closed for user core
May 27 17:58:51.136809 systemd[1]: sshd@21-10.230.41.6:22-139.178.68.195:34122.service: Deactivated successfully.
May 27 17:58:51.139619 systemd[1]: session-13.scope: Deactivated successfully.
May 27 17:58:51.142152 systemd-logind[1561]: Session 13 logged out. Waiting for processes to exit.
May 27 17:58:51.144407 systemd-logind[1561]: Removed session 13.
May 27 17:58:52.804415 systemd[1]: Started sshd@24-10.230.41.6:22-122.154.116.98:53448.service - OpenSSH per-connection server daemon (122.154.116.98:53448).
May 27 17:58:53.038614 systemd[1]: Started sshd@25-10.230.41.6:22-122.154.116.98:57885.service - OpenSSH per-connection server daemon (122.154.116.98:57885).
May 27 17:58:53.639466 sshd[5373]: Invalid user user from 122.154.116.98 port 53448
May 27 17:58:53.833760 sshd[5373]: Connection closed by invalid user user 122.154.116.98 port 53448 [preauth]
May 27 17:58:53.836046 systemd[1]: sshd@24-10.230.41.6:22-122.154.116.98:53448.service: Deactivated successfully.
May 27 17:58:53.916702 sshd[5376]: Invalid user user from 122.154.116.98 port 57885
May 27 17:58:54.123833 sshd[5376]: Connection closed by invalid user user 122.154.116.98 port 57885 [preauth]
May 27 17:58:54.126346 systemd[1]: sshd@25-10.230.41.6:22-122.154.116.98:57885.service: Deactivated successfully.
May 27 17:58:55.672757 systemd[1]: Started sshd@26-10.230.41.6:22-122.154.116.98:62764.service - OpenSSH per-connection server daemon (122.154.116.98:62764).
May 27 17:58:55.860361 systemd[1]: Started sshd@27-10.230.41.6:22-122.154.116.98:62552.service - OpenSSH per-connection server daemon (122.154.116.98:62552).
May 27 17:58:56.293191 systemd[1]: Started sshd@28-10.230.41.6:22-139.178.68.195:44580.service - OpenSSH per-connection server daemon (139.178.68.195:44580).
May 27 17:58:56.424968 containerd[1585]: time="2025-05-27T17:58:56.424901202Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\" id:\"47838f8cdbdbcbe46093817c0d379be651cd48a5c82b5717f53bce1190210f8f\" pid:5400 exited_at:{seconds:1748368736 nanos:424276497}"
May 27 17:58:56.558963 sshd[5383]: Invalid user dspace from 122.154.116.98 port 62764
May 27 17:58:56.708546 sshd[5386]: Invalid user dspace from 122.154.116.98 port 62552
May 27 17:58:56.766723 sshd[5383]: Connection closed by invalid user dspace 122.154.116.98 port 62764 [preauth]
May 27 17:58:56.769686 systemd[1]: sshd@26-10.230.41.6:22-122.154.116.98:62764.service: Deactivated successfully.
May 27 17:58:56.903205 sshd[5386]: Connection closed by invalid user dspace 122.154.116.98 port 62552 [preauth]
May 27 17:58:56.905791 systemd[1]: sshd@27-10.230.41.6:22-122.154.116.98:62552.service: Deactivated successfully.
May 27 17:58:57.231632 sshd[5411]: Accepted publickey for core from 139.178.68.195 port 44580 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:58:57.234880 sshd-session[5411]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:58:57.256265 systemd-logind[1561]: New session 14 of user core.
May 27 17:58:57.265891 systemd[1]: Started session-14.scope - Session 14 of User core.
May 27 17:58:57.986774 sshd[5419]: Connection closed by 139.178.68.195 port 44580
May 27 17:58:57.987687 sshd-session[5411]: pam_unix(sshd:session): session closed for user core
May 27 17:58:57.994624 systemd[1]: sshd@28-10.230.41.6:22-139.178.68.195:44580.service: Deactivated successfully.
May 27 17:58:57.997539 systemd[1]: session-14.scope: Deactivated successfully.
May 27 17:58:57.999376 systemd-logind[1561]: Session 14 logged out. Waiting for processes to exit.
May 27 17:58:58.001742 systemd-logind[1561]: Removed session 14.
May 27 17:58:58.150545 systemd[1]: Started sshd@29-10.230.41.6:22-139.178.68.195:44596.service - OpenSSH per-connection server daemon (139.178.68.195:44596).
May 27 17:58:58.457237 systemd[1]: Started sshd@30-10.230.41.6:22-122.154.116.98:62412.service - OpenSSH per-connection server daemon (122.154.116.98:62412).
May 27 17:58:58.650655 systemd[1]: Started sshd@31-10.230.41.6:22-122.154.116.98:63981.service - OpenSSH per-connection server daemon (122.154.116.98:63981).
May 27 17:58:58.991538 kubelet[2875]: E0527 17:58:58.991200 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4"
May 27 17:58:59.098876 sshd[5432]: Accepted publickey for core from 139.178.68.195 port 44596 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:58:59.102294 sshd-session[5432]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:58:59.109751 systemd-logind[1561]: New session 15 of user core.
May 27 17:58:59.115838 systemd[1]: Started session-15.scope - Session 15 of User core.
May 27 17:58:59.422899 sshd[5435]: Invalid user test from 122.154.116.98 port 62412
May 27 17:58:59.560337 sshd[5438]: Invalid user test from 122.154.116.98 port 63981
May 27 17:58:59.657088 sshd[5435]: Connection closed by invalid user test 122.154.116.98 port 62412 [preauth]
May 27 17:58:59.659086 systemd[1]: sshd@30-10.230.41.6:22-122.154.116.98:62412.service: Deactivated successfully.
May 27 17:58:59.781476 sshd[5438]: Connection closed by invalid user test 122.154.116.98 port 63981 [preauth]
May 27 17:58:59.785424 systemd[1]: sshd@31-10.230.41.6:22-122.154.116.98:63981.service: Deactivated successfully.
May 27 17:58:59.885033 sshd[5440]: Connection closed by 139.178.68.195 port 44596
May 27 17:58:59.885469 sshd-session[5432]: pam_unix(sshd:session): session closed for user core
May 27 17:58:59.892223 systemd[1]: sshd@29-10.230.41.6:22-139.178.68.195:44596.service: Deactivated successfully.
May 27 17:58:59.896438 systemd[1]: session-15.scope: Deactivated successfully.
May 27 17:58:59.898890 systemd-logind[1561]: Session 15 logged out. Waiting for processes to exit.
May 27 17:58:59.900906 systemd-logind[1561]: Removed session 15.
May 27 17:59:00.040961 systemd[1]: Started sshd@32-10.230.41.6:22-139.178.68.195:44604.service - OpenSSH per-connection server daemon (139.178.68.195:44604).
May 27 17:59:00.954935 sshd[5455]: Accepted publickey for core from 139.178.68.195 port 44604 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:59:00.956930 sshd-session[5455]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:59:00.965434 systemd-logind[1561]: New session 16 of user core.
May 27 17:59:00.972889 systemd[1]: Started session-16.scope - Session 16 of User core.
May 27 17:59:01.413348 systemd[1]: Started sshd@33-10.230.41.6:22-122.154.116.98:61472.service - OpenSSH per-connection server daemon (122.154.116.98:61472).
May 27 17:59:01.561843 systemd[1]: Started sshd@34-10.230.41.6:22-122.154.116.98:64153.service - OpenSSH per-connection server daemon (122.154.116.98:64153).
May 27 17:59:01.672370 sshd[5457]: Connection closed by 139.178.68.195 port 44604
May 27 17:59:01.673113 sshd-session[5455]: pam_unix(sshd:session): session closed for user core
May 27 17:59:01.678280 systemd[1]: sshd@32-10.230.41.6:22-139.178.68.195:44604.service: Deactivated successfully.
May 27 17:59:01.682222 systemd[1]: session-16.scope: Deactivated successfully.
May 27 17:59:01.683919 systemd-logind[1561]: Session 16 logged out. Waiting for processes to exit.
May 27 17:59:01.686145 systemd-logind[1561]: Removed session 16.
May 27 17:59:02.566170 sshd[5459]: Connection closed by authenticating user root 122.154.116.98 port 61472 [preauth]
May 27 17:59:02.569300 systemd[1]: sshd@33-10.230.41.6:22-122.154.116.98:61472.service: Deactivated successfully.
May 27 17:59:02.825939 sshd[5469]: Connection closed by authenticating user root 122.154.116.98 port 64153 [preauth]
May 27 17:59:02.828401 systemd[1]: sshd@34-10.230.41.6:22-122.154.116.98:64153.service: Deactivated successfully.
May 27 17:59:04.135910 systemd[1]: Started sshd@35-10.230.41.6:22-122.154.116.98:61988.service - OpenSSH per-connection server daemon (122.154.116.98:61988).
May 27 17:59:04.398378 systemd[1]: Started sshd@36-10.230.41.6:22-122.154.116.98:59616.service - OpenSSH per-connection server daemon (122.154.116.98:59616).
May 27 17:59:05.010656 sshd[5480]: Invalid user dspace from 122.154.116.98 port 61988
May 27 17:59:05.217849 sshd[5480]: Connection closed by invalid user dspace 122.154.116.98 port 61988 [preauth]
May 27 17:59:05.220374 systemd[1]: sshd@35-10.230.41.6:22-122.154.116.98:61988.service: Deactivated successfully.
May 27 17:59:05.270451 sshd[5483]: Invalid user dspace from 122.154.116.98 port 59616
May 27 17:59:05.479746 sshd[5483]: Connection closed by invalid user dspace 122.154.116.98 port 59616 [preauth]
May 27 17:59:05.482640 systemd[1]: sshd@36-10.230.41.6:22-122.154.116.98:59616.service: Deactivated successfully.
May 27 17:59:06.014846 kubelet[2875]: E0527 17:59:06.014119 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686"
May 27 17:59:06.831700 systemd[1]: Started sshd@37-10.230.41.6:22-139.178.68.195:39296.service - OpenSSH per-connection server daemon (139.178.68.195:39296).
May 27 17:59:06.902187 systemd[1]: Started sshd@38-10.230.41.6:22-122.154.116.98:55731.service - OpenSSH per-connection server daemon (122.154.116.98:55731).
May 27 17:59:07.085623 systemd[1]: Started sshd@39-10.230.41.6:22-122.154.116.98:63211.service - OpenSSH per-connection server daemon (122.154.116.98:63211).
May 27 17:59:07.739084 sshd[5494]: Accepted publickey for core from 139.178.68.195 port 39296 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:59:07.741802 sshd-session[5494]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:59:07.749238 systemd-logind[1561]: New session 17 of user core.
May 27 17:59:07.757920 systemd[1]: Started session-17.scope - Session 17 of User core.
May 27 17:59:07.833917 sshd[5497]: Invalid user ubuntu from 122.154.116.98 port 55731
May 27 17:59:07.899332 containerd[1585]: time="2025-05-27T17:59:07.899120649Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\" id:\"38b778711cdb84791cff6ee9a6a25a892f3549842a6382acc9c0700b318920ef\" pid:5518 exited_at:{seconds:1748368747 nanos:897619600}"
May 27 17:59:07.907471 sshd[5500]: Invalid user ubuntu from 122.154.116.98 port 63211
May 27 17:59:08.058181 sshd[5497]: Connection closed by invalid user ubuntu 122.154.116.98 port 55731 [preauth]
May 27 17:59:08.062588 systemd[1]: sshd@38-10.230.41.6:22-122.154.116.98:55731.service: Deactivated successfully.
May 27 17:59:08.103868 sshd[5500]: Connection closed by invalid user ubuntu 122.154.116.98 port 63211 [preauth]
May 27 17:59:08.107172 systemd[1]: sshd@39-10.230.41.6:22-122.154.116.98:63211.service: Deactivated successfully.
May 27 17:59:08.462684 sshd[5502]: Connection closed by 139.178.68.195 port 39296
May 27 17:59:08.463790 sshd-session[5494]: pam_unix(sshd:session): session closed for user core
May 27 17:59:08.470106 systemd[1]: sshd@37-10.230.41.6:22-139.178.68.195:39296.service: Deactivated successfully.
May 27 17:59:08.474903 systemd[1]: session-17.scope: Deactivated successfully.
May 27 17:59:08.476662 systemd-logind[1561]: Session 17 logged out. Waiting for processes to exit.
May 27 17:59:08.479171 systemd-logind[1561]: Removed session 17.
May 27 17:59:09.627198 systemd[1]: Started sshd@40-10.230.41.6:22-122.154.116.98:65125.service - OpenSSH per-connection server daemon (122.154.116.98:65125).
May 27 17:59:09.704043 systemd[1]: Started sshd@41-10.230.41.6:22-122.154.116.98:62284.service - OpenSSH per-connection server daemon (122.154.116.98:62284).
May 27 17:59:09.994477 kubelet[2875]: E0527 17:59:09.993853 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4"
May 27 17:59:10.442744 sshd[5545]: Invalid user dspace from 122.154.116.98 port 65125
May 27 17:59:10.573199 sshd[5548]: Invalid user dspace from 122.154.116.98 port 62284
May 27 17:59:10.636794 sshd[5545]: Connection closed by invalid user dspace 122.154.116.98 port 65125 [preauth]
May 27 17:59:10.639015 systemd[1]: sshd@40-10.230.41.6:22-122.154.116.98:65125.service: Deactivated successfully.
May 27 17:59:10.781216 sshd[5548]: Connection closed by invalid user dspace 122.154.116.98 port 62284 [preauth]
May 27 17:59:10.784031 systemd[1]: sshd@41-10.230.41.6:22-122.154.116.98:62284.service: Deactivated successfully.
May 27 17:59:11.536355 containerd[1585]: time="2025-05-27T17:59:11.536222664Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\" id:\"e9839a0b254edf21f8f87dc3950c9b7260dceea4fa30c139be3ac1e189a905d0\" pid:5566 exited_at:{seconds:1748368751 nanos:535979622}"
May 27 17:59:12.157572 systemd[1]: Started sshd@42-10.230.41.6:22-122.154.116.98:63428.service - OpenSSH per-connection server daemon (122.154.116.98:63428).
May 27 17:59:12.327033 systemd[1]: Started sshd@43-10.230.41.6:22-122.154.116.98:51725.service - OpenSSH per-connection server daemon (122.154.116.98:51725).
May 27 17:59:13.035288 sshd[5578]: Invalid user steam from 122.154.116.98 port 63428 May 27 17:59:13.207137 sshd[5581]: Invalid user steam from 122.154.116.98 port 51725 May 27 17:59:13.242944 sshd[5578]: Connection closed by invalid user steam 122.154.116.98 port 63428 [preauth] May 27 17:59:13.247306 systemd[1]: sshd@42-10.230.41.6:22-122.154.116.98:63428.service: Deactivated successfully. May 27 17:59:13.415051 sshd[5581]: Connection closed by invalid user steam 122.154.116.98 port 51725 [preauth] May 27 17:59:13.417357 systemd[1]: sshd@43-10.230.41.6:22-122.154.116.98:51725.service: Deactivated successfully. May 27 17:59:13.618634 systemd[1]: Started sshd@44-10.230.41.6:22-139.178.68.195:39304.service - OpenSSH per-connection server daemon (139.178.68.195:39304). May 27 17:59:14.577397 sshd[5588]: Accepted publickey for core from 139.178.68.195 port 39304 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:59:14.579551 sshd-session[5588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:59:14.587990 systemd-logind[1561]: New session 18 of user core. May 27 17:59:14.593895 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 17:59:14.880571 systemd[1]: Started sshd@45-10.230.41.6:22-122.154.116.98:59559.service - OpenSSH per-connection server daemon (122.154.116.98:59559). May 27 17:59:15.055381 systemd[1]: Started sshd@46-10.230.41.6:22-122.154.116.98:54363.service - OpenSSH per-connection server daemon (122.154.116.98:54363). May 27 17:59:15.330064 sshd[5593]: Connection closed by 139.178.68.195 port 39304 May 27 17:59:15.329834 sshd-session[5588]: pam_unix(sshd:session): session closed for user core May 27 17:59:15.336173 systemd[1]: sshd@44-10.230.41.6:22-139.178.68.195:39304.service: Deactivated successfully. May 27 17:59:15.339602 systemd[1]: session-18.scope: Deactivated successfully. May 27 17:59:15.342826 systemd-logind[1561]: Session 18 logged out. Waiting for processes to exit. 
May 27 17:59:15.346464 systemd-logind[1561]: Removed session 18. May 27 17:59:15.713038 sshd[5595]: Invalid user esuser from 122.154.116.98 port 59559 May 27 17:59:15.906735 sshd[5595]: Connection closed by invalid user esuser 122.154.116.98 port 59559 [preauth] May 27 17:59:15.910144 systemd[1]: sshd@45-10.230.41.6:22-122.154.116.98:59559.service: Deactivated successfully. May 27 17:59:15.935904 sshd[5601]: Invalid user esuser from 122.154.116.98 port 54363 May 27 17:59:16.144849 sshd[5601]: Connection closed by invalid user esuser 122.154.116.98 port 54363 [preauth] May 27 17:59:16.148692 systemd[1]: sshd@46-10.230.41.6:22-122.154.116.98:54363.service: Deactivated successfully. May 27 17:59:17.688930 systemd[1]: Started sshd@47-10.230.41.6:22-122.154.116.98:64285.service - OpenSSH per-connection server daemon (122.154.116.98:64285). May 27 17:59:17.864019 systemd[1]: Started sshd@48-10.230.41.6:22-122.154.116.98:54526.service - OpenSSH per-connection server daemon (122.154.116.98:54526). May 27 17:59:18.609311 sshd[5621]: Invalid user ansible from 122.154.116.98 port 64285 May 27 17:59:18.756293 sshd[5624]: Invalid user ansible from 122.154.116.98 port 54526 May 27 17:59:18.817839 sshd[5621]: Connection closed by invalid user ansible 122.154.116.98 port 64285 [preauth] May 27 17:59:18.821038 systemd[1]: sshd@47-10.230.41.6:22-122.154.116.98:64285.service: Deactivated successfully. May 27 17:59:18.963872 sshd[5624]: Connection closed by invalid user ansible 122.154.116.98 port 54526 [preauth] May 27 17:59:18.965937 systemd[1]: sshd@48-10.230.41.6:22-122.154.116.98:54526.service: Deactivated successfully. 
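[Editor's note] The sshd entries above show a steady stream of preauth probes from 122.154.116.98 cycling through usernames (ubuntu, dspace, steam, esuser, ansible). A minimal sketch for tallying such attempts from captured log lines — the regex mirrors OpenSSH's "Invalid user" preauth format; the function and field names are illustrative, not part of this system:

```python
import re
from collections import Counter

# Matches OpenSSH preauth lines like:
#   sshd[5497]: Invalid user ubuntu from 122.154.116.98 port 55731
INVALID_USER = re.compile(
    r"Invalid user (?P<user>\S+) from (?P<ip>[\d.]+) port (?P<port>\d+)"
)

def tally_invalid_users(lines):
    """Count (source IP, attempted username) pairs across log lines."""
    counts = Counter()
    for line in lines:
        m = INVALID_USER.search(line)
        if m:
            counts[(m.group("ip"), m.group("user"))] += 1
    return counts
```

Feeding this log through the function would show each username tried twice (the scanner opens two parallel connections per name), which is the pattern visible in the paired sshd@NN service entries.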
May 27 17:59:18.993409 containerd[1585]: time="2025-05-27T17:59:18.993362681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:59:19.298528 containerd[1585]: time="2025-05-27T17:59:19.297932534Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:59:19.299459 containerd[1585]: time="2025-05-27T17:59:19.299313714Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:59:19.299459 containerd[1585]: time="2025-05-27T17:59:19.299419138Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:59:19.300929 kubelet[2875]: E0527 17:59:19.300845 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:59:19.302557 kubelet[2875]: E0527 17:59:19.301164 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:59:19.302557 kubelet[2875]: E0527 17:59:19.301331 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:2fa70a3d9d1f41939d7bb3cd196d2e7b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull 
and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:59:19.306420 containerd[1585]: time="2025-05-27T17:59:19.306380889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:59:19.528082 containerd[1585]: time="2025-05-27T17:59:19.527839283Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:59:19.529864 containerd[1585]: time="2025-05-27T17:59:19.529802513Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:59:19.530585 containerd[1585]: time="2025-05-27T17:59:19.530017799Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:59:19.530660 kubelet[2875]: E0527 17:59:19.530188 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:59:19.530660 kubelet[2875]: E0527 17:59:19.530262 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:59:19.530660 kubelet[2875]: E0527 17:59:19.530424 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9xgcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5775587fcb-8p4tb_calico-system(75023f81-9f7f-4b1f-aa44-b3e234778686): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:59:19.532284 kubelet[2875]: E0527 17:59:19.532031 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686" May 27 17:59:20.489172 systemd[1]: Started sshd@49-10.230.41.6:22-139.178.68.195:60278.service - OpenSSH per-connection server daemon (139.178.68.195:60278). May 27 17:59:20.555989 systemd[1]: Started sshd@50-10.230.41.6:22-122.154.116.98:56203.service - OpenSSH per-connection server daemon (122.154.116.98:56203). May 27 17:59:20.719932 systemd[1]: Started sshd@51-10.230.41.6:22-122.154.116.98:61376.service - OpenSSH per-connection server daemon (122.154.116.98:61376). May 27 17:59:20.990887 kubelet[2875]: E0527 17:59:20.990817 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4" May 27 17:59:21.449446 sshd[5636]: Invalid user hduser from 122.154.116.98 port 56203 May 27 17:59:21.468964 sshd[5633]: Accepted publickey for core from 139.178.68.195 port 60278 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:59:21.472448 sshd-session[5633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:59:21.480285 systemd-logind[1561]: New session 19 of user core. May 27 17:59:21.487900 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 17:59:21.593259 sshd[5640]: Invalid user hduser from 122.154.116.98 port 61376 May 27 17:59:21.657170 sshd[5636]: Connection closed by invalid user hduser 122.154.116.98 port 56203 [preauth] May 27 17:59:21.660060 systemd[1]: sshd@50-10.230.41.6:22-122.154.116.98:56203.service: Deactivated successfully. 
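[Editor's note] The pull failures above all share one root cause: the anonymous-token fetch against ghcr.io returns 403 Forbidden, so every layer (containerd fetch, CRI PullImage, kubelet ErrImagePull, pod_workers back-off) reports the same error. The affected repository is percent-encoded inside the token URL's `scope` parameter. A small sketch, using only the standard library, for recovering it during triage — the helper name is an assumption, not a real tool:

```python
from urllib.parse import urlsplit, parse_qs

def scope_repository(token_url):
    """Extract the repository name from a registry token URL's scope parameter.

    parse_qs percent-decodes the query, so scope comes back as e.g.
    "repository:flatcar/calico/whisker:pull".
    """
    query = parse_qs(urlsplit(token_url).query)
    kind, name, action = query["scope"][0].split(":")
    return name

# Token URL copied from the containerd error messages above.
url = ("https://ghcr.io/token"
       "?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io")
```

Here `scope_repository(url)` yields `flatcar/calico/whisker`, confirming which image the registry refused before any manifest was resolved.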
May 27 17:59:21.851019 sshd[5640]: Connection closed by invalid user hduser 122.154.116.98 port 61376 [preauth] May 27 17:59:21.853761 systemd[1]: sshd@51-10.230.41.6:22-122.154.116.98:61376.service: Deactivated successfully. May 27 17:59:22.258530 sshd[5642]: Connection closed by 139.178.68.195 port 60278 May 27 17:59:22.259263 sshd-session[5633]: pam_unix(sshd:session): session closed for user core May 27 17:59:22.267353 systemd[1]: sshd@49-10.230.41.6:22-139.178.68.195:60278.service: Deactivated successfully. May 27 17:59:22.270336 systemd[1]: session-19.scope: Deactivated successfully. May 27 17:59:22.271828 systemd-logind[1561]: Session 19 logged out. Waiting for processes to exit. May 27 17:59:22.274346 systemd-logind[1561]: Removed session 19. May 27 17:59:22.419860 systemd[1]: Started sshd@52-10.230.41.6:22-139.178.68.195:60292.service - OpenSSH per-connection server daemon (139.178.68.195:60292). May 27 17:59:23.282897 systemd[1]: Started sshd@53-10.230.41.6:22-122.154.116.98:51806.service - OpenSSH per-connection server daemon (122.154.116.98:51806). May 27 17:59:23.329045 sshd[5658]: Accepted publickey for core from 139.178.68.195 port 60292 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:59:23.331026 sshd-session[5658]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:59:23.338014 systemd-logind[1561]: New session 20 of user core. May 27 17:59:23.346875 systemd[1]: Started session-20.scope - Session 20 of User core. May 27 17:59:23.459239 systemd[1]: Started sshd@54-10.230.41.6:22-122.154.116.98:57307.service - OpenSSH per-connection server daemon (122.154.116.98:57307). May 27 17:59:24.355840 sshd[5661]: Connection closed by authenticating user root 122.154.116.98 port 51806 [preauth] May 27 17:59:24.358565 systemd[1]: sshd@53-10.230.41.6:22-122.154.116.98:51806.service: Deactivated successfully. 
May 27 17:59:24.378880 sshd[5663]: Connection closed by 139.178.68.195 port 60292 May 27 17:59:24.379566 sshd-session[5658]: pam_unix(sshd:session): session closed for user core May 27 17:59:24.388268 systemd[1]: sshd@52-10.230.41.6:22-139.178.68.195:60292.service: Deactivated successfully. May 27 17:59:24.392578 systemd[1]: session-20.scope: Deactivated successfully. May 27 17:59:24.394879 systemd-logind[1561]: Session 20 logged out. Waiting for processes to exit. May 27 17:59:24.398420 systemd-logind[1561]: Removed session 20. May 27 17:59:24.534482 systemd[1]: Started sshd@55-10.230.41.6:22-139.178.68.195:34118.service - OpenSSH per-connection server daemon (139.178.68.195:34118). May 27 17:59:24.550258 sshd[5665]: Connection closed by authenticating user root 122.154.116.98 port 57307 [preauth] May 27 17:59:24.553174 systemd[1]: sshd@54-10.230.41.6:22-122.154.116.98:57307.service: Deactivated successfully. May 27 17:59:25.449204 sshd[5678]: Accepted publickey for core from 139.178.68.195 port 34118 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:59:25.450871 sshd-session[5678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:59:25.459168 systemd-logind[1561]: New session 21 of user core. May 27 17:59:25.467027 systemd[1]: Started session-21.scope - Session 21 of User core. May 27 17:59:26.033937 systemd[1]: Started sshd@56-10.230.41.6:22-122.154.116.98:49778.service - OpenSSH per-connection server daemon (122.154.116.98:49778). May 27 17:59:26.222485 systemd[1]: Started sshd@57-10.230.41.6:22-122.154.116.98:53942.service - OpenSSH per-connection server daemon (122.154.116.98:53942). 
May 27 17:59:26.323218 containerd[1585]: time="2025-05-27T17:59:26.323009430Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\" id:\"5760e1b3c2269de32cfe4416d73173c7fe8a6d1324f85ec46845af9d183eaeba\" pid:5710 exited_at:{seconds:1748368766 nanos:322394957}" May 27 17:59:26.915937 sshd[5696]: Invalid user csserver from 122.154.116.98 port 49778 May 27 17:59:27.125749 sshd[5696]: Connection closed by invalid user csserver 122.154.116.98 port 49778 [preauth] May 27 17:59:27.130274 systemd[1]: sshd@56-10.230.41.6:22-122.154.116.98:49778.service: Deactivated successfully. May 27 17:59:27.137779 sshd[5721]: Invalid user csserver from 122.154.116.98 port 53942 May 27 17:59:27.355228 sshd[5721]: Connection closed by invalid user csserver 122.154.116.98 port 53942 [preauth] May 27 17:59:27.359350 systemd[1]: sshd@57-10.230.41.6:22-122.154.116.98:53942.service: Deactivated successfully. May 27 17:59:28.841986 systemd[1]: Started sshd@58-10.230.41.6:22-122.154.116.98:62527.service - OpenSSH per-connection server daemon (122.154.116.98:62527). May 27 17:59:28.946606 sshd[5683]: Connection closed by 139.178.68.195 port 34118 May 27 17:59:28.944303 sshd-session[5678]: pam_unix(sshd:session): session closed for user core May 27 17:59:28.965083 systemd[1]: sshd@55-10.230.41.6:22-139.178.68.195:34118.service: Deactivated successfully. May 27 17:59:28.970195 systemd[1]: session-21.scope: Deactivated successfully. May 27 17:59:28.971548 systemd[1]: session-21.scope: Consumed 794ms CPU time, 76.4M memory peak. May 27 17:59:28.974479 systemd-logind[1561]: Session 21 logged out. Waiting for processes to exit. May 27 17:59:28.977996 systemd-logind[1561]: Removed session 21. May 27 17:59:29.055459 systemd[1]: Started sshd@59-10.230.41.6:22-122.154.116.98:51222.service - OpenSSH per-connection server daemon (122.154.116.98:51222). 
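[Editor's note] The containerd TaskExit events above carry a protobuf-style `exited_at:{seconds:... nanos:...}` timestamp rather than a wall-clock string. A minimal sketch converting it back to UTC for correlation with the journal's own prefix timestamps (the function name is illustrative):

```python
from datetime import datetime, timezone

def exited_at_to_utc(seconds, nanos):
    """Convert a protobuf Timestamp (seconds + nanoseconds) to an aware UTC datetime.

    Python datetimes only carry microseconds, so the nanos field is truncated.
    """
    ts = datetime.fromtimestamp(seconds, tz=timezone.utc)
    return ts.replace(microsecond=nanos // 1000)

# exited_at values copied from the TaskExit event above.
t = exited_at_to_utc(1748368766, 322394957)
```

The result, 2025-05-27 17:59:26.322394 UTC, lines up with the journal prefix on the same entry (17:59:26.323218), the small gap being event-delivery latency.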
May 27 17:59:29.100180 systemd[1]: Started sshd@60-10.230.41.6:22-139.178.68.195:34124.service - OpenSSH per-connection server daemon (139.178.68.195:34124). May 27 17:59:30.021494 sshd[5748]: Accepted publickey for core from 139.178.68.195 port 34124 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:59:30.024760 sshd-session[5748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:59:30.035275 systemd-logind[1561]: New session 22 of user core. May 27 17:59:30.040874 systemd[1]: Started session-22.scope - Session 22 of User core. May 27 17:59:30.059100 sshd[5737]: Connection closed by authenticating user root 122.154.116.98 port 62527 [preauth] May 27 17:59:30.062703 systemd[1]: sshd@58-10.230.41.6:22-122.154.116.98:62527.service: Deactivated successfully. May 27 17:59:30.134736 sshd[5745]: Connection closed by authenticating user root 122.154.116.98 port 51222 [preauth] May 27 17:59:30.138196 systemd[1]: sshd@59-10.230.41.6:22-122.154.116.98:51222.service: Deactivated successfully. May 27 17:59:31.225031 kubelet[2875]: E0527 17:59:31.224963 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686" May 27 17:59:31.731408 sshd[5752]: Connection closed by 139.178.68.195 port 34124 May 27 17:59:31.732864 sshd-session[5748]: pam_unix(sshd:session): session closed for user core May 27 17:59:31.738960 systemd[1]: sshd@60-10.230.41.6:22-139.178.68.195:34124.service: Deactivated successfully. May 27 17:59:31.742130 systemd[1]: session-22.scope: Deactivated successfully. 
May 27 17:59:31.744092 systemd-logind[1561]: Session 22 logged out. Waiting for processes to exit. May 27 17:59:31.746636 systemd-logind[1561]: Removed session 22. May 27 17:59:31.800769 systemd[1]: Started sshd@61-10.230.41.6:22-122.154.116.98:64475.service - OpenSSH per-connection server daemon (122.154.116.98:64475). May 27 17:59:31.852967 systemd[1]: Started sshd@62-10.230.41.6:22-122.154.116.98:57343.service - OpenSSH per-connection server daemon (122.154.116.98:57343). May 27 17:59:31.892786 systemd[1]: Started sshd@63-10.230.41.6:22-139.178.68.195:34130.service - OpenSSH per-connection server daemon (139.178.68.195:34130). May 27 17:59:32.793688 sshd[5771]: Accepted publickey for core from 139.178.68.195 port 34130 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI May 27 17:59:32.795510 sshd-session[5771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:59:32.802913 systemd-logind[1561]: New session 23 of user core. May 27 17:59:32.810225 systemd[1]: Started session-23.scope - Session 23 of User core. May 27 17:59:32.880961 sshd[5768]: Connection closed by authenticating user root 122.154.116.98 port 57343 [preauth] May 27 17:59:32.883712 systemd[1]: sshd@62-10.230.41.6:22-122.154.116.98:57343.service: Deactivated successfully. May 27 17:59:32.926542 sshd[5766]: Connection closed by authenticating user root 122.154.116.98 port 64475 [preauth] May 27 17:59:32.929826 systemd[1]: sshd@61-10.230.41.6:22-122.154.116.98:64475.service: Deactivated successfully. May 27 17:59:33.533490 sshd[5775]: Connection closed by 139.178.68.195 port 34130 May 27 17:59:33.534170 sshd-session[5771]: pam_unix(sshd:session): session closed for user core May 27 17:59:33.539005 systemd[1]: sshd@63-10.230.41.6:22-139.178.68.195:34130.service: Deactivated successfully. May 27 17:59:33.541734 systemd[1]: session-23.scope: Deactivated successfully. May 27 17:59:33.544610 systemd-logind[1561]: Session 23 logged out. 
Waiting for processes to exit. May 27 17:59:33.546576 systemd-logind[1561]: Removed session 23. May 27 17:59:34.602231 systemd[1]: Started sshd@64-10.230.41.6:22-122.154.116.98:50652.service - OpenSSH per-connection server daemon (122.154.116.98:50652). May 27 17:59:34.633027 systemd[1]: Started sshd@65-10.230.41.6:22-122.154.116.98:64666.service - OpenSSH per-connection server daemon (122.154.116.98:64666). May 27 17:59:35.008522 containerd[1585]: time="2025-05-27T17:59:35.005630889Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:59:35.277908 containerd[1585]: time="2025-05-27T17:59:35.277732947Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:59:35.278994 containerd[1585]: time="2025-05-27T17:59:35.278956293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:59:35.283530 containerd[1585]: time="2025-05-27T17:59:35.283463556Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:59:35.285242 kubelet[2875]: E0527 17:59:35.285175 2875 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected 
status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:59:35.285912 kubelet[2875]: E0527 17:59:35.285268 2875 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:59:35.290667 kubelet[2875]: E0527 17:59:35.290584 2875 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:
/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-96shr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-8f77d7b6c-lv5hg_calico-system(3a7c34d5-6339-473f-9bc2-4a5f5fd961b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:59:35.292096 kubelet[2875]: E0527 17:59:35.292043 2875 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4" May 27 17:59:35.509592 sshd[5803]: Invalid user esuser from 122.154.116.98 port 50652 May 27 17:59:35.519655 sshd[5805]: Invalid user esuser from 122.154.116.98 port 64666 May 27 17:59:35.722010 sshd[5803]: Connection closed by invalid user esuser 122.154.116.98 port 50652 [preauth] May 27 17:59:35.726349 systemd[1]: sshd@64-10.230.41.6:22-122.154.116.98:50652.service: Deactivated successfully. May 27 17:59:35.732415 sshd[5805]: Connection closed by invalid user esuser 122.154.116.98 port 64666 [preauth] May 27 17:59:35.734651 systemd[1]: sshd@65-10.230.41.6:22-122.154.116.98:64666.service: Deactivated successfully. May 27 17:59:37.675307 systemd[1]: Started sshd@66-10.230.41.6:22-122.154.116.98:54734.service - OpenSSH per-connection server daemon (122.154.116.98:54734). May 27 17:59:37.679778 systemd[1]: Started sshd@67-10.230.41.6:22-122.154.116.98:49160.service - OpenSSH per-connection server daemon (122.154.116.98:49160). May 27 17:59:38.697238 systemd[1]: Started sshd@68-10.230.41.6:22-139.178.68.195:41884.service - OpenSSH per-connection server daemon (139.178.68.195:41884). May 27 17:59:38.900571 sshd[5818]: Connection closed by authenticating user root 122.154.116.98 port 49160 [preauth] May 27 17:59:38.903038 sshd[5817]: Connection closed by authenticating user root 122.154.116.98 port 54734 [preauth] May 27 17:59:38.906170 systemd[1]: sshd@66-10.230.41.6:22-122.154.116.98:54734.service: Deactivated successfully. 
May 27 17:59:38.910456 systemd[1]: sshd@67-10.230.41.6:22-122.154.116.98:49160.service: Deactivated successfully.
May 27 17:59:39.655442 sshd[5822]: Accepted publickey for core from 139.178.68.195 port 41884 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:59:39.656967 sshd-session[5822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:59:39.667870 systemd-logind[1561]: New session 24 of user core.
May 27 17:59:39.673921 systemd[1]: Started session-24.scope - Session 24 of User core.
May 27 17:59:40.533556 sshd[5828]: Connection closed by 139.178.68.195 port 41884
May 27 17:59:40.534956 sshd-session[5822]: pam_unix(sshd:session): session closed for user core
May 27 17:59:40.541755 systemd[1]: sshd@68-10.230.41.6:22-139.178.68.195:41884.service: Deactivated successfully.
May 27 17:59:40.545265 systemd[1]: session-24.scope: Deactivated successfully.
May 27 17:59:40.548438 systemd-logind[1561]: Session 24 logged out. Waiting for processes to exit.
May 27 17:59:40.550861 systemd-logind[1561]: Removed session 24.
May 27 17:59:40.626620 systemd[1]: Started sshd@69-10.230.41.6:22-122.154.116.98:53207.service - OpenSSH per-connection server daemon (122.154.116.98:53207).
May 27 17:59:40.633150 systemd[1]: Started sshd@70-10.230.41.6:22-122.154.116.98:58101.service - OpenSSH per-connection server daemon (122.154.116.98:58101).
May 27 17:59:41.564786 sshd[5841]: Invalid user test from 122.154.116.98 port 53207
May 27 17:59:41.565236 sshd[5842]: Invalid user test from 122.154.116.98 port 58101
May 27 17:59:41.776731 sshd[5841]: Connection closed by invalid user test 122.154.116.98 port 53207 [preauth]
May 27 17:59:41.779145 sshd[5842]: Connection closed by invalid user test 122.154.116.98 port 58101 [preauth]
May 27 17:59:41.779506 systemd[1]: sshd@69-10.230.41.6:22-122.154.116.98:53207.service: Deactivated successfully.
May 27 17:59:41.783160 systemd[1]: sshd@70-10.230.41.6:22-122.154.116.98:58101.service: Deactivated successfully.
May 27 17:59:41.977729 containerd[1585]: time="2025-05-27T17:59:41.977412344Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9c2130023330c7c1ae8446ef7e237ccdab0c8409b84ef9bd34fd78b16805a086\" id:\"50d10aab9a6d7d6486f53b93b01a1cf96e8f8956a2c2c93176dc2c2108e3c1e8\" pid:5862 exited_at:{seconds:1748368781 nanos:975953392}"
May 27 17:59:43.542936 systemd[1]: Started sshd@71-10.230.41.6:22-122.154.116.98:57831.service - OpenSSH per-connection server daemon (122.154.116.98:57831).
May 27 17:59:43.566960 systemd[1]: Started sshd@72-10.230.41.6:22-122.154.116.98:58143.service - OpenSSH per-connection server daemon (122.154.116.98:58143).
May 27 17:59:44.532169 sshd[5871]: Invalid user admin from 122.154.116.98 port 57831
May 27 17:59:44.577871 sshd[5873]: Invalid user admin from 122.154.116.98 port 58143
May 27 17:59:44.740875 sshd[5871]: Connection closed by invalid user admin 122.154.116.98 port 57831 [preauth]
May 27 17:59:44.743577 systemd[1]: sshd@71-10.230.41.6:22-122.154.116.98:57831.service: Deactivated successfully.
May 27 17:59:44.802970 sshd[5873]: Connection closed by invalid user admin 122.154.116.98 port 58143 [preauth]
May 27 17:59:44.805711 systemd[1]: sshd@72-10.230.41.6:22-122.154.116.98:58143.service: Deactivated successfully.
May 27 17:59:45.691969 systemd[1]: Started sshd@73-10.230.41.6:22-139.178.68.195:36350.service - OpenSSH per-connection server daemon (139.178.68.195:36350).
May 27 17:59:46.469074 systemd[1]: Started sshd@74-10.230.41.6:22-122.154.116.98:65413.service - OpenSSH per-connection server daemon (122.154.116.98:65413).
May 27 17:59:46.563293 systemd[1]: Started sshd@75-10.230.41.6:22-122.154.116.98:58435.service - OpenSSH per-connection server daemon (122.154.116.98:58435).
May 27 17:59:46.622393 sshd[5883]: Accepted publickey for core from 139.178.68.195 port 36350 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:59:46.626122 sshd-session[5883]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:59:46.636964 systemd-logind[1561]: New session 25 of user core.
May 27 17:59:46.641927 systemd[1]: Started session-25.scope - Session 25 of User core.
May 27 17:59:47.002537 kubelet[2875]: E0527 17:59:47.002288 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\"\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\"\"]" pod="calico-system/whisker-5775587fcb-8p4tb" podUID="75023f81-9f7f-4b1f-aa44-b3e234778686"
May 27 17:59:47.356739 sshd[5886]: Invalid user ansible from 122.154.116.98 port 65413
May 27 17:59:47.501601 sshd[5889]: Invalid user ansible from 122.154.116.98 port 58435
May 27 17:59:47.564542 sshd[5886]: Connection closed by invalid user ansible 122.154.116.98 port 65413 [preauth]
May 27 17:59:47.568207 systemd[1]: sshd@74-10.230.41.6:22-122.154.116.98:65413.service: Deactivated successfully.
May 27 17:59:47.570534 sshd[5891]: Connection closed by 139.178.68.195 port 36350
May 27 17:59:47.573143 sshd-session[5883]: pam_unix(sshd:session): session closed for user core
May 27 17:59:47.579235 systemd[1]: sshd@73-10.230.41.6:22-139.178.68.195:36350.service: Deactivated successfully.
May 27 17:59:47.581610 systemd[1]: session-25.scope: Deactivated successfully.
May 27 17:59:47.584100 systemd-logind[1561]: Session 25 logged out. Waiting for processes to exit.
May 27 17:59:47.586372 systemd-logind[1561]: Removed session 25.
May 27 17:59:47.724584 sshd[5889]: Connection closed by invalid user ansible 122.154.116.98 port 58435 [preauth]
May 27 17:59:47.727905 systemd[1]: sshd@75-10.230.41.6:22-122.154.116.98:58435.service: Deactivated successfully.
May 27 17:59:49.318318 systemd[1]: Started sshd@76-10.230.41.6:22-122.154.116.98:62842.service - OpenSSH per-connection server daemon (122.154.116.98:62842).
May 27 17:59:49.477026 systemd[1]: Started sshd@77-10.230.41.6:22-122.154.116.98:57300.service - OpenSSH per-connection server daemon (122.154.116.98:57300).
May 27 17:59:49.992321 kubelet[2875]: E0527 17:59:49.992262 2875 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\"\"" pod="calico-system/goldmane-8f77d7b6c-lv5hg" podUID="3a7c34d5-6339-473f-9bc2-4a5f5fd961b4"
May 27 17:59:50.190234 sshd[5907]: Invalid user minecraft from 122.154.116.98 port 62842
May 27 17:59:50.307437 sshd[5910]: Invalid user minecraft from 122.154.116.98 port 57300
May 27 17:59:50.386733 sshd[5907]: Connection closed by invalid user minecraft 122.154.116.98 port 62842 [preauth]
May 27 17:59:50.390118 systemd[1]: sshd@76-10.230.41.6:22-122.154.116.98:62842.service: Deactivated successfully.
May 27 17:59:50.500034 sshd[5910]: Connection closed by invalid user minecraft 122.154.116.98 port 57300 [preauth]
May 27 17:59:50.503165 systemd[1]: sshd@77-10.230.41.6:22-122.154.116.98:57300.service: Deactivated successfully.
May 27 17:59:52.115647 systemd[1]: Started sshd@78-10.230.41.6:22-122.154.116.98:63057.service - OpenSSH per-connection server daemon (122.154.116.98:63057).
May 27 17:59:52.200769 systemd[1]: Started sshd@79-10.230.41.6:22-122.154.116.98:60569.service - OpenSSH per-connection server daemon (122.154.116.98:60569).
May 27 17:59:52.730405 systemd[1]: Started sshd@80-10.230.41.6:22-139.178.68.195:36356.service - OpenSSH per-connection server daemon (139.178.68.195:36356).
May 27 17:59:53.034987 sshd[5919]: Invalid user mc from 122.154.116.98 port 60569
May 27 17:59:53.068909 sshd[5917]: Invalid user mc from 122.154.116.98 port 63057
May 27 17:59:53.230970 sshd[5919]: Connection closed by invalid user mc 122.154.116.98 port 60569 [preauth]
May 27 17:59:53.233141 systemd[1]: sshd@79-10.230.41.6:22-122.154.116.98:60569.service: Deactivated successfully.
May 27 17:59:53.278803 sshd[5917]: Connection closed by invalid user mc 122.154.116.98 port 63057 [preauth]
May 27 17:59:53.281604 systemd[1]: sshd@78-10.230.41.6:22-122.154.116.98:63057.service: Deactivated successfully.
May 27 17:59:53.650178 sshd[5923]: Accepted publickey for core from 139.178.68.195 port 36356 ssh2: RSA SHA256:TP4s4bOAUCtqDEOOWsp9GTiG5zPCRK6jIwLqN8nWHAI
May 27 17:59:53.653294 sshd-session[5923]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 17:59:53.661749 systemd-logind[1561]: New session 26 of user core.
May 27 17:59:53.665133 systemd[1]: Started session-26.scope - Session 26 of User core.
May 27 17:59:54.534412 sshd[5929]: Connection closed by 139.178.68.195 port 36356
May 27 17:59:54.536587 sshd-session[5923]: pam_unix(sshd:session): session closed for user core
May 27 17:59:54.542410 systemd[1]: sshd@80-10.230.41.6:22-139.178.68.195:36356.service: Deactivated successfully.
May 27 17:59:54.545758 systemd[1]: session-26.scope: Deactivated successfully.
May 27 17:59:54.547509 systemd-logind[1561]: Session 26 logged out. Waiting for processes to exit.
May 27 17:59:54.549739 systemd-logind[1561]: Removed session 26.
May 27 17:59:55.039739 systemd[1]: Started sshd@81-10.230.41.6:22-122.154.116.98:49948.service - OpenSSH per-connection server daemon (122.154.116.98:49948).
May 27 17:59:55.065603 systemd[1]: Started sshd@82-10.230.41.6:22-122.154.116.98:61747.service - OpenSSH per-connection server daemon (122.154.116.98:61747).
May 27 17:59:56.127395 sshd[5942]: Connection closed by authenticating user root 122.154.116.98 port 49948 [preauth]
May 27 17:59:56.130978 systemd[1]: sshd@81-10.230.41.6:22-122.154.116.98:49948.service: Deactivated successfully.
May 27 17:59:56.147331 sshd[5944]: Connection closed by authenticating user root 122.154.116.98 port 61747 [preauth]
May 27 17:59:56.149962 systemd[1]: sshd@82-10.230.41.6:22-122.154.116.98:61747.service: Deactivated successfully.
May 27 17:59:56.472846 containerd[1585]: time="2025-05-27T17:59:56.472772247Z" level=info msg="TaskExit event in podsandbox handler container_id:\"53c03d9554d0025de9e8cbdb2e9da19459fba3ea70590696c861f73440d9de07\" id:\"ab9ec49cad798f66db212591ca7eb5f188abb6b42e20ebe5c86be6d0c8a8e869\" pid:5963 exited_at:{seconds:1748368796 nanos:472194791}"
May 27 17:59:57.798224 systemd[1]: Started sshd@83-10.230.41.6:22-122.154.116.98:49912.service - OpenSSH per-connection server daemon (122.154.116.98:49912).
May 27 17:59:57.851976 systemd[1]: Started sshd@84-10.230.41.6:22-122.154.116.98:60194.service - OpenSSH per-connection server daemon (122.154.116.98:60194).