May 27 18:14:25.970851 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 27 15:32:02 -00 2025
May 27 18:14:25.970895 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101
May 27 18:14:25.970905 kernel: BIOS-provided physical RAM map:
May 27 18:14:25.970912 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
May 27 18:14:25.970918 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
May 27 18:14:25.970925 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
May 27 18:14:25.970933 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007ffdafff] usable
May 27 18:14:25.970945 kernel: BIOS-e820: [mem 0x000000007ffdb000-0x000000007fffffff] reserved
May 27 18:14:25.970955 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 27 18:14:25.970982 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
May 27 18:14:25.970989 kernel: NX (Execute Disable) protection: active
May 27 18:14:25.970996 kernel: APIC: Static calls initialized
May 27 18:14:25.971003 kernel: SMBIOS 2.8 present.
May 27 18:14:25.971010 kernel: DMI: DigitalOcean Droplet/Droplet, BIOS 20171212 12/12/2017
May 27 18:14:25.971021 kernel: DMI: Memory slots populated: 1/1
May 27 18:14:25.971032 kernel: Hypervisor detected: KVM
May 27 18:14:25.971045 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 27 18:14:25.971053 kernel: kvm-clock: using sched offset of 4436243502 cycles
May 27 18:14:25.971062 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 27 18:14:25.971070 kernel: tsc: Detected 2494.140 MHz processor
May 27 18:14:25.971078 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 27 18:14:25.971087 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 27 18:14:25.971094 kernel: last_pfn = 0x7ffdb max_arch_pfn = 0x400000000
May 27 18:14:25.971106 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
May 27 18:14:25.971114 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 27 18:14:25.971122 kernel: ACPI: Early table checksum verification disabled
May 27 18:14:25.971130 kernel: ACPI: RSDP 0x00000000000F5950 000014 (v00 BOCHS )
May 27 18:14:25.971137 kernel: ACPI: RSDT 0x000000007FFE1986 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 18:14:25.971145 kernel: ACPI: FACP 0x000000007FFE176A 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 18:14:25.971153 kernel: ACPI: DSDT 0x000000007FFE0040 00172A (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 18:14:25.971161 kernel: ACPI: FACS 0x000000007FFE0000 000040
May 27 18:14:25.971169 kernel: ACPI: APIC 0x000000007FFE17DE 000080 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 18:14:25.971180 kernel: ACPI: HPET 0x000000007FFE185E 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 18:14:25.971187 kernel: ACPI: SRAT 0x000000007FFE1896 0000C8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 18:14:25.971196 kernel: ACPI: WAET 0x000000007FFE195E 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 27 18:14:25.971204 kernel: ACPI: Reserving FACP table memory at [mem 0x7ffe176a-0x7ffe17dd]
May 27 18:14:25.971211 kernel: ACPI: Reserving DSDT table memory at [mem 0x7ffe0040-0x7ffe1769]
May 27 18:14:25.971219 kernel: ACPI: Reserving FACS table memory at [mem 0x7ffe0000-0x7ffe003f]
May 27 18:14:25.971227 kernel: ACPI: Reserving APIC table memory at [mem 0x7ffe17de-0x7ffe185d]
May 27 18:14:25.971612 kernel: ACPI: Reserving HPET table memory at [mem 0x7ffe185e-0x7ffe1895]
May 27 18:14:25.971630 kernel: ACPI: Reserving SRAT table memory at [mem 0x7ffe1896-0x7ffe195d]
May 27 18:14:25.971638 kernel: ACPI: Reserving WAET table memory at [mem 0x7ffe195e-0x7ffe1985]
May 27 18:14:25.971647 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
May 27 18:14:25.971656 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
May 27 18:14:25.971664 kernel: NUMA: Node 0 [mem 0x00001000-0x0009ffff] + [mem 0x00100000-0x7ffdafff] -> [mem 0x00001000-0x7ffdafff]
May 27 18:14:25.971673 kernel: NODE_DATA(0) allocated [mem 0x7ffd3dc0-0x7ffdafff]
May 27 18:14:25.971684 kernel: Zone ranges:
May 27 18:14:25.971693 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 27 18:14:25.971701 kernel: DMA32 [mem 0x0000000001000000-0x000000007ffdafff]
May 27 18:14:25.971710 kernel: Normal empty
May 27 18:14:25.971718 kernel: Device empty
May 27 18:14:25.971727 kernel: Movable zone start for each node
May 27 18:14:25.971735 kernel: Early memory node ranges
May 27 18:14:25.971744 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
May 27 18:14:25.971752 kernel: node 0: [mem 0x0000000000100000-0x000000007ffdafff]
May 27 18:14:25.971763 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007ffdafff]
May 27 18:14:25.971772 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 27 18:14:25.971780 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
May 27 18:14:25.971789 kernel: On node 0, zone DMA32: 37 pages in unavailable ranges
May 27 18:14:25.971797 kernel: ACPI: PM-Timer IO Port: 0x608
May 27 18:14:25.971805 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 27 18:14:25.971821 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 27 18:14:25.971830 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 27 18:14:25.971840 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 27 18:14:25.971855 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 27 18:14:25.971870 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 27 18:14:25.971879 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 27 18:14:25.971887 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 27 18:14:25.971896 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 27 18:14:25.971904 kernel: TSC deadline timer available
May 27 18:14:25.971913 kernel: CPU topo: Max. logical packages: 1
May 27 18:14:25.971922 kernel: CPU topo: Max. logical dies: 1
May 27 18:14:25.971930 kernel: CPU topo: Max. dies per package: 1
May 27 18:14:25.971938 kernel: CPU topo: Max. threads per core: 1
May 27 18:14:25.971949 kernel: CPU topo: Num. cores per package: 2
May 27 18:14:25.971958 kernel: CPU topo: Num. threads per package: 2
May 27 18:14:25.971966 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
May 27 18:14:25.971976 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 27 18:14:25.971989 kernel: [mem 0x80000000-0xfeffbfff] available for PCI devices
May 27 18:14:25.972020 kernel: Booting paravirtualized kernel on KVM
May 27 18:14:25.972032 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 27 18:14:25.972044 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
May 27 18:14:25.972056 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
May 27 18:14:25.972073 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
May 27 18:14:25.972084 kernel: pcpu-alloc: [0] 0 1
May 27 18:14:25.972092 kernel: kvm-guest: PV spinlocks disabled, no host support
May 27 18:14:25.972102 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101
May 27 18:14:25.972111 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 27 18:14:25.972120 kernel: random: crng init done
May 27 18:14:25.972128 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 27 18:14:25.972137 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
May 27 18:14:25.972148 kernel: Fallback order for Node 0: 0
May 27 18:14:25.972156 kernel: Built 1 zonelists, mobility grouping on. Total pages: 524153
May 27 18:14:25.972164 kernel: Policy zone: DMA32
May 27 18:14:25.972173 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 27 18:14:25.972181 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 27 18:14:25.972189 kernel: Kernel/User page tables isolation: enabled
May 27 18:14:25.972198 kernel: ftrace: allocating 40081 entries in 157 pages
May 27 18:14:25.972206 kernel: ftrace: allocated 157 pages with 5 groups
May 27 18:14:25.972215 kernel: Dynamic Preempt: voluntary
May 27 18:14:25.972225 kernel: rcu: Preemptible hierarchical RCU implementation.
May 27 18:14:25.972253 kernel: rcu: RCU event tracing is enabled.
May 27 18:14:25.972263 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 27 18:14:25.972271 kernel: Trampoline variant of Tasks RCU enabled.
May 27 18:14:25.972280 kernel: Rude variant of Tasks RCU enabled.
May 27 18:14:25.972288 kernel: Tracing variant of Tasks RCU enabled.
May 27 18:14:25.972297 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 27 18:14:25.972316 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 27 18:14:25.972325 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 18:14:25.972340 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 18:14:25.972349 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 18:14:25.972357 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
May 27 18:14:25.972385 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 27 18:14:25.972397 kernel: Console: colour VGA+ 80x25
May 27 18:14:25.972407 kernel: printk: legacy console [tty0] enabled
May 27 18:14:25.972415 kernel: printk: legacy console [ttyS0] enabled
May 27 18:14:25.972423 kernel: ACPI: Core revision 20240827
May 27 18:14:25.972432 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 27 18:14:25.972452 kernel: APIC: Switch to symmetric I/O mode setup
May 27 18:14:25.972461 kernel: x2apic enabled
May 27 18:14:25.972470 kernel: APIC: Switched APIC routing to: physical x2apic
May 27 18:14:25.972482 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 27 18:14:25.972494 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x23f39a1d859, max_idle_ns: 440795326830 ns
May 27 18:14:25.972504 kernel: Calibrating delay loop (skipped) preset value.. 4988.28 BogoMIPS (lpj=2494140)
May 27 18:14:25.972513 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
May 27 18:14:25.972521 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
May 27 18:14:25.972531 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 27 18:14:25.972542 kernel: Spectre V2 : Mitigation: Retpolines
May 27 18:14:25.972551 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 27 18:14:25.972560 kernel: Spectre V2 : Enabling Restricted Speculation for firmware calls
May 27 18:14:25.972570 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 27 18:14:25.972581 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 27 18:14:25.972590 kernel: MDS: Mitigation: Clear CPU buffers
May 27 18:14:25.972599 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
May 27 18:14:25.972610 kernel: ITS: Mitigation: Aligned branch/return thunks
May 27 18:14:25.972619 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 27 18:14:25.972628 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 27 18:14:25.972637 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 27 18:14:25.972646 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 27 18:14:25.972655 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
May 27 18:14:25.972664 kernel: Freeing SMP alternatives memory: 32K
May 27 18:14:25.972673 kernel: pid_max: default: 32768 minimum: 301
May 27 18:14:25.972682 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 27 18:14:25.972694 kernel: landlock: Up and running.
May 27 18:14:25.972703 kernel: SELinux: Initializing.
May 27 18:14:25.972712 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 27 18:14:25.972721 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
May 27 18:14:25.972730 kernel: smpboot: CPU0: Intel DO-Regular (family: 0x6, model: 0x4f, stepping: 0x1)
May 27 18:14:25.972739 kernel: Performance Events: unsupported p6 CPU model 79 no PMU driver, software events only.
May 27 18:14:25.972747 kernel: signal: max sigframe size: 1776
May 27 18:14:25.972756 kernel: rcu: Hierarchical SRCU implementation.
May 27 18:14:25.972765 kernel: rcu: Max phase no-delay instances is 400.
May 27 18:14:25.972777 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 27 18:14:25.972786 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
May 27 18:14:25.972795 kernel: smp: Bringing up secondary CPUs ...
May 27 18:14:25.972803 kernel: smpboot: x86: Booting SMP configuration:
May 27 18:14:25.972815 kernel: .... node #0, CPUs: #1
May 27 18:14:25.972824 kernel: smp: Brought up 1 node, 2 CPUs
May 27 18:14:25.972833 kernel: smpboot: Total of 2 processors activated (9976.56 BogoMIPS)
May 27 18:14:25.972842 kernel: Memory: 1966904K/2096612K available (14336K kernel code, 2430K rwdata, 9952K rodata, 54416K init, 2552K bss, 125144K reserved, 0K cma-reserved)
May 27 18:14:25.972853 kernel: devtmpfs: initialized
May 27 18:14:25.972865 kernel: x86/mm: Memory block size: 128MB
May 27 18:14:25.972874 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 27 18:14:25.972883 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 27 18:14:25.972891 kernel: pinctrl core: initialized pinctrl subsystem
May 27 18:14:25.972900 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 27 18:14:25.972909 kernel: audit: initializing netlink subsys (disabled)
May 27 18:14:25.972918 kernel: audit: type=2000 audit(1748369662.764:1): state=initialized audit_enabled=0 res=1
May 27 18:14:25.972926 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 27 18:14:25.972935 kernel: thermal_sys: Registered thermal governor 'user_space'
May 27 18:14:25.972947 kernel: cpuidle: using governor menu
May 27 18:14:25.972956 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 27 18:14:25.972964 kernel: dca service started, version 1.12.1
May 27 18:14:25.972973 kernel: PCI: Using configuration type 1 for base access
May 27 18:14:25.972982 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 27 18:14:25.972991 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 27 18:14:25.973000 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 27 18:14:25.973009 kernel: ACPI: Added _OSI(Module Device)
May 27 18:14:25.973018 kernel: ACPI: Added _OSI(Processor Device)
May 27 18:14:25.973029 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 27 18:14:25.973038 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 27 18:14:25.973061 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 27 18:14:25.973074 kernel: ACPI: Interpreter enabled
May 27 18:14:25.973087 kernel: ACPI: PM: (supports S0 S5)
May 27 18:14:25.973100 kernel: ACPI: Using IOAPIC for interrupt routing
May 27 18:14:25.973112 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 27 18:14:25.973121 kernel: PCI: Using E820 reservations for host bridge windows
May 27 18:14:25.973131 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
May 27 18:14:25.973143 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 27 18:14:25.974754 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
May 27 18:14:25.974872 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
May 27 18:14:25.974966 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
May 27 18:14:25.974978 kernel: acpiphp: Slot [3] registered
May 27 18:14:25.974988 kernel: acpiphp: Slot [4] registered
May 27 18:14:25.974996 kernel: acpiphp: Slot [5] registered
May 27 18:14:25.975011 kernel: acpiphp: Slot [6] registered
May 27 18:14:25.975020 kernel: acpiphp: Slot [7] registered
May 27 18:14:25.975029 kernel: acpiphp: Slot [8] registered
May 27 18:14:25.975038 kernel: acpiphp: Slot [9] registered
May 27 18:14:25.975047 kernel: acpiphp: Slot [10] registered
May 27 18:14:25.975056 kernel: acpiphp: Slot [11] registered
May 27 18:14:25.975064 kernel: acpiphp: Slot [12] registered
May 27 18:14:25.975073 kernel: acpiphp: Slot [13] registered
May 27 18:14:25.975085 kernel: acpiphp: Slot [14] registered
May 27 18:14:25.975099 kernel: acpiphp: Slot [15] registered
May 27 18:14:25.975113 kernel: acpiphp: Slot [16] registered
May 27 18:14:25.975122 kernel: acpiphp: Slot [17] registered
May 27 18:14:25.975131 kernel: acpiphp: Slot [18] registered
May 27 18:14:25.975140 kernel: acpiphp: Slot [19] registered
May 27 18:14:25.975149 kernel: acpiphp: Slot [20] registered
May 27 18:14:25.975158 kernel: acpiphp: Slot [21] registered
May 27 18:14:25.975167 kernel: acpiphp: Slot [22] registered
May 27 18:14:25.975175 kernel: acpiphp: Slot [23] registered
May 27 18:14:25.975184 kernel: acpiphp: Slot [24] registered
May 27 18:14:25.975196 kernel: acpiphp: Slot [25] registered
May 27 18:14:25.975205 kernel: acpiphp: Slot [26] registered
May 27 18:14:25.975213 kernel: acpiphp: Slot [27] registered
May 27 18:14:25.975222 kernel: acpiphp: Slot [28] registered
May 27 18:14:25.975231 kernel: acpiphp: Slot [29] registered
May 27 18:14:25.975253 kernel: acpiphp: Slot [30] registered
May 27 18:14:25.975266 kernel: acpiphp: Slot [31] registered
May 27 18:14:25.975278 kernel: PCI host bridge to bus 0000:00
May 27 18:14:25.975441 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 27 18:14:25.975554 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 27 18:14:25.975653 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 27 18:14:25.975736 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff window]
May 27 18:14:25.975817 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x17fffffff window]
May 27 18:14:25.975898 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 27 18:14:25.976051 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
May 27 18:14:25.976168 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
May 27 18:14:25.976426 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
May 27 18:14:25.976567 kernel: pci 0000:00:01.1: BAR 4 [io 0xc1e0-0xc1ef]
May 27 18:14:25.976719 kernel: pci 0000:00:01.1: BAR 0 [io 0x01f0-0x01f7]: legacy IDE quirk
May 27 18:14:25.976859 kernel: pci 0000:00:01.1: BAR 1 [io 0x03f6]: legacy IDE quirk
May 27 18:14:25.976994 kernel: pci 0000:00:01.1: BAR 2 [io 0x0170-0x0177]: legacy IDE quirk
May 27 18:14:25.977181 kernel: pci 0000:00:01.1: BAR 3 [io 0x0376]: legacy IDE quirk
May 27 18:14:25.977314 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
May 27 18:14:25.977421 kernel: pci 0000:00:01.2: BAR 4 [io 0xc180-0xc19f]
May 27 18:14:25.977529 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
May 27 18:14:25.977622 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
May 27 18:14:25.977718 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
May 27 18:14:25.977824 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
May 27 18:14:25.977925 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
May 27 18:14:25.978028 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
May 27 18:14:25.978140 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfebf0000-0xfebf0fff]
May 27 18:14:25.978249 kernel: pci 0000:00:02.0: ROM [mem 0xfebe0000-0xfebeffff pref]
May 27 18:14:25.978346 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 27 18:14:25.978471 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
May 27 18:14:25.978577 kernel: pci 0000:00:03.0: BAR 0 [io 0xc1a0-0xc1bf]
May 27 18:14:25.978711 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebf1000-0xfebf1fff]
May 27 18:14:25.978828 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
May 27 18:14:25.978979 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
May 27 18:14:25.979112 kernel: pci 0000:00:04.0: BAR 0 [io 0xc1c0-0xc1df]
May 27 18:14:25.979210 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebf2000-0xfebf2fff]
May 27 18:14:25.979319 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
May 27 18:14:25.979423 kernel: pci 0000:00:05.0: [1af4:1004] type 00 class 0x010000 conventional PCI endpoint
May 27 18:14:25.979525 kernel: pci 0000:00:05.0: BAR 0 [io 0xc100-0xc13f]
May 27 18:14:25.979618 kernel: pci 0000:00:05.0: BAR 1 [mem 0xfebf3000-0xfebf3fff]
May 27 18:14:25.979710 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
May 27 18:14:25.979812 kernel: pci 0000:00:06.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
May 27 18:14:25.979904 kernel: pci 0000:00:06.0: BAR 0 [io 0xc000-0xc07f]
May 27 18:14:25.979995 kernel: pci 0000:00:06.0: BAR 1 [mem 0xfebf4000-0xfebf4fff]
May 27 18:14:25.980089 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
May 27 18:14:25.980198 kernel: pci 0000:00:07.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
May 27 18:14:25.980348 kernel: pci 0000:00:07.0: BAR 0 [io 0xc080-0xc0ff]
May 27 18:14:25.980444 kernel: pci 0000:00:07.0: BAR 1 [mem 0xfebf5000-0xfebf5fff]
May 27 18:14:25.980538 kernel: pci 0000:00:07.0: BAR 4 [mem 0xfe814000-0xfe817fff 64bit pref]
May 27 18:14:25.980659 kernel: pci 0000:00:08.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
May 27 18:14:25.980754 kernel: pci 0000:00:08.0: BAR 0 [io 0xc140-0xc17f]
May 27 18:14:25.980889 kernel: pci 0000:00:08.0: BAR 4 [mem 0xfe818000-0xfe81bfff 64bit pref]
May 27 18:14:25.980907 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 27 18:14:25.980920 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 27 18:14:25.980933 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 27 18:14:25.980946 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 27 18:14:25.980960 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
May 27 18:14:25.980973 kernel: iommu: Default domain type: Translated
May 27 18:14:25.980983 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 27 18:14:25.980996 kernel: PCI: Using ACPI for IRQ routing
May 27 18:14:25.981006 kernel: PCI: pci_cache_line_size set to 64 bytes
May 27 18:14:25.981015 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
May 27 18:14:25.981024 kernel: e820: reserve RAM buffer [mem 0x7ffdb000-0x7fffffff]
May 27 18:14:25.981146 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
May 27 18:14:25.981278 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
May 27 18:14:25.981371 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 27 18:14:25.981383 kernel: vgaarb: loaded
May 27 18:14:25.981393 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 27 18:14:25.981406 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 27 18:14:25.981415 kernel: clocksource: Switched to clocksource kvm-clock
May 27 18:14:25.981424 kernel: VFS: Disk quotas dquot_6.6.0
May 27 18:14:25.981434 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 27 18:14:25.981444 kernel: pnp: PnP ACPI init
May 27 18:14:25.981457 kernel: pnp: PnP ACPI: found 4 devices
May 27 18:14:25.981469 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 27 18:14:25.981483 kernel: NET: Registered PF_INET protocol family
May 27 18:14:25.981495 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 27 18:14:25.981511 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
May 27 18:14:25.981524 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 27 18:14:25.981538 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 27 18:14:25.981549 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
May 27 18:14:25.981558 kernel: TCP: Hash tables configured (established 16384 bind 16384)
May 27 18:14:25.981567 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 27 18:14:25.981576 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
May 27 18:14:25.981586 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 27 18:14:25.981595 kernel: NET: Registered PF_XDP protocol family
May 27 18:14:25.981697 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 27 18:14:25.981782 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 27 18:14:25.981878 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 27 18:14:25.981960 kernel: pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff window]
May 27 18:14:25.982043 kernel: pci_bus 0000:00: resource 8 [mem 0x100000000-0x17fffffff window]
May 27 18:14:25.982141 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
May 27 18:14:25.982249 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
May 27 18:14:25.982266 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
May 27 18:14:25.982367 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x720 took 27956 usecs
May 27 18:14:25.982380 kernel: PCI: CLS 0 bytes, default 64
May 27 18:14:25.982389 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
May 27 18:14:25.982398 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x23f39a1d859, max_idle_ns: 440795326830 ns
May 27 18:14:25.982408 kernel: Initialise system trusted keyrings
May 27 18:14:25.982417 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
May 27 18:14:25.982426 kernel: Key type asymmetric registered
May 27 18:14:25.982434 kernel: Asymmetric key parser 'x509' registered
May 27 18:14:25.982446 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 27 18:14:25.982455 kernel: io scheduler mq-deadline registered
May 27 18:14:25.982464 kernel: io scheduler kyber registered
May 27 18:14:25.982476 kernel: io scheduler bfq registered
May 27 18:14:25.982486 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 27 18:14:25.982495 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
May 27 18:14:25.982504 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
May 27 18:14:25.982513 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
May 27 18:14:25.982522 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 27 18:14:25.982533 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 27 18:14:25.982542 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 27 18:14:25.982551 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 27 18:14:25.982560 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 27 18:14:25.982672 kernel: rtc_cmos 00:03: RTC can wake from S4
May 27 18:14:25.982761 kernel: rtc_cmos 00:03: registered as rtc0
May 27 18:14:25.982846 kernel: rtc_cmos 00:03: setting system clock to 2025-05-27T18:14:25 UTC (1748369665)
May 27 18:14:25.982932 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram
May 27 18:14:25.982947 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 27 18:14:25.982956 kernel: intel_pstate: CPU model not supported
May 27 18:14:25.982965 kernel: NET: Registered PF_INET6 protocol family
May 27 18:14:25.982974 kernel: Segment Routing with IPv6
May 27 18:14:25.982983 kernel: In-situ OAM (IOAM) with IPv6
May 27 18:14:25.982992 kernel: NET: Registered PF_PACKET protocol family
May 27 18:14:25.983000 kernel: Key type dns_resolver registered
May 27 18:14:25.983009 kernel: IPI shorthand broadcast: enabled
May 27 18:14:25.983018 kernel: sched_clock: Marking stable (3268003852, 85125423)->(3371359589, -18230314)
May 27 18:14:25.983030 kernel: registered taskstats version 1
May 27 18:14:25.983039 kernel: Loading compiled-in X.509 certificates
May 27 18:14:25.983048 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 9507e5c390e18536b38d58c90da64baf0ac9837c'
May 27 18:14:25.983057 kernel: Demotion targets for Node 0: null
May 27 18:14:25.983066 kernel: Key type .fscrypt registered
May 27 18:14:25.983075 kernel: Key type fscrypt-provisioning registered
May 27 18:14:25.983102 kernel: ima: No TPM chip found, activating TPM-bypass!
May 27 18:14:25.983114 kernel: ima: Allocated hash algorithm: sha1
May 27 18:14:25.983123 kernel: ima: No architecture policies found
May 27 18:14:25.983135 kernel: clk: Disabling unused clocks
May 27 18:14:25.983144 kernel: Warning: unable to open an initial console.
May 27 18:14:25.983154 kernel: Freeing unused kernel image (initmem) memory: 54416K
May 27 18:14:25.983163 kernel: Write protecting the kernel read-only data: 24576k
May 27 18:14:25.983172 kernel: Freeing unused kernel image (rodata/data gap) memory: 288K
May 27 18:14:25.983182 kernel: Run /init as init process
May 27 18:14:25.983191 kernel: with arguments:
May 27 18:14:25.983200 kernel: /init
May 27 18:14:25.983209 kernel: with environment:
May 27 18:14:25.983221 kernel: HOME=/
May 27 18:14:25.983230 kernel: TERM=linux
May 27 18:14:25.983249 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 27 18:14:25.983260 systemd[1]: Successfully made /usr/ read-only.
May 27 18:14:25.983273 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 18:14:25.983284 systemd[1]: Detected virtualization kvm.
May 27 18:14:25.983294 systemd[1]: Detected architecture x86-64.
May 27 18:14:25.983306 systemd[1]: Running in initrd.
May 27 18:14:25.983316 systemd[1]: No hostname configured, using default hostname.
May 27 18:14:25.983326 systemd[1]: Hostname set to .
May 27 18:14:25.983336 systemd[1]: Initializing machine ID from VM UUID.
May 27 18:14:25.983346 systemd[1]: Queued start job for default target initrd.target.
May 27 18:14:25.983356 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 18:14:25.983365 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 18:14:25.983376 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 27 18:14:25.983388 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 18:14:25.983398 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 27 18:14:25.983411 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 27 18:14:25.983423 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 27 18:14:25.983435 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 27 18:14:25.983445 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 18:14:25.983455 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 18:14:25.983465 systemd[1]: Reached target paths.target - Path Units.
May 27 18:14:25.983476 systemd[1]: Reached target slices.target - Slice Units.
May 27 18:14:25.983486 systemd[1]: Reached target swap.target - Swaps.
May 27 18:14:25.983496 systemd[1]: Reached target timers.target - Timer Units.
May 27 18:14:25.983506 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 27 18:14:25.983518 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 18:14:25.983532 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 27 18:14:25.983548 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 27 18:14:25.983562 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 18:14:25.983576 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 18:14:25.983590 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 18:14:25.983602 systemd[1]: Reached target sockets.target - Socket Units.
May 27 18:14:25.983616 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 27 18:14:25.983631 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 18:14:25.983650 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 27 18:14:25.983661 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 27 18:14:25.983671 systemd[1]: Starting systemd-fsck-usr.service...
May 27 18:14:25.983681 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 18:14:25.983691 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 18:14:25.983701 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 18:14:25.983711 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 27 18:14:25.983724 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 18:14:25.983734 systemd[1]: Finished systemd-fsck-usr.service.
May 27 18:14:25.983745 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 18:14:25.983808 systemd-journald[213]: Collecting audit messages is disabled.
May 27 18:14:25.983835 systemd-journald[213]: Journal started
May 27 18:14:25.983857 systemd-journald[213]: Runtime Journal (/run/log/journal/20ef69403ab5410498843b62749a2169) is 4.9M, max 39.5M, 34.6M free.
May 27 18:14:25.962407 systemd-modules-load[214]: Inserted module 'overlay'
May 27 18:14:25.991196 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 18:14:26.000365 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 18:14:26.002919 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 18:14:26.039830 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 27 18:14:26.039868 kernel: Bridge firewalling registered
May 27 18:14:26.004396 systemd-modules-load[214]: Inserted module 'br_netfilter'
May 27 18:14:26.048621 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 18:14:26.049704 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 18:14:26.054368 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 27 18:14:26.057387 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 18:14:26.059449 systemd-tmpfiles[227]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
May 27 18:14:26.061402 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 18:14:26.072430 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 18:14:26.082056 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 18:14:26.085367 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 18:14:26.097618 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 18:14:26.100842 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 18:14:26.104391 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 18:14:26.132090 systemd-resolved[243]: Positive Trust Anchors: May 27 18:14:26.132109 systemd-resolved[243]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 18:14:26.132153 systemd-resolved[243]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 18:14:26.135100 systemd-resolved[243]: Defaulting to hostname 'linux'. May 27 18:14:26.139924 dracut-cmdline[252]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=digitalocean verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101 May 27 18:14:26.136397 systemd[1]: Started systemd-resolved.service - Network Name Resolution. 
May 27 18:14:26.137885 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 18:14:26.249307 kernel: SCSI subsystem initialized May 27 18:14:26.260279 kernel: Loading iSCSI transport class v2.0-870. May 27 18:14:26.275275 kernel: iscsi: registered transport (tcp) May 27 18:14:26.299285 kernel: iscsi: registered transport (qla4xxx) May 27 18:14:26.299371 kernel: QLogic iSCSI HBA Driver May 27 18:14:26.323931 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 18:14:26.346192 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 18:14:26.348174 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 18:14:26.412385 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 18:14:26.415629 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 27 18:14:26.473299 kernel: raid6: avx2x4 gen() 16275 MB/s May 27 18:14:26.490333 kernel: raid6: avx2x2 gen() 17131 MB/s May 27 18:14:26.507501 kernel: raid6: avx2x1 gen() 11566 MB/s May 27 18:14:26.507650 kernel: raid6: using algorithm avx2x2 gen() 17131 MB/s May 27 18:14:26.525421 kernel: raid6: .... xor() 17187 MB/s, rmw enabled May 27 18:14:26.525581 kernel: raid6: using avx2x2 recovery algorithm May 27 18:14:26.549324 kernel: xor: automatically using best checksumming function avx May 27 18:14:26.740303 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 18:14:26.747891 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 18:14:26.750334 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 18:14:26.786386 systemd-udevd[460]: Using default interface naming scheme 'v255'. May 27 18:14:26.796721 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. 
May 27 18:14:26.800886 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 18:14:26.839109 dracut-pre-trigger[470]: rd.md=0: removing MD RAID activation May 27 18:14:26.871857 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 18:14:26.873977 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 18:14:26.947702 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 18:14:26.950697 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 27 18:14:27.060104 kernel: virtio_blk virtio4: 1/0/0 default/read/poll queues May 27 18:14:27.062788 kernel: ACPI: bus type USB registered May 27 18:14:27.062865 kernel: usbcore: registered new interface driver usbfs May 27 18:14:27.063257 kernel: libata version 3.00 loaded. May 27 18:14:27.063291 kernel: virtio_scsi virtio3: 2/0/0 default/read/poll queues May 27 18:14:27.064784 kernel: usbcore: registered new interface driver hub May 27 18:14:27.068262 kernel: ata_piix 0000:00:01.1: version 2.13 May 27 18:14:27.070259 kernel: scsi host0: Virtio SCSI HBA May 27 18:14:27.071581 kernel: scsi host1: ata_piix May 27 18:14:27.071638 kernel: usbcore: registered new device driver usb May 27 18:14:27.077261 kernel: virtio_blk virtio4: [vda] 125829120 512-byte logical blocks (64.4 GB/60.0 GiB) May 27 18:14:27.079927 kernel: scsi host2: ata_piix May 27 18:14:27.080133 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc1e0 irq 14 lpm-pol 0 May 27 18:14:27.080148 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc1e8 irq 15 lpm-pol 0 May 27 18:14:27.086260 kernel: cryptd: max_cpu_qlen set to 1000 May 27 18:14:27.099721 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 27 18:14:27.099788 kernel: GPT:9289727 != 125829119 May 27 18:14:27.099807 kernel: GPT:Alternate GPT header not at the end of the disk. 
May 27 18:14:27.099823 kernel: GPT:9289727 != 125829119 May 27 18:14:27.100456 kernel: GPT: Use GNU Parted to correct GPT errors. May 27 18:14:27.101629 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 27 18:14:27.113890 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 18:14:27.114577 kernel: virtio_blk virtio5: 1/0/0 default/read/poll queues May 27 18:14:27.114067 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 18:14:27.116567 kernel: virtio_blk virtio5: [vdb] 980 512-byte logical blocks (502 kB/490 KiB) May 27 18:14:27.117267 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 18:14:27.121455 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 18:14:27.122948 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 18:14:27.182131 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 18:14:27.246469 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 May 27 18:14:27.256266 kernel: AES CTR mode by8 optimization enabled May 27 18:14:27.311656 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 27 18:14:27.317623 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller May 27 18:14:27.317857 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 May 27 18:14:27.318000 kernel: uhci_hcd 0000:00:01.2: detected 2 ports May 27 18:14:27.318986 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c180 May 27 18:14:27.319175 kernel: hub 1-0:1.0: USB hub found May 27 18:14:27.319360 kernel: hub 1-0:1.0: 2 ports detected May 27 18:14:27.330949 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 27 18:14:27.331969 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
May 27 18:14:27.352666 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 27 18:14:27.353143 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 27 18:14:27.363705 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 27 18:14:27.364257 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 18:14:27.364884 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 18:14:27.365829 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 18:14:27.367629 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 18:14:27.368727 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 18:14:27.391284 disk-uuid[615]: Primary Header is updated. May 27 18:14:27.391284 disk-uuid[615]: Secondary Entries is updated. May 27 18:14:27.391284 disk-uuid[615]: Secondary Header is updated. May 27 18:14:27.396353 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 27 18:14:27.408268 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 27 18:14:28.423337 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 27 18:14:28.423456 disk-uuid[619]: The operation has completed successfully. May 27 18:14:28.491109 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 18:14:28.491306 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 18:14:28.521284 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 18:14:28.541438 sh[634]: Success May 27 18:14:28.563273 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
May 27 18:14:28.563358 kernel: device-mapper: uevent: version 1.0.3 May 27 18:14:28.586262 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 18:14:28.595289 kernel: device-mapper: verity: sha256 using shash "sha256-avx2" May 27 18:14:28.662871 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 18:14:28.665371 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 18:14:28.683541 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 27 18:14:28.698273 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 18:14:28.701276 kernel: BTRFS: device fsid 7caef027-0915-4c01-a3d5-28eff70f7ebd devid 1 transid 39 /dev/mapper/usr (253:0) scanned by mount (646) May 27 18:14:28.701349 kernel: BTRFS info (device dm-0): first mount of filesystem 7caef027-0915-4c01-a3d5-28eff70f7ebd May 27 18:14:28.703286 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 27 18:14:28.704452 kernel: BTRFS info (device dm-0): using free-space-tree May 27 18:14:28.714602 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 18:14:28.715478 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 18:14:28.715984 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 27 18:14:28.716888 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 18:14:28.720224 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
May 27 18:14:28.746270 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (674) May 27 18:14:28.750490 kernel: BTRFS info (device vda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 18:14:28.750572 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 27 18:14:28.750592 kernel: BTRFS info (device vda6): using free-space-tree May 27 18:14:28.764267 kernel: BTRFS info (device vda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 18:14:28.766080 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 18:14:28.768151 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 18:14:28.895844 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 18:14:28.902556 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 18:14:28.945520 ignition[722]: Ignition 2.21.0 May 27 18:14:28.946275 ignition[722]: Stage: fetch-offline May 27 18:14:28.946359 ignition[722]: no configs at "/usr/lib/ignition/base.d" May 27 18:14:28.946375 ignition[722]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean" May 27 18:14:28.946519 ignition[722]: parsed url from cmdline: "" May 27 18:14:28.946523 ignition[722]: no config URL provided May 27 18:14:28.946530 ignition[722]: reading system config file "/usr/lib/ignition/user.ign" May 27 18:14:28.946538 ignition[722]: no config at "/usr/lib/ignition/user.ign" May 27 18:14:28.946544 ignition[722]: failed to fetch config: resource requires networking May 27 18:14:28.947511 ignition[722]: Ignition finished successfully May 27 18:14:28.951691 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
May 27 18:14:28.956142 systemd-networkd[817]: lo: Link UP May 27 18:14:28.956155 systemd-networkd[817]: lo: Gained carrier May 27 18:14:28.958700 systemd-networkd[817]: Enumeration completed May 27 18:14:28.959050 systemd-networkd[817]: eth0: found matching network '/usr/lib/systemd/network/yy-digitalocean.network', based on potentially unpredictable interface name. May 27 18:14:28.959054 systemd-networkd[817]: eth0: Configuring with /usr/lib/systemd/network/yy-digitalocean.network. May 27 18:14:28.959704 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 18:14:28.960171 systemd-networkd[817]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 18:14:28.960178 systemd-networkd[817]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 18:14:28.960944 systemd-networkd[817]: eth0: Link UP May 27 18:14:28.960950 systemd-networkd[817]: eth0: Gained carrier May 27 18:14:28.960964 systemd-networkd[817]: eth0: found matching network '/usr/lib/systemd/network/yy-digitalocean.network', based on potentially unpredictable interface name. May 27 18:14:28.962144 systemd[1]: Reached target network.target - Network. May 27 18:14:28.964939 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 27 18:14:28.966643 systemd-networkd[817]: eth1: Link UP May 27 18:14:28.966648 systemd-networkd[817]: eth1: Gained carrier May 27 18:14:28.966666 systemd-networkd[817]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
May 27 18:14:28.981506 systemd-networkd[817]: eth0: DHCPv4 address 146.190.127.126/20, gateway 146.190.112.1 acquired from 169.254.169.253 May 27 18:14:28.985385 systemd-networkd[817]: eth1: DHCPv4 address 10.124.0.33/20 acquired from 169.254.169.253 May 27 18:14:29.004216 ignition[825]: Ignition 2.21.0 May 27 18:14:29.004230 ignition[825]: Stage: fetch May 27 18:14:29.004657 ignition[825]: no configs at "/usr/lib/ignition/base.d" May 27 18:14:29.004675 ignition[825]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean" May 27 18:14:29.004811 ignition[825]: parsed url from cmdline: "" May 27 18:14:29.004817 ignition[825]: no config URL provided May 27 18:14:29.004823 ignition[825]: reading system config file "/usr/lib/ignition/user.ign" May 27 18:14:29.004833 ignition[825]: no config at "/usr/lib/ignition/user.ign" May 27 18:14:29.004864 ignition[825]: GET http://169.254.169.254/metadata/v1/user-data: attempt #1 May 27 18:14:29.038794 ignition[825]: GET result: OK May 27 18:14:29.039025 ignition[825]: parsing config with SHA512: 466b7ed103d37685268c5d108137b3a5aaec582aa4b297e8e38485dc257ac6a61f1269b2a577c985220c4a2bd81110dd6d56f6a86b3c9973b36226c1a520156d May 27 18:14:29.043620 unknown[825]: fetched base config from "system" May 27 18:14:29.043631 unknown[825]: fetched base config from "system" May 27 18:14:29.043990 ignition[825]: fetch: fetch complete May 27 18:14:29.043638 unknown[825]: fetched user config from "digitalocean" May 27 18:14:29.043998 ignition[825]: fetch: fetch passed May 27 18:14:29.044076 ignition[825]: Ignition finished successfully May 27 18:14:29.046245 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 27 18:14:29.051759 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
May 27 18:14:29.089570 ignition[832]: Ignition 2.21.0 May 27 18:14:29.089582 ignition[832]: Stage: kargs May 27 18:14:29.089764 ignition[832]: no configs at "/usr/lib/ignition/base.d" May 27 18:14:29.089775 ignition[832]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean" May 27 18:14:29.091436 ignition[832]: kargs: kargs passed May 27 18:14:29.091504 ignition[832]: Ignition finished successfully May 27 18:14:29.092872 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 18:14:29.095542 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 18:14:29.129572 ignition[838]: Ignition 2.21.0 May 27 18:14:29.129589 ignition[838]: Stage: disks May 27 18:14:29.129800 ignition[838]: no configs at "/usr/lib/ignition/base.d" May 27 18:14:29.129814 ignition[838]: no config dir at "/usr/lib/ignition/base.platform.d/digitalocean" May 27 18:14:29.132025 ignition[838]: disks: disks passed May 27 18:14:29.132096 ignition[838]: Ignition finished successfully May 27 18:14:29.133913 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 18:14:29.134618 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 18:14:29.135112 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 18:14:29.135847 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 18:14:29.136652 systemd[1]: Reached target sysinit.target - System Initialization. May 27 18:14:29.137360 systemd[1]: Reached target basic.target - Basic System. May 27 18:14:29.139301 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 18:14:29.171232 systemd-fsck[847]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 27 18:14:29.175374 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 18:14:29.177299 systemd[1]: Mounting sysroot.mount - /sysroot... 
May 27 18:14:29.309276 kernel: EXT4-fs (vda9): mounted filesystem bf93e767-f532-4480-b210-a196f7ac181e r/w with ordered data mode. Quota mode: none. May 27 18:14:29.310618 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 18:14:29.312016 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 18:14:29.314570 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 18:14:29.317328 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 18:14:29.319464 systemd[1]: Starting flatcar-afterburn-network.service - Flatcar Afterburn network service... May 27 18:14:29.327405 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 27 18:14:29.327929 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 18:14:29.328030 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 18:14:29.334138 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 18:14:29.339415 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 27 18:14:29.344292 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (855) May 27 18:14:29.348282 kernel: BTRFS info (device vda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 18:14:29.348352 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 27 18:14:29.348371 kernel: BTRFS info (device vda6): using free-space-tree May 27 18:14:29.359878 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 18:14:29.417451 coreos-metadata[858]: May 27 18:14:29.417 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1 May 27 18:14:29.428224 initrd-setup-root[885]: cut: /sysroot/etc/passwd: No such file or directory May 27 18:14:29.430336 coreos-metadata[858]: May 27 18:14:29.429 INFO Fetch successful May 27 18:14:29.441078 coreos-metadata[857]: May 27 18:14:29.440 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1 May 27 18:14:29.442653 initrd-setup-root[892]: cut: /sysroot/etc/group: No such file or directory May 27 18:14:29.443393 coreos-metadata[858]: May 27 18:14:29.442 INFO wrote hostname ci-4344.0.0-6-bb492ec913 to /sysroot/etc/hostname May 27 18:14:29.443458 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 18:14:29.450225 initrd-setup-root[900]: cut: /sysroot/etc/shadow: No such file or directory May 27 18:14:29.452454 coreos-metadata[857]: May 27 18:14:29.452 INFO Fetch successful May 27 18:14:29.459698 initrd-setup-root[907]: cut: /sysroot/etc/gshadow: No such file or directory May 27 18:14:29.462771 systemd[1]: flatcar-afterburn-network.service: Deactivated successfully. May 27 18:14:29.463832 systemd[1]: Finished flatcar-afterburn-network.service - Flatcar Afterburn network service. May 27 18:14:29.583747 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 18:14:29.586119 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 18:14:29.588449 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 18:14:29.612268 kernel: BTRFS info (device vda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 18:14:29.629287 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
May 27 18:14:29.654277 ignition[977]: INFO : Ignition 2.21.0 May 27 18:14:29.654277 ignition[977]: INFO : Stage: mount May 27 18:14:29.654277 ignition[977]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 18:14:29.654277 ignition[977]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean" May 27 18:14:29.657727 ignition[977]: INFO : mount: mount passed May 27 18:14:29.657727 ignition[977]: INFO : Ignition finished successfully May 27 18:14:29.659681 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 18:14:29.662194 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 18:14:29.699677 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 18:14:29.702284 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 18:14:29.728467 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 (254:6) scanned by mount (989) May 27 18:14:29.728544 kernel: BTRFS info (device vda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 18:14:29.730533 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 27 18:14:29.730597 kernel: BTRFS info (device vda6): using free-space-tree May 27 18:14:29.736958 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 18:14:29.777656 ignition[1005]: INFO : Ignition 2.21.0 May 27 18:14:29.777656 ignition[1005]: INFO : Stage: files May 27 18:14:29.778799 ignition[1005]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 18:14:29.778799 ignition[1005]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean" May 27 18:14:29.780153 ignition[1005]: DEBUG : files: compiled without relabeling support, skipping May 27 18:14:29.782458 ignition[1005]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 18:14:29.782458 ignition[1005]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 18:14:29.786444 ignition[1005]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 18:14:29.787536 ignition[1005]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 18:14:29.788777 unknown[1005]: wrote ssh authorized keys file for user: core May 27 18:14:29.789792 ignition[1005]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 18:14:29.791897 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" May 27 18:14:29.792865 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 May 27 18:14:29.934710 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 18:14:30.104839 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file 
"/sysroot/home/core/install.sh" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 18:14:30.105984 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 18:14:30.115366 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 18:14:30.115366 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 18:14:30.115366 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 18:14:30.115366 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 18:14:30.115366 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 18:14:30.115366 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(a): 
GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 May 27 18:14:30.561486 systemd-networkd[817]: eth0: Gained IPv6LL May 27 18:14:30.689497 systemd-networkd[817]: eth1: Gained IPv6LL May 27 18:14:31.092372 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 18:14:31.435742 ignition[1005]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 18:14:31.435742 ignition[1005]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 18:14:31.437963 ignition[1005]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 18:14:31.440702 ignition[1005]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 18:14:31.440702 ignition[1005]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 18:14:31.440702 ignition[1005]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 18:14:31.443659 ignition[1005]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 18:14:31.443659 ignition[1005]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 18:14:31.443659 ignition[1005]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 18:14:31.443659 ignition[1005]: INFO : files: files passed May 27 18:14:31.443659 ignition[1005]: INFO : Ignition finished successfully May 27 18:14:31.444337 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 18:14:31.446372 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... 
May 27 18:14:31.450564 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 27 18:14:31.462680 systemd[1]: ignition-quench.service: Deactivated successfully.
May 27 18:14:31.462861 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 27 18:14:31.474976 initrd-setup-root-after-ignition[1036]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 18:14:31.474976 initrd-setup-root-after-ignition[1036]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 27 18:14:31.476910 initrd-setup-root-after-ignition[1040]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 18:14:31.478800 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 18:14:31.480145 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 27 18:14:31.482049 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 27 18:14:31.542502 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 27 18:14:31.542669 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 27 18:14:31.543932 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 27 18:14:31.544350 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 27 18:14:31.545131 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 27 18:14:31.546065 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 27 18:14:31.573896 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 18:14:31.576575 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 27 18:14:31.603407 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 27 18:14:31.604574 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 18:14:31.605859 systemd[1]: Stopped target timers.target - Timer Units.
May 27 18:14:31.606742 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 27 18:14:31.607342 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 18:14:31.608563 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 27 18:14:31.609608 systemd[1]: Stopped target basic.target - Basic System.
May 27 18:14:31.610481 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 27 18:14:31.611035 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 27 18:14:31.611808 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 27 18:14:31.612578 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 27 18:14:31.613369 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 27 18:14:31.614146 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 18:14:31.614832 systemd[1]: Stopped target sysinit.target - System Initialization.
May 27 18:14:31.615644 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 27 18:14:31.616224 systemd[1]: Stopped target swap.target - Swaps.
May 27 18:14:31.616808 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 27 18:14:31.617030 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 27 18:14:31.617840 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 27 18:14:31.618674 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 18:14:31.619495 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 27 18:14:31.619625 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 18:14:31.620369 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 27 18:14:31.620555 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 27 18:14:31.621734 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 27 18:14:31.621886 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 18:14:31.622769 systemd[1]: ignition-files.service: Deactivated successfully.
May 27 18:14:31.622921 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 27 18:14:31.623361 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
May 27 18:14:31.623456 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 27 18:14:31.626386 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 27 18:14:31.628130 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 27 18:14:31.629676 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 27 18:14:31.630312 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 18:14:31.631852 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 27 18:14:31.632357 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 18:14:31.640550 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 27 18:14:31.641169 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 27 18:14:31.660317 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 27 18:14:31.665815 ignition[1060]: INFO : Ignition 2.21.0
May 27 18:14:31.665815 ignition[1060]: INFO : Stage: umount
May 27 18:14:31.666847 ignition[1060]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 18:14:31.666847 ignition[1060]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/digitalocean"
May 27 18:14:31.666847 ignition[1060]: INFO : umount: umount passed
May 27 18:14:31.666847 ignition[1060]: INFO : Ignition finished successfully
May 27 18:14:31.669331 systemd[1]: ignition-mount.service: Deactivated successfully.
May 27 18:14:31.669485 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 27 18:14:31.670364 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 27 18:14:31.670463 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 27 18:14:31.671769 systemd[1]: ignition-disks.service: Deactivated successfully.
May 27 18:14:31.671870 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 27 18:14:31.672285 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 27 18:14:31.672330 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 27 18:14:31.673071 systemd[1]: ignition-fetch.service: Deactivated successfully.
May 27 18:14:31.673130 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
May 27 18:14:31.673834 systemd[1]: Stopped target network.target - Network.
May 27 18:14:31.674565 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 27 18:14:31.674702 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 27 18:14:31.675392 systemd[1]: Stopped target paths.target - Path Units.
May 27 18:14:31.676099 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 27 18:14:31.676274 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 18:14:31.676866 systemd[1]: Stopped target slices.target - Slice Units.
May 27 18:14:31.677688 systemd[1]: Stopped target sockets.target - Socket Units.
May 27 18:14:31.678579 systemd[1]: iscsid.socket: Deactivated successfully.
May 27 18:14:31.678647 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 27 18:14:31.679436 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 27 18:14:31.679490 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 18:14:31.680131 systemd[1]: ignition-setup.service: Deactivated successfully.
May 27 18:14:31.680209 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 27 18:14:31.680938 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 27 18:14:31.681081 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 27 18:14:31.681915 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 27 18:14:31.681983 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 27 18:14:31.682900 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 27 18:14:31.683557 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 27 18:14:31.691611 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 27 18:14:31.692224 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 27 18:14:31.696648 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 27 18:14:31.697516 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 27 18:14:31.698087 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 27 18:14:31.700057 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 27 18:14:31.701279 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 27 18:14:31.701857 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 27 18:14:31.701916 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 27 18:14:31.703965 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 27 18:14:31.704464 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 27 18:14:31.704550 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 18:14:31.705277 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 27 18:14:31.705349 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 27 18:14:31.705955 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 27 18:14:31.706013 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 27 18:14:31.706819 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 27 18:14:31.706885 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 18:14:31.707946 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 18:14:31.711989 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 27 18:14:31.712091 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 27 18:14:31.728746 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 27 18:14:31.728912 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 18:14:31.730054 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 27 18:14:31.730134 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 27 18:14:31.730916 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 27 18:14:31.730955 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 18:14:31.731568 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 27 18:14:31.731636 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 27 18:14:31.732727 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 27 18:14:31.732782 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 27 18:14:31.733392 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 27 18:14:31.733443 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 18:14:31.735469 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 27 18:14:31.736468 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 27 18:14:31.736544 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 27 18:14:31.738404 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 27 18:14:31.738466 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 18:14:31.740909 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 27 18:14:31.740984 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 18:14:31.741654 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 27 18:14:31.741713 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 18:14:31.742542 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 18:14:31.742599 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 18:14:31.745494 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 27 18:14:31.745584 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
May 27 18:14:31.745641 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 27 18:14:31.745694 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 18:14:31.757164 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 27 18:14:31.757467 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 27 18:14:31.766678 systemd[1]: network-cleanup.service: Deactivated successfully.
May 27 18:14:31.766838 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 27 18:14:31.768352 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 27 18:14:31.770189 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 27 18:14:31.794132 systemd[1]: Switching root.
May 27 18:14:31.834139 systemd-journald[213]: Journal stopped
May 27 18:14:33.153801 systemd-journald[213]: Received SIGTERM from PID 1 (systemd).
May 27 18:14:33.153886 kernel: SELinux: policy capability network_peer_controls=1
May 27 18:14:33.153906 kernel: SELinux: policy capability open_perms=1
May 27 18:14:33.153922 kernel: SELinux: policy capability extended_socket_class=1
May 27 18:14:33.153933 kernel: SELinux: policy capability always_check_network=0
May 27 18:14:33.153945 kernel: SELinux: policy capability cgroup_seclabel=1
May 27 18:14:33.153957 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 27 18:14:33.153969 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 27 18:14:33.153984 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 27 18:14:33.153996 kernel: SELinux: policy capability userspace_initial_context=0
May 27 18:14:33.154008 kernel: audit: type=1403 audit(1748369672.076:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 27 18:14:33.154027 systemd[1]: Successfully loaded SELinux policy in 62.365ms.
May 27 18:14:33.154050 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 13.296ms.
May 27 18:14:33.154067 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 18:14:33.154085 systemd[1]: Detected virtualization kvm.
May 27 18:14:33.154097 systemd[1]: Detected architecture x86-64.
May 27 18:14:33.154109 systemd[1]: Detected first boot.
May 27 18:14:33.154122 systemd[1]: Hostname set to .
May 27 18:14:33.154137 systemd[1]: Initializing machine ID from VM UUID.
May 27 18:14:33.154151 zram_generator::config[1104]: No configuration found.
May 27 18:14:33.154177 kernel: Guest personality initialized and is inactive
May 27 18:14:33.154194 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 27 18:14:33.154211 kernel: Initialized host personality
May 27 18:14:33.154227 kernel: NET: Registered PF_VSOCK protocol family
May 27 18:14:33.154265 systemd[1]: Populated /etc with preset unit settings.
May 27 18:14:33.154292 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 27 18:14:33.154305 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 27 18:14:33.154317 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 27 18:14:33.154336 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 27 18:14:33.154358 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 27 18:14:33.154380 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 27 18:14:33.154397 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 27 18:14:33.154410 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 27 18:14:33.154423 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 27 18:14:33.154440 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 27 18:14:33.154453 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 27 18:14:33.154465 systemd[1]: Created slice user.slice - User and Session Slice.
May 27 18:14:33.154481 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 18:14:33.154493 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 18:14:33.154506 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 27 18:14:33.154522 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 27 18:14:33.154536 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 27 18:14:33.154549 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 18:14:33.154565 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 27 18:14:33.154577 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 18:14:33.154593 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 18:14:33.154606 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 27 18:14:33.154618 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 27 18:14:33.154631 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 27 18:14:33.154644 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 27 18:14:33.154658 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 18:14:33.154674 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 18:14:33.154687 systemd[1]: Reached target slices.target - Slice Units.
May 27 18:14:33.154702 systemd[1]: Reached target swap.target - Swaps.
May 27 18:14:33.154715 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 27 18:14:33.154728 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 27 18:14:33.154741 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 27 18:14:33.154756 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 18:14:33.154771 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 18:14:33.154784 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 18:14:33.154796 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 27 18:14:33.154808 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 27 18:14:33.154823 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 27 18:14:33.154836 systemd[1]: Mounting media.mount - External Media Directory...
May 27 18:14:33.154848 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:33.154860 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 27 18:14:33.154873 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 27 18:14:33.154885 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 27 18:14:33.154902 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 27 18:14:33.154914 systemd[1]: Reached target machines.target - Containers.
May 27 18:14:33.154929 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 27 18:14:33.154944 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 18:14:33.154958 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 18:14:33.154970 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 27 18:14:33.154983 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 18:14:33.154995 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 18:14:33.155008 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 18:14:33.155023 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 27 18:14:33.155038 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 18:14:33.155059 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 27 18:14:33.155072 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 27 18:14:33.155085 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 27 18:14:33.155098 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 27 18:14:33.155111 systemd[1]: Stopped systemd-fsck-usr.service.
May 27 18:14:33.155140 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 18:14:33.155158 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 18:14:33.155184 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 18:14:33.155202 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 18:14:33.155220 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 27 18:14:33.155274 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 27 18:14:33.155292 kernel: loop: module loaded
May 27 18:14:33.155316 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 18:14:33.155333 systemd[1]: verity-setup.service: Deactivated successfully.
May 27 18:14:33.155351 systemd[1]: Stopped verity-setup.service.
May 27 18:14:33.155370 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:33.155389 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 27 18:14:33.155407 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 27 18:14:33.155427 systemd[1]: Mounted media.mount - External Media Directory.
May 27 18:14:33.155443 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 27 18:14:33.155456 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 27 18:14:33.155468 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 27 18:14:33.155482 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 18:14:33.155494 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 27 18:14:33.155507 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 27 18:14:33.155520 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 18:14:33.155533 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 18:14:33.155551 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 18:14:33.155565 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 18:14:33.155588 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 18:14:33.155611 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 18:14:33.155629 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 18:14:33.155648 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 27 18:14:33.155668 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 27 18:14:33.155741 systemd-journald[1174]: Collecting audit messages is disabled.
May 27 18:14:33.155782 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 27 18:14:33.155797 systemd-journald[1174]: Journal started
May 27 18:14:33.155832 systemd-journald[1174]: Runtime Journal (/run/log/journal/20ef69403ab5410498843b62749a2169) is 4.9M, max 39.5M, 34.6M free.
May 27 18:14:32.799529 systemd[1]: Queued start job for default target multi-user.target.
May 27 18:14:32.825966 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 27 18:14:32.826539 systemd[1]: systemd-journald.service: Deactivated successfully.
May 27 18:14:33.161811 systemd[1]: Reached target local-fs.target - Local File Systems.
May 27 18:14:33.167267 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 27 18:14:33.181273 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 27 18:14:33.185272 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 18:14:33.188296 kernel: fuse: init (API version 7.41)
May 27 18:14:33.191330 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 27 18:14:33.198280 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 18:14:33.213276 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 27 18:14:33.217104 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 18:14:33.230138 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 18:14:33.232281 kernel: ACPI: bus type drm_connector registered
May 27 18:14:33.236263 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 27 18:14:33.245636 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 18:14:33.253271 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 18:14:33.259890 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 18:14:33.261097 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 18:14:33.262817 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 27 18:14:33.263322 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 27 18:14:33.265799 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 18:14:33.268454 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 27 18:14:33.269609 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 27 18:14:33.336484 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 18:14:33.345890 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 27 18:14:33.357522 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 27 18:14:33.369827 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 27 18:14:33.383476 kernel: loop0: detected capacity change from 0 to 146240
May 27 18:14:33.400962 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 27 18:14:33.403068 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 27 18:14:33.406773 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 27 18:14:33.412501 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 27 18:14:33.416693 systemd-journald[1174]: Time spent on flushing to /var/log/journal/20ef69403ab5410498843b62749a2169 is 61.164ms for 1015 entries.
May 27 18:14:33.416693 systemd-journald[1174]: System Journal (/var/log/journal/20ef69403ab5410498843b62749a2169) is 8M, max 195.6M, 187.6M free.
May 27 18:14:33.491784 systemd-journald[1174]: Received client request to flush runtime journal.
May 27 18:14:33.491841 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 27 18:14:33.491863 kernel: loop1: detected capacity change from 0 to 113872
May 27 18:14:33.454933 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 18:14:33.486146 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 18:14:33.491986 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 27 18:14:33.495370 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 27 18:14:33.511270 kernel: loop2: detected capacity change from 0 to 8
May 27 18:14:33.515720 systemd-tmpfiles[1207]: ACLs are not supported, ignoring.
May 27 18:14:33.515739 systemd-tmpfiles[1207]: ACLs are not supported, ignoring.
May 27 18:14:33.525890 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 18:14:33.530365 kernel: loop3: detected capacity change from 0 to 229808
May 27 18:14:33.532476 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 27 18:14:33.606267 kernel: loop4: detected capacity change from 0 to 146240
May 27 18:14:33.631274 kernel: loop5: detected capacity change from 0 to 113872
May 27 18:14:33.644767 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 27 18:14:33.653131 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 18:14:33.655260 kernel: loop6: detected capacity change from 0 to 8
May 27 18:14:33.660275 kernel: loop7: detected capacity change from 0 to 229808
May 27 18:14:33.698662 (sd-merge)[1253]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-digitalocean'.
May 27 18:14:33.700999 (sd-merge)[1253]: Merged extensions into '/usr'.
May 27 18:14:33.709374 systemd[1]: Reload requested from client PID 1206 ('systemd-sysext') (unit systemd-sysext.service)...
May 27 18:14:33.709390 systemd[1]: Reloading...
May 27 18:14:33.741942 systemd-tmpfiles[1255]: ACLs are not supported, ignoring.
May 27 18:14:33.743902 systemd-tmpfiles[1255]: ACLs are not supported, ignoring.
May 27 18:14:33.899283 zram_generator::config[1283]: No configuration found.
May 27 18:14:34.095880 ldconfig[1198]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 27 18:14:34.119742 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 18:14:34.273380 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 27 18:14:34.273959 systemd[1]: Reloading finished in 564 ms.
May 27 18:14:34.294409 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 27 18:14:34.295513 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 27 18:14:34.296535 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 18:14:34.309350 systemd[1]: Starting ensure-sysext.service...
May 27 18:14:34.314494 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 18:14:34.357448 systemd[1]: Reload requested from client PID 1327 ('systemctl') (unit ensure-sysext.service)...
May 27 18:14:34.357468 systemd[1]: Reloading...
May 27 18:14:34.360529 systemd-tmpfiles[1328]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 27 18:14:34.360944 systemd-tmpfiles[1328]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 27 18:14:34.361372 systemd-tmpfiles[1328]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 27 18:14:34.361732 systemd-tmpfiles[1328]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 27 18:14:34.362801 systemd-tmpfiles[1328]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 27 18:14:34.363148 systemd-tmpfiles[1328]: ACLs are not supported, ignoring.
May 27 18:14:34.363303 systemd-tmpfiles[1328]: ACLs are not supported, ignoring.
May 27 18:14:34.368203 systemd-tmpfiles[1328]: Detected autofs mount point /boot during canonicalization of boot.
May 27 18:14:34.368431 systemd-tmpfiles[1328]: Skipping /boot
May 27 18:14:34.385556 systemd-tmpfiles[1328]: Detected autofs mount point /boot during canonicalization of boot.
May 27 18:14:34.385708 systemd-tmpfiles[1328]: Skipping /boot
May 27 18:14:34.485274 zram_generator::config[1355]: No configuration found.
May 27 18:14:34.662026 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 18:14:34.819395 systemd[1]: Reloading finished in 461 ms.
May 27 18:14:34.838877 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 27 18:14:34.839823 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 18:14:34.857488 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 18:14:34.861703 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 27 18:14:34.863733 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 27 18:14:34.869565 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 18:14:34.873567 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 18:14:34.875951 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 27 18:14:34.890382 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:34.890619 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 18:14:34.894812 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 18:14:34.902707 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 18:14:34.909545 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 18:14:34.910028 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 18:14:34.910157 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 18:14:34.910300 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:34.914957 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:34.915160 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 18:14:34.915389 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 18:14:34.915475 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 18:14:34.915563 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:34.920463 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:34.920711 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 18:14:34.926639 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 18:14:34.928719 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 18:14:34.928874 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 18:14:34.929043 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:34.932670 systemd[1]: Finished ensure-sysext.service.
May 27 18:14:34.941511 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 27 18:14:34.946489 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 27 18:14:34.959195 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 27 18:14:34.984345 systemd-udevd[1404]: Using default interface naming scheme 'v255'.
May 27 18:14:34.989340 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 27 18:14:34.994699 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 27 18:14:35.000547 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 27 18:14:35.004674 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 27 18:14:35.015212 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 18:14:35.015852 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 18:14:35.017486 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 18:14:35.017832 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 18:14:35.021714 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 18:14:35.026284 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 18:14:35.027027 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 18:14:35.027665 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 18:14:35.033113 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 18:14:35.034217 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 18:14:35.034828 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 18:14:35.035468 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 18:14:35.072622 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 27 18:14:35.087198 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 27 18:14:35.106836 augenrules[1463]: No rules
May 27 18:14:35.108657 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 18:14:35.111372 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 18:14:35.246378 systemd[1]: Condition check resulted in dev-disk-by\x2dlabel-config\x2d2.device - /dev/disk/by-label/config-2 being skipped.
May 27 18:14:35.258635 systemd[1]: Mounting media-configdrive.mount - /media/configdrive...
May 27 18:14:35.259378 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:35.259685 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 18:14:35.263563 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 18:14:35.267521 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 18:14:35.273165 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 18:14:35.274474 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 18:14:35.274530 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 18:14:35.274561 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 27 18:14:35.274577 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 18:14:35.301314 kernel: ISO 9660 Extensions: RRIP_1991A
May 27 18:14:35.305852 systemd[1]: Mounted media-configdrive.mount - /media/configdrive.
May 27 18:14:35.314874 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 18:14:35.319310 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 18:14:35.320147 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 18:14:35.320647 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 18:14:35.322688 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 18:14:35.322900 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 18:14:35.326508 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 18:14:35.326562 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 18:14:35.383488 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 27 18:14:35.384400 systemd[1]: Reached target time-set.target - System Time Set.
May 27 18:14:35.405291 systemd-networkd[1436]: lo: Link UP
May 27 18:14:35.405300 systemd-networkd[1436]: lo: Gained carrier
May 27 18:14:35.408155 systemd-networkd[1436]: Enumeration completed
May 27 18:14:35.408350 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 18:14:35.412518 systemd-networkd[1436]: eth1: Configuring with /run/systemd/network/10-16:32:bc:ba:85:04.network.
May 27 18:14:35.412957 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 27 18:14:35.415621 systemd-networkd[1436]: eth1: Link UP
May 27 18:14:35.415835 systemd-networkd[1436]: eth1: Gained carrier
May 27 18:14:35.418621 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 27 18:14:35.430261 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:35.445258 systemd-resolved[1403]: Positive Trust Anchors:
May 27 18:14:35.445273 systemd-resolved[1403]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 18:14:35.445312 systemd-resolved[1403]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 18:14:35.451971 systemd-resolved[1403]: Using system hostname 'ci-4344.0.0-6-bb492ec913'.
May 27 18:14:35.454047 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 18:14:35.454782 systemd[1]: Reached target network.target - Network.
May 27 18:14:35.455785 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 18:14:35.456322 systemd[1]: Reached target sysinit.target - System Initialization.
May 27 18:14:35.457411 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 27 18:14:35.457831 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 27 18:14:35.459270 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
May 27 18:14:35.459973 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 27 18:14:35.462597 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 27 18:14:35.463146 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 27 18:14:35.463668 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 27 18:14:35.463715 systemd[1]: Reached target paths.target - Path Units.
May 27 18:14:35.464162 systemd[1]: Reached target timers.target - Timer Units.
May 27 18:14:35.466738 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 27 18:14:35.470194 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 27 18:14:35.474611 systemd-networkd[1436]: eth0: Configuring with /run/systemd/network/10-d6:ef:e9:89:2f:06.network.
May 27 18:14:35.477699 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 27 18:14:35.479351 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:35.479689 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 27 18:14:35.480218 systemd-networkd[1436]: eth0: Link UP
May 27 18:14:35.480322 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 27 18:14:35.480833 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:35.483511 systemd-networkd[1436]: eth0: Gained carrier
May 27 18:14:35.489799 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:35.490070 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 27 18:14:35.491705 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 27 18:14:35.492395 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:35.494520 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 27 18:14:35.497725 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 27 18:14:35.500215 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 27 18:14:35.503142 systemd[1]: Reached target sockets.target - Socket Units.
May 27 18:14:35.504533 systemd[1]: Reached target basic.target - Basic System.
May 27 18:14:35.505302 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 27 18:14:35.505983 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 27 18:14:35.509471 systemd[1]: Starting containerd.service - containerd container runtime...
May 27 18:14:35.514600 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 27 18:14:35.519693 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 27 18:14:35.523703 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 27 18:14:35.529372 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 27 18:14:35.545272 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 27 18:14:35.547375 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 27 18:14:35.551521 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
May 27 18:14:35.555434 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 27 18:14:35.560832 jq[1511]: false
May 27 18:14:35.560476 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 27 18:14:35.567467 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 27 18:14:35.582882 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 27 18:14:35.593932 google_oslogin_nss_cache[1515]: oslogin_cache_refresh[1515]: Refreshing passwd entry cache
May 27 18:14:35.593937 oslogin_cache_refresh[1515]: Refreshing passwd entry cache
May 27 18:14:35.596562 systemd[1]: Starting systemd-logind.service - User Login Management...
May 27 18:14:35.598097 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 27 18:14:35.598756 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 27 18:14:35.606305 systemd[1]: Starting update-engine.service - Update Engine...
May 27 18:14:35.611441 coreos-metadata[1508]: May 27 18:14:35.607 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
May 27 18:14:35.612056 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 27 18:14:35.613459 oslogin_cache_refresh[1515]: Failure getting users, quitting
May 27 18:14:35.621165 google_oslogin_nss_cache[1515]: oslogin_cache_refresh[1515]: Failure getting users, quitting
May 27 18:14:35.621165 google_oslogin_nss_cache[1515]: oslogin_cache_refresh[1515]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 18:14:35.621165 google_oslogin_nss_cache[1515]: oslogin_cache_refresh[1515]: Refreshing group entry cache
May 27 18:14:35.621165 google_oslogin_nss_cache[1515]: oslogin_cache_refresh[1515]: Failure getting groups, quitting
May 27 18:14:35.621165 google_oslogin_nss_cache[1515]: oslogin_cache_refresh[1515]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 27 18:14:35.614832 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 27 18:14:35.613479 oslogin_cache_refresh[1515]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 18:14:35.615686 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 27 18:14:35.613538 oslogin_cache_refresh[1515]: Refreshing group entry cache
May 27 18:14:35.616395 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 27 18:14:35.615434 oslogin_cache_refresh[1515]: Failure getting groups, quitting
May 27 18:14:35.615453 oslogin_cache_refresh[1515]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 27 18:14:35.621939 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
May 27 18:14:35.626614 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
May 27 18:14:35.631914 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 27 18:14:35.636505 coreos-metadata[1508]: May 27 18:14:35.633 INFO Fetch successful
May 27 18:14:35.636579 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 27 18:14:35.637366 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 27 18:14:35.651189 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 27 18:14:35.677850 extend-filesystems[1514]: Found loop4
May 27 18:14:35.677850 extend-filesystems[1514]: Found loop5
May 27 18:14:35.677850 extend-filesystems[1514]: Found loop6
May 27 18:14:35.677850 extend-filesystems[1514]: Found loop7
May 27 18:14:35.677850 extend-filesystems[1514]: Found vda
May 27 18:14:35.703065 extend-filesystems[1514]: Found vda1
May 27 18:14:35.703065 extend-filesystems[1514]: Found vda2
May 27 18:14:35.703065 extend-filesystems[1514]: Found vda3
May 27 18:14:35.703065 extend-filesystems[1514]: Found usr
May 27 18:14:35.703065 extend-filesystems[1514]: Found vda4
May 27 18:14:35.703065 extend-filesystems[1514]: Found vda6
May 27 18:14:35.703065 extend-filesystems[1514]: Found vda7
May 27 18:14:35.703065 extend-filesystems[1514]: Found vda9
May 27 18:14:35.703065 extend-filesystems[1514]: Checking size of /dev/vda9
May 27 18:14:35.690697 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 27 18:14:35.687705 dbus-daemon[1509]: [system] SELinux support is enabled
May 27 18:14:35.744499 jq[1524]: true
May 27 18:14:35.711970 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 27 18:14:35.712034 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 27 18:14:35.713063 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 27 18:14:35.713141 systemd[1]: user-configdrive.service - Load cloud-config from /media/configdrive was skipped because of an unmet condition check (ConditionKernelCommandLine=!flatcar.oem.id=digitalocean).
May 27 18:14:35.713160 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 27 18:14:35.719312 systemd[1]: motdgen.service: Deactivated successfully.
May 27 18:14:35.725504 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 27 18:14:35.756754 tar[1526]: linux-amd64/LICENSE
May 27 18:14:35.756754 tar[1526]: linux-amd64/helm
May 27 18:14:35.755525 (ntainerd)[1543]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 27 18:14:35.785554 update_engine[1522]: I20250527 18:14:35.773232 1522 main.cc:92] Flatcar Update Engine starting
May 27 18:14:35.787537 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 27 18:14:35.796569 systemd[1]: Started update-engine.service - Update Engine.
May 27 18:14:35.799785 update_engine[1522]: I20250527 18:14:35.798069 1522 update_check_scheduler.cc:74] Next update check in 4m31s
May 27 18:14:35.799828 jq[1546]: true
May 27 18:14:35.819115 extend-filesystems[1514]: Resized partition /dev/vda9
May 27 18:14:35.823324 extend-filesystems[1559]: resize2fs 1.47.2 (1-Jan-2025)
May 27 18:14:35.828669 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 27 18:14:35.831260 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 15121403 blocks
May 27 18:14:35.837806 systemd-logind[1520]: New seat seat0.
May 27 18:14:35.847761 systemd[1]: Started systemd-logind.service - User Login Management.
May 27 18:14:35.848654 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
May 27 18:14:35.853036 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 27 18:14:35.962420 kernel: EXT4-fs (vda9): resized filesystem to 15121403
May 27 18:14:35.984090 extend-filesystems[1559]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
May 27 18:14:35.984090 extend-filesystems[1559]: old_desc_blocks = 1, new_desc_blocks = 8
May 27 18:14:35.984090 extend-filesystems[1559]: The filesystem on /dev/vda9 is now 15121403 (4k) blocks long.
May 27 18:14:36.001606 extend-filesystems[1514]: Resized filesystem in /dev/vda9
May 27 18:14:36.001606 extend-filesystems[1514]: Found vdb
May 27 18:14:35.991166 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 27 18:14:35.993551 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 27 18:14:36.020276 bash[1576]: Updated "/home/core/.ssh/authorized_keys"
May 27 18:14:36.019395 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 27 18:14:36.032391 systemd[1]: Starting sshkeys.service...
May 27 18:14:36.070261 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
May 27 18:14:36.075788 kernel: ACPI: button: Power Button [PWRF]
May 27 18:14:36.131882 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
May 27 18:14:36.138867 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
May 27 18:14:36.152278 kernel: mousedev: PS/2 mouse device common for all mice
May 27 18:14:36.159861 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
May 27 18:14:36.160228 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
May 27 18:14:36.239301 sshd_keygen[1549]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 27 18:14:36.258245 locksmithd[1558]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 27 18:14:36.277387 coreos-metadata[1585]: May 27 18:14:36.277 INFO Fetching http://169.254.169.254/metadata/v1.json: Attempt #1
May 27 18:14:36.295693 coreos-metadata[1585]: May 27 18:14:36.294 INFO Fetch successful
May 27 18:14:36.318013 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 27 18:14:36.320453 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
May 27 18:14:36.321194 unknown[1585]: wrote ssh authorized keys file for user: core
May 27 18:14:36.322397 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
May 27 18:14:36.326911 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 27 18:14:36.335281 kernel: Console: switching to colour dummy device 80x25
May 27 18:14:36.339274 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
May 27 18:14:36.339355 kernel: [drm] features: -context_init
May 27 18:14:36.344272 kernel: [drm] number of scanouts: 1
May 27 18:14:36.344360 kernel: [drm] number of cap sets: 0
May 27 18:14:36.380021 update-ssh-keys[1602]: Updated "/home/core/.ssh/authorized_keys"
May 27 18:14:36.382342 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
May 27 18:14:36.385820 systemd[1]: Finished sshkeys.service.
May 27 18:14:36.390601 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
May 27 18:14:36.402364 systemd[1]: issuegen.service: Deactivated successfully.
May 27 18:14:36.402730 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 27 18:14:36.408804 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 27 18:14:36.467790 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 27 18:14:36.470524 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 27 18:14:36.475804 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 27 18:14:36.476620 systemd[1]: Reached target getty.target - Login Prompts.
May 27 18:14:36.489952 containerd[1543]: time="2025-05-27T18:14:36Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 27 18:14:36.492856 containerd[1543]: time="2025-05-27T18:14:36.492803211Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 27 18:14:36.520323 containerd[1543]: time="2025-05-27T18:14:36.520063969Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.872µs"
May 27 18:14:36.522727 containerd[1543]: time="2025-05-27T18:14:36.522043854Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 27 18:14:36.522727 containerd[1543]: time="2025-05-27T18:14:36.522131821Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 27 18:14:36.523901 containerd[1543]: time="2025-05-27T18:14:36.523667414Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 27 18:14:36.523901 containerd[1543]: time="2025-05-27T18:14:36.523727092Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 27 18:14:36.523901 containerd[1543]: time="2025-05-27T18:14:36.523763665Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 18:14:36.528091 containerd[1543]: time="2025-05-27T18:14:36.524677511Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 18:14:36.528091 containerd[1543]: time="2025-05-27T18:14:36.526321419Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 18:14:36.528091 containerd[1543]: time="2025-05-27T18:14:36.527602868Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 18:14:36.528091 containerd[1543]: time="2025-05-27T18:14:36.527661490Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 18:14:36.528091 containerd[1543]: time="2025-05-27T18:14:36.527679369Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 18:14:36.528091 containerd[1543]: time="2025-05-27T18:14:36.527689393Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 27 18:14:36.529543 containerd[1543]: time="2025-05-27T18:14:36.529230979Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 27 18:14:36.530259 containerd[1543]: time="2025-05-27T18:14:36.530171061Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 18:14:36.530371 containerd[1543]: time="2025-05-27T18:14:36.530348750Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 18:14:36.530448 containerd[1543]: time="2025-05-27T18:14:36.530435463Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 27 18:14:36.530568 containerd[1543]: time="2025-05-27T18:14:36.530552038Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 27 18:14:36.530901 containerd[1543]: time="2025-05-27T18:14:36.530883061Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 27 18:14:36.531605 containerd[1543]: time="2025-05-27T18:14:36.531576231Z" level=info msg="metadata content store policy set" policy=shared
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539642580Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539740544Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539770570Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539786801Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539799549Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539816793Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539832947Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539849259Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539889465Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539909925Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539920387Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.539952561Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.540109005Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 27 18:14:36.540272 containerd[1543]: time="2025-05-27T18:14:36.540130166Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 27 18:14:36.540657 containerd[1543]: time="2025-05-27T18:14:36.540144538Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 27 18:14:36.540657 containerd[1543]: time="2025-05-27T18:14:36.540156326Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 27 18:14:36.540657 containerd[1543]: time="2025-05-27T18:14:36.540167811Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 27 18:14:36.540657 containerd[1543]: time="2025-05-27T18:14:36.540180373Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 27 18:14:36.540657 containerd[1543]: time="2025-05-27T18:14:36.540205934Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 27 18:14:36.540657 containerd[1543]: time="2025-05-27T18:14:36.540218437Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 27 18:14:36.540657 containerd[1543]: time="2025-05-27T18:14:36.540230181Z" level=info msg="loading plugin"
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 18:14:36.540939 containerd[1543]: time="2025-05-27T18:14:36.540906594Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 18:14:36.540991 containerd[1543]: time="2025-05-27T18:14:36.540943769Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 18:14:36.541045 containerd[1543]: time="2025-05-27T18:14:36.541029323Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 18:14:36.541079 containerd[1543]: time="2025-05-27T18:14:36.541046765Z" level=info msg="Start snapshots syncer" May 27 18:14:36.541108 containerd[1543]: time="2025-05-27T18:14:36.541075558Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 18:14:36.542464 containerd[1543]: time="2025-05-27T18:14:36.542407357Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 18:14:36.542812 containerd[1543]: time="2025-05-27T18:14:36.542491388Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 18:14:36.542812 containerd[1543]: time="2025-05-27T18:14:36.542609585Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 18:14:36.542812 containerd[1543]: time="2025-05-27T18:14:36.542806197Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 18:14:36.542939 containerd[1543]: time="2025-05-27T18:14:36.542835398Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 18:14:36.542939 containerd[1543]: time="2025-05-27T18:14:36.542851500Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 18:14:36.542939 containerd[1543]: time="2025-05-27T18:14:36.542865028Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 18:14:36.542939 containerd[1543]: time="2025-05-27T18:14:36.542881443Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 18:14:36.542939 containerd[1543]: time="2025-05-27T18:14:36.542892323Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 18:14:36.542939 containerd[1543]: time="2025-05-27T18:14:36.542906064Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.542940863Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.542955699Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.542966156Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.542991344Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543005420Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543014041Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543023068Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543031291Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543041159Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543051068Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543069083Z" level=info msg="runtime interface created" May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543074356Z" level=info msg="created NRI interface" May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543082171Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543097564Z" level=info msg="Connect containerd service" May 27 18:14:36.543140 containerd[1543]: time="2025-05-27T18:14:36.543133608Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 18:14:36.546946 
containerd[1543]: time="2025-05-27T18:14:36.546893180Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 18:14:36.855431 containerd[1543]: time="2025-05-27T18:14:36.855024903Z" level=info msg="Start subscribing containerd event" May 27 18:14:36.855431 containerd[1543]: time="2025-05-27T18:14:36.855109378Z" level=info msg="Start recovering state" May 27 18:14:36.855431 containerd[1543]: time="2025-05-27T18:14:36.855143567Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 18:14:36.855431 containerd[1543]: time="2025-05-27T18:14:36.855201587Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 18:14:36.856609 containerd[1543]: time="2025-05-27T18:14:36.855738585Z" level=info msg="Start event monitor" May 27 18:14:36.856609 containerd[1543]: time="2025-05-27T18:14:36.855801980Z" level=info msg="Start cni network conf syncer for default" May 27 18:14:36.856609 containerd[1543]: time="2025-05-27T18:14:36.855818124Z" level=info msg="Start streaming server" May 27 18:14:36.857695 containerd[1543]: time="2025-05-27T18:14:36.855834376Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 18:14:36.857695 containerd[1543]: time="2025-05-27T18:14:36.857047843Z" level=info msg="runtime interface starting up..." May 27 18:14:36.857695 containerd[1543]: time="2025-05-27T18:14:36.857059022Z" level=info msg="starting plugins..." May 27 18:14:36.857695 containerd[1543]: time="2025-05-27T18:14:36.857093376Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 18:14:36.858981 containerd[1543]: time="2025-05-27T18:14:36.858751210Z" level=info msg="containerd successfully booted in 0.369256s" May 27 18:14:36.857844 systemd[1]: Started containerd.service - containerd container runtime. 
May 27 18:14:36.884061 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 18:14:36.899330 systemd-networkd[1436]: eth1: Gained IPv6LL
May 27 18:14:36.900138 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:36.904713 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 27 18:14:36.906889 systemd[1]: Reached target network-online.target - Network is Online.
May 27 18:14:36.910727 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:14:36.912846 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 27 18:14:37.025944 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 27 18:14:37.043888 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 18:14:37.044127 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 18:14:37.048224 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 18:14:37.054529 systemd-logind[1520]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 27 18:14:37.055375 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 18:14:37.068418 systemd-logind[1520]: Watching system buttons on /dev/input/event2 (Power Button)
May 27 18:14:37.090344 systemd-networkd[1436]: eth0: Gained IPv6LL
May 27 18:14:37.090911 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:37.251279 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 18:14:37.406272 kernel: EDAC MC: Ver: 3.0.0
May 27 18:14:37.495335 tar[1526]: linux-amd64/README.md
May 27 18:14:37.521600 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 27 18:14:38.216155 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:14:38.217153 systemd[1]: Reached target multi-user.target - Multi-User System.
May 27 18:14:38.218837 systemd[1]: Startup finished in 3.410s (kernel) + 6.363s (initrd) + 6.204s (userspace) = 15.977s.
May 27 18:14:38.229304 (kubelet)[1678]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 18:14:38.916421 kubelet[1678]: E0527 18:14:38.916348 1678 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 18:14:38.919763 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 18:14:38.919919 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 18:14:38.920419 systemd[1]: kubelet.service: Consumed 1.333s CPU time, 265.9M memory peak.
May 27 18:14:39.825121 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 27 18:14:39.827291 systemd[1]: Started sshd@0-146.190.127.126:22-139.178.68.195:40048.service - OpenSSH per-connection server daemon (139.178.68.195:40048).
May 27 18:14:39.935569 sshd[1689]: Accepted publickey for core from 139.178.68.195 port 40048 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:14:39.938149 sshd-session[1689]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:14:39.946317 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 27 18:14:39.947482 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 27 18:14:39.959394 systemd-logind[1520]: New session 1 of user core.
May 27 18:14:39.978103 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 27 18:14:39.984116 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 27 18:14:40.000212 (systemd)[1693]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 27 18:14:40.004285 systemd-logind[1520]: New session c1 of user core.
May 27 18:14:40.241833 systemd[1693]: Queued start job for default target default.target.
May 27 18:14:40.254341 systemd[1693]: Created slice app.slice - User Application Slice.
May 27 18:14:40.254402 systemd[1693]: Reached target paths.target - Paths.
May 27 18:14:40.254479 systemd[1693]: Reached target timers.target - Timers.
May 27 18:14:40.256626 systemd[1693]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 27 18:14:40.289282 systemd[1693]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 27 18:14:40.289597 systemd[1693]: Reached target sockets.target - Sockets.
May 27 18:14:40.289678 systemd[1693]: Reached target basic.target - Basic System.
May 27 18:14:40.289739 systemd[1693]: Reached target default.target - Main User Target.
May 27 18:14:40.289782 systemd[1693]: Startup finished in 275ms.
May 27 18:14:40.289828 systemd[1]: Started user@500.service - User Manager for UID 500.
May 27 18:14:40.298556 systemd[1]: Started session-1.scope - Session 1 of User core.
May 27 18:14:40.366574 systemd[1]: Started sshd@1-146.190.127.126:22-139.178.68.195:40054.service - OpenSSH per-connection server daemon (139.178.68.195:40054).
May 27 18:14:40.427816 sshd[1704]: Accepted publickey for core from 139.178.68.195 port 40054 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:14:40.429849 sshd-session[1704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:14:40.436209 systemd-logind[1520]: New session 2 of user core.
May 27 18:14:40.442538 systemd[1]: Started session-2.scope - Session 2 of User core.
May 27 18:14:40.507521 sshd[1706]: Connection closed by 139.178.68.195 port 40054
May 27 18:14:40.508052 sshd-session[1704]: pam_unix(sshd:session): session closed for user core
May 27 18:14:40.522778 systemd[1]: sshd@1-146.190.127.126:22-139.178.68.195:40054.service: Deactivated successfully.
May 27 18:14:40.525083 systemd[1]: session-2.scope: Deactivated successfully.
May 27 18:14:40.526083 systemd-logind[1520]: Session 2 logged out. Waiting for processes to exit.
May 27 18:14:40.530412 systemd[1]: Started sshd@2-146.190.127.126:22-139.178.68.195:40070.service - OpenSSH per-connection server daemon (139.178.68.195:40070).
May 27 18:14:40.531516 systemd-logind[1520]: Removed session 2.
May 27 18:14:40.591621 sshd[1712]: Accepted publickey for core from 139.178.68.195 port 40070 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:14:40.593426 sshd-session[1712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:14:40.598943 systemd-logind[1520]: New session 3 of user core.
May 27 18:14:40.609562 systemd[1]: Started session-3.scope - Session 3 of User core.
May 27 18:14:40.665728 sshd[1714]: Connection closed by 139.178.68.195 port 40070
May 27 18:14:40.666490 sshd-session[1712]: pam_unix(sshd:session): session closed for user core
May 27 18:14:40.683719 systemd[1]: sshd@2-146.190.127.126:22-139.178.68.195:40070.service: Deactivated successfully.
May 27 18:14:40.685862 systemd[1]: session-3.scope: Deactivated successfully.
May 27 18:14:40.687684 systemd-logind[1520]: Session 3 logged out. Waiting for processes to exit.
May 27 18:14:40.690385 systemd[1]: Started sshd@3-146.190.127.126:22-139.178.68.195:40086.service - OpenSSH per-connection server daemon (139.178.68.195:40086).
May 27 18:14:40.692342 systemd-logind[1520]: Removed session 3.
May 27 18:14:40.750025 sshd[1720]: Accepted publickey for core from 139.178.68.195 port 40086 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:14:40.751687 sshd-session[1720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:14:40.758496 systemd-logind[1520]: New session 4 of user core.
May 27 18:14:40.763591 systemd[1]: Started session-4.scope - Session 4 of User core.
May 27 18:14:40.826874 sshd[1722]: Connection closed by 139.178.68.195 port 40086
May 27 18:14:40.827458 sshd-session[1720]: pam_unix(sshd:session): session closed for user core
May 27 18:14:40.838602 systemd[1]: sshd@3-146.190.127.126:22-139.178.68.195:40086.service: Deactivated successfully.
May 27 18:14:40.840841 systemd[1]: session-4.scope: Deactivated successfully.
May 27 18:14:40.841846 systemd-logind[1520]: Session 4 logged out. Waiting for processes to exit.
May 27 18:14:40.846975 systemd[1]: Started sshd@4-146.190.127.126:22-139.178.68.195:40098.service - OpenSSH per-connection server daemon (139.178.68.195:40098).
May 27 18:14:40.848536 systemd-logind[1520]: Removed session 4.
May 27 18:14:40.907940 sshd[1728]: Accepted publickey for core from 139.178.68.195 port 40098 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:14:40.910288 sshd-session[1728]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:14:40.917208 systemd-logind[1520]: New session 5 of user core.
May 27 18:14:40.928561 systemd[1]: Started session-5.scope - Session 5 of User core.
May 27 18:14:40.998909 sudo[1731]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 27 18:14:40.999341 sudo[1731]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 18:14:41.016062 sudo[1731]: pam_unix(sudo:session): session closed for user root
May 27 18:14:41.021124 sshd[1730]: Connection closed by 139.178.68.195 port 40098
May 27 18:14:41.021933 sshd-session[1728]: pam_unix(sshd:session): session closed for user core
May 27 18:14:41.032164 systemd[1]: sshd@4-146.190.127.126:22-139.178.68.195:40098.service: Deactivated successfully.
May 27 18:14:41.034815 systemd[1]: session-5.scope: Deactivated successfully.
May 27 18:14:41.035689 systemd-logind[1520]: Session 5 logged out. Waiting for processes to exit.
May 27 18:14:41.039957 systemd[1]: Started sshd@5-146.190.127.126:22-139.178.68.195:40108.service - OpenSSH per-connection server daemon (139.178.68.195:40108).
May 27 18:14:41.041305 systemd-logind[1520]: Removed session 5.
May 27 18:14:41.101662 sshd[1737]: Accepted publickey for core from 139.178.68.195 port 40108 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:14:41.103402 sshd-session[1737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:14:41.110488 systemd-logind[1520]: New session 6 of user core.
May 27 18:14:41.124649 systemd[1]: Started session-6.scope - Session 6 of User core.
May 27 18:14:41.184874 sudo[1741]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 27 18:14:41.185745 sudo[1741]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 18:14:41.212318 sudo[1741]: pam_unix(sudo:session): session closed for user root
May 27 18:14:41.220688 sudo[1740]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 27 18:14:41.221126 sudo[1740]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 18:14:41.238666 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 18:14:41.299331 augenrules[1763]: No rules
May 27 18:14:41.300454 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 18:14:41.300745 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 18:14:41.302091 sudo[1740]: pam_unix(sudo:session): session closed for user root
May 27 18:14:41.306159 sshd[1739]: Connection closed by 139.178.68.195 port 40108
May 27 18:14:41.307461 sshd-session[1737]: pam_unix(sshd:session): session closed for user core
May 27 18:14:41.320026 systemd[1]: sshd@5-146.190.127.126:22-139.178.68.195:40108.service: Deactivated successfully.
May 27 18:14:41.322701 systemd[1]: session-6.scope: Deactivated successfully.
May 27 18:14:41.324003 systemd-logind[1520]: Session 6 logged out. Waiting for processes to exit.
May 27 18:14:41.328574 systemd[1]: Started sshd@6-146.190.127.126:22-139.178.68.195:40112.service - OpenSSH per-connection server daemon (139.178.68.195:40112).
May 27 18:14:41.329894 systemd-logind[1520]: Removed session 6.
May 27 18:14:41.385204 sshd[1772]: Accepted publickey for core from 139.178.68.195 port 40112 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:14:41.387056 sshd-session[1772]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:14:41.392997 systemd-logind[1520]: New session 7 of user core.
May 27 18:14:41.400543 systemd[1]: Started session-7.scope - Session 7 of User core.
May 27 18:14:41.461583 sudo[1775]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 27 18:14:41.462363 sudo[1775]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 18:14:41.950787 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 27 18:14:41.978821 (dockerd)[1794]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 27 18:14:42.319067 dockerd[1794]: time="2025-05-27T18:14:42.318441087Z" level=info msg="Starting up"
May 27 18:14:42.320202 dockerd[1794]: time="2025-05-27T18:14:42.320173269Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 27 18:14:42.355166 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1702969233-merged.mount: Deactivated successfully.
May 27 18:14:42.378996 dockerd[1794]: time="2025-05-27T18:14:42.378924468Z" level=info msg="Loading containers: start."
May 27 18:14:42.391321 kernel: Initializing XFRM netlink socket
May 27 18:14:42.631481 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:42.633626 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:42.646076 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:42.682409 systemd-networkd[1436]: docker0: Link UP
May 27 18:14:42.682738 systemd-timesyncd[1417]: Network configuration changed, trying to establish connection.
May 27 18:14:42.685532 dockerd[1794]: time="2025-05-27T18:14:42.685480577Z" level=info msg="Loading containers: done."
May 27 18:14:42.704260 dockerd[1794]: time="2025-05-27T18:14:42.702969300Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 27 18:14:42.704260 dockerd[1794]: time="2025-05-27T18:14:42.703383439Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
May 27 18:14:42.704260 dockerd[1794]: time="2025-05-27T18:14:42.703519360Z" level=info msg="Initializing buildkit"
May 27 18:14:42.708147 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck512479176-merged.mount: Deactivated successfully.
May 27 18:14:42.729231 dockerd[1794]: time="2025-05-27T18:14:42.729129057Z" level=info msg="Completed buildkit initialization"
May 27 18:14:42.738502 dockerd[1794]: time="2025-05-27T18:14:42.738442164Z" level=info msg="Daemon has completed initialization"
May 27 18:14:42.738502 dockerd[1794]: time="2025-05-27T18:14:42.738544950Z" level=info msg="API listen on /run/docker.sock"
May 27 18:14:42.739198 systemd[1]: Started docker.service - Docker Application Container Engine.
May 27 18:14:43.540837 containerd[1543]: time="2025-05-27T18:14:43.540677467Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\""
May 27 18:14:44.108653 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount255023584.mount: Deactivated successfully.
May 27 18:14:45.427751 containerd[1543]: time="2025-05-27T18:14:45.427679379Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:45.429382 containerd[1543]: time="2025-05-27T18:14:45.428613353Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.1: active requests=0, bytes read=30075403"
May 27 18:14:45.429716 containerd[1543]: time="2025-05-27T18:14:45.429685740Z" level=info msg="ImageCreate event name:\"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:45.432437 containerd[1543]: time="2025-05-27T18:14:45.432394636Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:45.433745 containerd[1543]: time="2025-05-27T18:14:45.433691556Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.1\" with image id \"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\", size \"30072203\" in 1.892754261s"
May 27 18:14:45.433916 containerd[1543]: time="2025-05-27T18:14:45.433897112Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\" returns image reference \"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\""
May 27 18:14:45.434602 containerd[1543]: time="2025-05-27T18:14:45.434579521Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\""
May 27 18:14:46.992786 containerd[1543]: time="2025-05-27T18:14:46.992656220Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:46.994168 containerd[1543]: time="2025-05-27T18:14:46.994008829Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.1: active requests=0, bytes read=26011390"
May 27 18:14:46.995177 containerd[1543]: time="2025-05-27T18:14:46.995094782Z" level=info msg="ImageCreate event name:\"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:46.998756 containerd[1543]: time="2025-05-27T18:14:46.998647523Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:47.001143 containerd[1543]: time="2025-05-27T18:14:47.001021308Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.1\" with image id \"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\", size \"27638910\" in 1.566167509s"
May 27 18:14:47.001143 containerd[1543]: time="2025-05-27T18:14:47.001082866Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\" returns image reference \"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\""
May 27 18:14:47.002202 containerd[1543]: time="2025-05-27T18:14:47.002100940Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\""
May 27 18:14:48.489518 containerd[1543]: time="2025-05-27T18:14:48.489445536Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:48.490886 containerd[1543]: time="2025-05-27T18:14:48.490836108Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.1: active requests=0, bytes read=20148960"
May 27 18:14:48.493270 containerd[1543]: time="2025-05-27T18:14:48.491594595Z" level=info msg="ImageCreate event name:\"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:48.495151 containerd[1543]: time="2025-05-27T18:14:48.495107326Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:48.496505 containerd[1543]: time="2025-05-27T18:14:48.496462972Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.1\" with image id \"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\", size \"21776498\" in 1.49431238s"
May 27 18:14:48.496656 containerd[1543]: time="2025-05-27T18:14:48.496641705Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\" returns image reference \"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\""
May 27 18:14:48.497790 containerd[1543]: time="2025-05-27T18:14:48.497750963Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\""
May 27 18:14:49.139668 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 27 18:14:49.141947 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:14:49.353402 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:14:49.365171 (kubelet)[2078]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 18:14:49.449966 kubelet[2078]: E0527 18:14:49.449539 2078 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 18:14:49.457120 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 18:14:49.457284 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 18:14:49.457715 systemd[1]: kubelet.service: Consumed 229ms CPU time, 108.7M memory peak.
May 27 18:14:49.736108 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1779929264.mount: Deactivated successfully.
May 27 18:14:50.330700 containerd[1543]: time="2025-05-27T18:14:50.330600529Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:50.331975 containerd[1543]: time="2025-05-27T18:14:50.331802145Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.1: active requests=0, bytes read=31889075"
May 27 18:14:50.332823 containerd[1543]: time="2025-05-27T18:14:50.332775895Z" level=info msg="ImageCreate event name:\"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:50.335280 containerd[1543]: time="2025-05-27T18:14:50.334867741Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:50.335814 containerd[1543]: time="2025-05-27T18:14:50.335620680Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.1\" with image id \"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\", repo tag \"registry.k8s.io/kube-proxy:v1.33.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\", size \"31888094\" in 1.837698826s"
May 27 18:14:50.335814 containerd[1543]: time="2025-05-27T18:14:50.335658012Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\" returns image reference \"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\""
May 27 18:14:50.336183 containerd[1543]: time="2025-05-27T18:14:50.336157113Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
May 27 18:14:50.337732 systemd-resolved[1403]: Using degraded feature set UDP instead of UDP+EDNS0 for DNS server 67.207.67.3.
May 27 18:14:50.850093 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount671594837.mount: Deactivated successfully.
May 27 18:14:51.757450 containerd[1543]: time="2025-05-27T18:14:51.757388170Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:51.758309 containerd[1543]: time="2025-05-27T18:14:51.758266063Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238"
May 27 18:14:51.759160 containerd[1543]: time="2025-05-27T18:14:51.759090473Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:51.762286 containerd[1543]: time="2025-05-27T18:14:51.762022847Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:51.764045 containerd[1543]: time="2025-05-27T18:14:51.763505472Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.427222565s"
May 27 18:14:51.764045 containerd[1543]: time="2025-05-27T18:14:51.763565191Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
May 27 18:14:51.764554 containerd[1543]: time="2025-05-27T18:14:51.764516899Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 27 18:14:52.245641 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount390211736.mount: Deactivated successfully.
May 27 18:14:52.251168 containerd[1543]: time="2025-05-27T18:14:52.251095260Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 18:14:52.252355 containerd[1543]: time="2025-05-27T18:14:52.252274027Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
May 27 18:14:52.254727 containerd[1543]: time="2025-05-27T18:14:52.253131978Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 18:14:52.255899 containerd[1543]: time="2025-05-27T18:14:52.255841218Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 18:14:52.259926 containerd[1543]: time="2025-05-27T18:14:52.259861544Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 495.252799ms"
May 27 18:14:52.259926 containerd[1543]: time="2025-05-27T18:14:52.259932476Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
May 27 18:14:52.260988 containerd[1543]: time="2025-05-27T18:14:52.260922246Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
May 27 18:14:53.409497 systemd-resolved[1403]: Using degraded feature set UDP instead of UDP+EDNS0 for DNS server 67.207.67.2.
May 27 18:14:54.611268 containerd[1543]: time="2025-05-27T18:14:54.610817748Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:54.613333 containerd[1543]: time="2025-05-27T18:14:54.613277082Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58142739"
May 27 18:14:54.615692 containerd[1543]: time="2025-05-27T18:14:54.615598726Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:54.620310 containerd[1543]: time="2025-05-27T18:14:54.620026890Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:14:54.622203 containerd[1543]: time="2025-05-27T18:14:54.621967845Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.360998224s"
May 27 18:14:54.622203 containerd[1543]: time="2025-05-27T18:14:54.622035921Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\""
May 27 18:14:59.708020 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 27 18:14:59.712297 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:14:59.923506 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:14:59.939815 (kubelet)[2185]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 18:15:00.011813 kubelet[2185]: E0527 18:15:00.011658 2185 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 18:15:00.017677 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 18:15:00.017897 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 18:15:00.019142 systemd[1]: kubelet.service: Consumed 206ms CPU time, 107.6M memory peak.
May 27 18:15:00.195766 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:15:00.196025 systemd[1]: kubelet.service: Consumed 206ms CPU time, 107.6M memory peak.
May 27 18:15:00.199963 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:15:00.238973 systemd[1]: Reload requested from client PID 2199 ('systemctl') (unit session-7.scope)...
May 27 18:15:00.238992 systemd[1]: Reloading...
May 27 18:15:00.390298 zram_generator::config[2245]: No configuration found.
May 27 18:15:00.526708 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 18:15:00.714207 systemd[1]: Reloading finished in 474 ms.
May 27 18:15:00.791583 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 27 18:15:00.791736 systemd[1]: kubelet.service: Failed with result 'signal'.
May 27 18:15:00.792163 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:15:00.792262 systemd[1]: kubelet.service: Consumed 136ms CPU time, 98.3M memory peak.
May 27 18:15:00.795111 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 18:15:01.028641 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 18:15:01.049392 (kubelet)[2296]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 27 18:15:01.138148 kubelet[2296]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 18:15:01.138148 kubelet[2296]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 27 18:15:01.138148 kubelet[2296]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 27 18:15:01.138148 kubelet[2296]: I0527 18:15:01.136311 2296 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 27 18:15:01.751278 kubelet[2296]: I0527 18:15:01.750525 2296 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
May 27 18:15:01.751278 kubelet[2296]: I0527 18:15:01.750578 2296 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 27 18:15:01.752525 kubelet[2296]: I0527 18:15:01.752405 2296 server.go:956] "Client rotation is on, will bootstrap in background"
May 27 18:15:01.818797 kubelet[2296]: E0527 18:15:01.818720 2296 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://146.190.127.126:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
May 27 18:15:01.821267 kubelet[2296]: I0527 18:15:01.821166 2296 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 27 18:15:01.835747 kubelet[2296]: I0527 18:15:01.835672 2296 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 27 18:15:01.846469 kubelet[2296]: I0527 18:15:01.846284 2296 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 27 18:15:01.848674 kubelet[2296]: I0527 18:15:01.848180 2296 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 27 18:15:01.852949 kubelet[2296]: I0527 18:15:01.848305 2296 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-6-bb492ec913","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 27 18:15:01.853717 kubelet[2296]: I0527 18:15:01.853378 2296 topology_manager.go:138] "Creating topology manager with none policy"
May 27 18:15:01.853717 kubelet[2296]: I0527 18:15:01.853415 2296 container_manager_linux.go:303] "Creating device plugin manager"
May 27 18:15:01.853717 kubelet[2296]: I0527 18:15:01.853626 2296 state_mem.go:36] "Initialized new in-memory state store"
May 27 18:15:01.859210 kubelet[2296]: I0527 18:15:01.859152 2296 kubelet.go:480] "Attempting to sync node with API server"
May 27 18:15:01.859638 kubelet[2296]: I0527 18:15:01.859452 2296 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
May 27 18:15:01.859638 kubelet[2296]: I0527 18:15:01.859505 2296 kubelet.go:386] "Adding apiserver pod source"
May 27 18:15:01.859638 kubelet[2296]: I0527 18:15:01.859528 2296 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 27 18:15:01.870351 kubelet[2296]: E0527 18:15:01.869693 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://146.190.127.126:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-6-bb492ec913&limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
May 27 18:15:01.875797 kubelet[2296]: E0527 18:15:01.875570 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://146.190.127.126:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 27 18:15:01.876022 kubelet[2296]: I0527 18:15:01.875852 2296 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 27 18:15:01.877137 kubelet[2296]: I0527 18:15:01.877086 2296 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 27 18:15:01.878318 kubelet[2296]: W0527 18:15:01.878279 2296 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 27 18:15:01.885142 kubelet[2296]: I0527 18:15:01.885100 2296 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 27 18:15:01.885308 kubelet[2296]: I0527 18:15:01.885184 2296 server.go:1289] "Started kubelet"
May 27 18:15:01.892289 kubelet[2296]: I0527 18:15:01.891835 2296 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 27 18:15:01.896859 kubelet[2296]: E0527 18:15:01.893695 2296 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://146.190.127.126:6443/api/v1/namespaces/default/events\": dial tcp 146.190.127.126:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344.0.0-6-bb492ec913.184374fb08eb4a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344.0.0-6-bb492ec913,UID:ci-4344.0.0-6-bb492ec913,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344.0.0-6-bb492ec913,},FirstTimestamp:2025-05-27 18:15:01.885135457 +0000 UTC m=+0.828174202,LastTimestamp:2025-05-27 18:15:01.885135457 +0000 UTC m=+0.828174202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344.0.0-6-bb492ec913,}"
May 27 18:15:01.915386 kubelet[2296]: I0527 18:15:01.901796 2296 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 27 18:15:01.917505 kubelet[2296]: I0527 18:15:01.917451 2296 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 27 18:15:01.927316 kubelet[2296]: I0527 18:15:01.901868 2296 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 27 18:15:01.927316 kubelet[2296]: I0527 18:15:01.926845 2296 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 27 18:15:01.931921 kubelet[2296]: I0527 18:15:01.931520 2296 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 27 18:15:01.933450 kubelet[2296]: E0527 18:15:01.931961 2296 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-6-bb492ec913\" not found"
May 27 18:15:01.933450 kubelet[2296]: I0527 18:15:01.932985 2296 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 27 18:15:01.933450 kubelet[2296]: I0527 18:15:01.933083 2296 reconciler.go:26] "Reconciler: start to sync state"
May 27 18:15:01.936459 kubelet[2296]: E0527 18:15:01.936240 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://146.190.127.126:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
May 27 18:15:01.937622 kubelet[2296]: E0527 18:15:01.936410 2296 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://146.190.127.126:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-6-bb492ec913?timeout=10s\": dial tcp 146.190.127.126:6443: connect: connection refused" interval="200ms"
May 27 18:15:01.938628 kubelet[2296]: I0527 18:15:01.937702 2296 server.go:317] "Adding debug handlers to kubelet server"
May 27 18:15:01.963545 kubelet[2296]: I0527 18:15:01.963135 2296 factory.go:223] Registration of the systemd container factory successfully
May 27 18:15:01.963545 kubelet[2296]: I0527 18:15:01.963342 2296 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 27 18:15:01.967271 kubelet[2296]: E0527 18:15:01.966227 2296 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 27 18:15:01.969557 kubelet[2296]: I0527 18:15:01.969365 2296 factory.go:223] Registration of the containerd container factory successfully
May 27 18:15:02.000285 kubelet[2296]: I0527 18:15:01.999115 2296 cpu_manager.go:221] "Starting CPU manager" policy="none"
May 27 18:15:02.000285 kubelet[2296]: I0527 18:15:01.999149 2296 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
May 27 18:15:02.000285 kubelet[2296]: I0527 18:15:01.999172 2296 state_mem.go:36] "Initialized new in-memory state store"
May 27 18:15:02.003343 kubelet[2296]: I0527 18:15:02.002835 2296 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
May 27 18:15:02.003343 kubelet[2296]: I0527 18:15:02.003074 2296 policy_none.go:49] "None policy: Start"
May 27 18:15:02.003343 kubelet[2296]: I0527 18:15:02.003346 2296 memory_manager.go:186] "Starting memorymanager" policy="None"
May 27 18:15:02.003561 kubelet[2296]: I0527 18:15:02.003363 2296 state_mem.go:35] "Initializing new in-memory state store"
May 27 18:15:02.007672 kubelet[2296]: I0527 18:15:02.007611 2296 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
May 27 18:15:02.007864 kubelet[2296]: I0527 18:15:02.007770 2296 status_manager.go:230] "Starting to sync pod status with apiserver"
May 27 18:15:02.007864 kubelet[2296]: I0527 18:15:02.007827 2296 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 27 18:15:02.007864 kubelet[2296]: I0527 18:15:02.007836 2296 kubelet.go:2436] "Starting kubelet main sync loop"
May 27 18:15:02.007991 kubelet[2296]: E0527 18:15:02.007922 2296 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 27 18:15:02.022352 kubelet[2296]: E0527 18:15:02.022283 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://146.190.127.126:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
May 27 18:15:02.029699 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 27 18:15:02.042620 kubelet[2296]: E0527 18:15:02.042514 2296 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-6-bb492ec913\" not found"
May 27 18:15:02.060190 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 27 18:15:02.069529 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 27 18:15:02.087882 kubelet[2296]: E0527 18:15:02.087835 2296 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 27 18:15:02.088198 kubelet[2296]: I0527 18:15:02.088165 2296 eviction_manager.go:189] "Eviction manager: starting control loop"
May 27 18:15:02.088313 kubelet[2296]: I0527 18:15:02.088193 2296 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 27 18:15:02.090124 kubelet[2296]: I0527 18:15:02.089534 2296 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 27 18:15:02.094754 kubelet[2296]: E0527 18:15:02.094706 2296 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
May 27 18:15:02.094921 kubelet[2296]: E0527 18:15:02.094779 2296 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344.0.0-6-bb492ec913\" not found"
May 27 18:15:02.132611 systemd[1]: Created slice kubepods-burstable-pod02ebf679fa3beccd0a711ce449103f3a.slice - libcontainer container kubepods-burstable-pod02ebf679fa3beccd0a711ce449103f3a.slice.
May 27 18:15:02.140277 kubelet[2296]: E0527 18:15:02.139544 2296 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://146.190.127.126:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-6-bb492ec913?timeout=10s\": dial tcp 146.190.127.126:6443: connect: connection refused" interval="400ms"
May 27 18:15:02.148657 systemd[1]: Created slice kubepods-burstable-pod620dda864dab66edd85c02b6385df7bb.slice - libcontainer container kubepods-burstable-pod620dda864dab66edd85c02b6385df7bb.slice.
May 27 18:15:02.152281 kubelet[2296]: E0527 18:15:02.151896 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157154 kubelet[2296]: I0527 18:15:02.155784 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/02ebf679fa3beccd0a711ce449103f3a-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" (UID: \"02ebf679fa3beccd0a711ce449103f3a\") " pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157532 kubelet[2296]: I0527 18:15:02.157484 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/02ebf679fa3beccd0a711ce449103f3a-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" (UID: \"02ebf679fa3beccd0a711ce449103f3a\") " pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157616 kubelet[2296]: I0527 18:15:02.157554 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157616 kubelet[2296]: I0527 18:15:02.157586 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157766 kubelet[2296]: I0527 18:15:02.157628 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/02ebf679fa3beccd0a711ce449103f3a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" (UID: \"02ebf679fa3beccd0a711ce449103f3a\") " pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157766 kubelet[2296]: I0527 18:15:02.157665 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157766 kubelet[2296]: I0527 18:15:02.157698 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157766 kubelet[2296]: I0527 18:15:02.157735 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.157936 kubelet[2296]: I0527 18:15:02.157768 2296 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/620dda864dab66edd85c02b6385df7bb-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-6-bb492ec913\" (UID: \"620dda864dab66edd85c02b6385df7bb\") " pod="kube-system/kube-scheduler-ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.158733 kubelet[2296]: E0527 18:15:02.158674 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.164659 systemd[1]: Created slice kubepods-burstable-pod1a9fcab34006630df3d68b33eb586413.slice - libcontainer container kubepods-burstable-pod1a9fcab34006630df3d68b33eb586413.slice.
May 27 18:15:02.170490 kubelet[2296]: E0527 18:15:02.170425 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.190710 kubelet[2296]: I0527 18:15:02.190204 2296 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.190710 kubelet[2296]: E0527 18:15:02.190666 2296 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://146.190.127.126:6443/api/v1/nodes\": dial tcp 146.190.127.126:6443: connect: connection refused" node="ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.298141 kubelet[2296]: E0527 18:15:02.297363 2296 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://146.190.127.126:6443/api/v1/namespaces/default/events\": dial tcp 146.190.127.126:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344.0.0-6-bb492ec913.184374fb08eb4a61 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344.0.0-6-bb492ec913,UID:ci-4344.0.0-6-bb492ec913,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344.0.0-6-bb492ec913,},FirstTimestamp:2025-05-27 18:15:01.885135457 +0000 UTC m=+0.828174202,LastTimestamp:2025-05-27 18:15:01.885135457 +0000 UTC m=+0.828174202,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344.0.0-6-bb492ec913,}"
May 27 18:15:02.392222 kubelet[2296]: I0527 18:15:02.392172 2296 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.393180 kubelet[2296]: E0527 18:15:02.393114 2296 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://146.190.127.126:6443/api/v1/nodes\": dial tcp 146.190.127.126:6443: connect: connection refused" node="ci-4344.0.0-6-bb492ec913"
May 27 18:15:02.453364 kubelet[2296]: E0527 18:15:02.453292 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:02.454463 containerd[1543]: time="2025-05-27T18:15:02.454406029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-6-bb492ec913,Uid:02ebf679fa3beccd0a711ce449103f3a,Namespace:kube-system,Attempt:0,}"
May 27 18:15:02.461299 kubelet[2296]: E0527 18:15:02.461071 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:02.462409 containerd[1543]: time="2025-05-27T18:15:02.462086196Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-6-bb492ec913,Uid:620dda864dab66edd85c02b6385df7bb,Namespace:kube-system,Attempt:0,}"
May 27 18:15:02.472072 kubelet[2296]: E0527 18:15:02.472018 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:02.473345 containerd[1543]:
time="2025-05-27T18:15:02.473161149Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-6-bb492ec913,Uid:1a9fcab34006630df3d68b33eb586413,Namespace:kube-system,Attempt:0,}" May 27 18:15:02.542543 kubelet[2296]: E0527 18:15:02.542493 2296 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://146.190.127.126:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-6-bb492ec913?timeout=10s\": dial tcp 146.190.127.126:6443: connect: connection refused" interval="800ms" May 27 18:15:02.669037 containerd[1543]: time="2025-05-27T18:15:02.668802297Z" level=info msg="connecting to shim 20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f" address="unix:///run/containerd/s/38383a091b60d97760ebede861ee8fbdf8e189a7a168e0628753d72b81962fcc" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:02.675218 containerd[1543]: time="2025-05-27T18:15:02.675051634Z" level=info msg="connecting to shim cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a" address="unix:///run/containerd/s/b7ecb6dd1b4c67984019dd71b690bb32c0e0e008945e40237ee437c70afc58c7" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:02.676266 containerd[1543]: time="2025-05-27T18:15:02.676106481Z" level=info msg="connecting to shim b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a" address="unix:///run/containerd/s/55d41b2693a1d08127a6cdcc975fe879e3ffc4f427e085de255ce27dbaabea9f" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:02.715894 kubelet[2296]: E0527 18:15:02.715626 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://146.190.127.126:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-6-bb492ec913&limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" May 27 18:15:02.797202 kubelet[2296]: I0527 
18:15:02.797104 2296 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:02.799750 kubelet[2296]: E0527 18:15:02.798188 2296 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://146.190.127.126:6443/api/v1/nodes\": dial tcp 146.190.127.126:6443: connect: connection refused" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:02.850894 systemd[1]: Started cri-containerd-20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f.scope - libcontainer container 20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f. May 27 18:15:02.853862 systemd[1]: Started cri-containerd-b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a.scope - libcontainer container b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a. May 27 18:15:02.857661 systemd[1]: Started cri-containerd-cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a.scope - libcontainer container cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a. 
May 27 18:15:03.009417 containerd[1543]: time="2025-05-27T18:15:03.009362440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-6-bb492ec913,Uid:02ebf679fa3beccd0a711ce449103f3a,Namespace:kube-system,Attempt:0,} returns sandbox id \"b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a\"" May 27 18:15:03.011167 kubelet[2296]: E0527 18:15:03.011131 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:03.027849 containerd[1543]: time="2025-05-27T18:15:03.027780023Z" level=info msg="CreateContainer within sandbox \"b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 18:15:03.035655 containerd[1543]: time="2025-05-27T18:15:03.035423916Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-6-bb492ec913,Uid:1a9fcab34006630df3d68b33eb586413,Namespace:kube-system,Attempt:0,} returns sandbox id \"cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a\"" May 27 18:15:03.037158 kubelet[2296]: E0527 18:15:03.037104 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:03.050658 containerd[1543]: time="2025-05-27T18:15:03.050610945Z" level=info msg="CreateContainer within sandbox \"cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 18:15:03.055735 containerd[1543]: time="2025-05-27T18:15:03.055643036Z" level=info msg="Container 24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:03.070338 containerd[1543]: 
time="2025-05-27T18:15:03.070232485Z" level=info msg="CreateContainer within sandbox \"b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f\"" May 27 18:15:03.073417 containerd[1543]: time="2025-05-27T18:15:03.073360528Z" level=info msg="StartContainer for \"24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f\"" May 27 18:15:03.075010 containerd[1543]: time="2025-05-27T18:15:03.074878924Z" level=info msg="Container b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:03.082574 containerd[1543]: time="2025-05-27T18:15:03.082414117Z" level=info msg="connecting to shim 24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f" address="unix:///run/containerd/s/55d41b2693a1d08127a6cdcc975fe879e3ffc4f427e085de255ce27dbaabea9f" protocol=ttrpc version=3 May 27 18:15:03.082765 containerd[1543]: time="2025-05-27T18:15:03.082499845Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-6-bb492ec913,Uid:620dda864dab66edd85c02b6385df7bb,Namespace:kube-system,Attempt:0,} returns sandbox id \"20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f\"" May 27 18:15:03.086288 kubelet[2296]: E0527 18:15:03.086030 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:03.089811 containerd[1543]: time="2025-05-27T18:15:03.089741162Z" level=info msg="CreateContainer within sandbox \"cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710\"" May 27 18:15:03.091526 containerd[1543]: 
time="2025-05-27T18:15:03.091485363Z" level=info msg="StartContainer for \"b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710\"" May 27 18:15:03.092064 containerd[1543]: time="2025-05-27T18:15:03.092011693Z" level=info msg="CreateContainer within sandbox \"20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 18:15:03.096758 containerd[1543]: time="2025-05-27T18:15:03.096670755Z" level=info msg="connecting to shim b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710" address="unix:///run/containerd/s/b7ecb6dd1b4c67984019dd71b690bb32c0e0e008945e40237ee437c70afc58c7" protocol=ttrpc version=3 May 27 18:15:03.106460 containerd[1543]: time="2025-05-27T18:15:03.105856120Z" level=info msg="Container ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:03.119612 systemd[1]: Started cri-containerd-24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f.scope - libcontainer container 24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f. May 27 18:15:03.132689 systemd[1]: Started cri-containerd-b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710.scope - libcontainer container b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710. 
May 27 18:15:03.139612 containerd[1543]: time="2025-05-27T18:15:03.139564565Z" level=info msg="CreateContainer within sandbox \"20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80\"" May 27 18:15:03.140605 containerd[1543]: time="2025-05-27T18:15:03.140556814Z" level=info msg="StartContainer for \"ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80\"" May 27 18:15:03.145151 containerd[1543]: time="2025-05-27T18:15:03.144839131Z" level=info msg="connecting to shim ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80" address="unix:///run/containerd/s/38383a091b60d97760ebede861ee8fbdf8e189a7a168e0628753d72b81962fcc" protocol=ttrpc version=3 May 27 18:15:03.185934 systemd[1]: Started cri-containerd-ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80.scope - libcontainer container ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80. 
May 27 18:15:03.252445 containerd[1543]: time="2025-05-27T18:15:03.252366212Z" level=info msg="StartContainer for \"24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f\" returns successfully" May 27 18:15:03.268792 kubelet[2296]: E0527 18:15:03.268493 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://146.190.127.126:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" May 27 18:15:03.273514 containerd[1543]: time="2025-05-27T18:15:03.273438166Z" level=info msg="StartContainer for \"b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710\" returns successfully" May 27 18:15:03.344889 kubelet[2296]: E0527 18:15:03.344828 2296 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://146.190.127.126:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-6-bb492ec913?timeout=10s\": dial tcp 146.190.127.126:6443: connect: connection refused" interval="1.6s" May 27 18:15:03.349577 containerd[1543]: time="2025-05-27T18:15:03.349519270Z" level=info msg="StartContainer for \"ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80\" returns successfully" May 27 18:15:03.420295 kubelet[2296]: E0527 18:15:03.420163 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://146.190.127.126:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" May 27 18:15:03.583073 kubelet[2296]: E0527 18:15:03.582894 2296 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get 
\"https://146.190.127.126:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 146.190.127.126:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" May 27 18:15:03.602541 kubelet[2296]: I0527 18:15:03.602492 2296 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:04.071553 kubelet[2296]: E0527 18:15:04.071501 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:04.071761 kubelet[2296]: E0527 18:15:04.071706 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:04.076062 kubelet[2296]: E0527 18:15:04.076003 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:04.076270 kubelet[2296]: E0527 18:15:04.076215 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:04.085692 kubelet[2296]: E0527 18:15:04.085642 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:04.085887 kubelet[2296]: E0527 18:15:04.085851 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:05.089060 kubelet[2296]: E0527 18:15:05.089010 2296 kubelet.go:3305] "No need to create a 
mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:05.089711 kubelet[2296]: E0527 18:15:05.089194 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:05.090952 kubelet[2296]: E0527 18:15:05.090911 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:05.091100 kubelet[2296]: E0527 18:15:05.091061 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:05.091886 kubelet[2296]: E0527 18:15:05.091861 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:05.092002 kubelet[2296]: E0527 18:15:05.091984 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:06.091980 kubelet[2296]: E0527 18:15:06.091924 2296 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:06.092535 kubelet[2296]: E0527 18:15:06.092121 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:06.094155 kubelet[2296]: E0527 18:15:06.094089 2296 kubelet.go:3305] "No need to create a mirror pod, since 
failed to get node info from the cluster" err="node \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:06.094397 kubelet[2296]: E0527 18:15:06.094377 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:06.278882 kubelet[2296]: E0527 18:15:06.278824 2296 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344.0.0-6-bb492ec913\" not found" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:06.363813 kubelet[2296]: I0527 18:15:06.363657 2296 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:06.432998 kubelet[2296]: I0527 18:15:06.432932 2296 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:06.450264 kubelet[2296]: E0527 18:15:06.449966 2296 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:06.450264 kubelet[2296]: I0527 18:15:06.450005 2296 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:06.452775 kubelet[2296]: E0527 18:15:06.452715 2296 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:06.452775 kubelet[2296]: I0527 18:15:06.452760 2296 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-6-bb492ec913" May 27 18:15:06.455025 kubelet[2296]: E0527 18:15:06.454972 2296 
kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-6-bb492ec913\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344.0.0-6-bb492ec913" May 27 18:15:06.878718 kubelet[2296]: I0527 18:15:06.878604 2296 apiserver.go:52] "Watching apiserver" May 27 18:15:06.933275 kubelet[2296]: I0527 18:15:06.933182 2296 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 18:15:07.098210 kubelet[2296]: I0527 18:15:07.098166 2296 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:07.109829 kubelet[2296]: I0527 18:15:07.109643 2296 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 27 18:15:07.111296 kubelet[2296]: E0527 18:15:07.111186 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:08.104054 kubelet[2296]: E0527 18:15:08.104011 2296 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:08.527727 systemd[1]: Reload requested from client PID 2574 ('systemctl') (unit session-7.scope)... May 27 18:15:08.527747 systemd[1]: Reloading... May 27 18:15:08.711873 zram_generator::config[2620]: No configuration found. May 27 18:15:08.874994 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 18:15:09.040119 systemd[1]: Reloading finished in 511 ms. 
May 27 18:15:09.083719 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 18:15:09.098610 systemd[1]: kubelet.service: Deactivated successfully. May 27 18:15:09.098908 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 18:15:09.099015 systemd[1]: kubelet.service: Consumed 1.346s CPU time, 126M memory peak. May 27 18:15:09.101971 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 18:15:09.302511 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 18:15:09.323985 (kubelet)[2668]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 18:15:09.409087 kubelet[2668]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 18:15:09.409087 kubelet[2668]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 18:15:09.409087 kubelet[2668]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 27 18:15:09.410531 kubelet[2668]: I0527 18:15:09.409353 2668 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 18:15:09.426032 kubelet[2668]: I0527 18:15:09.425970 2668 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" May 27 18:15:09.426032 kubelet[2668]: I0527 18:15:09.426005 2668 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 18:15:09.426809 kubelet[2668]: I0527 18:15:09.426604 2668 server.go:956] "Client rotation is on, will bootstrap in background" May 27 18:15:09.428879 kubelet[2668]: I0527 18:15:09.428848 2668 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" May 27 18:15:09.436195 kubelet[2668]: I0527 18:15:09.435765 2668 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 18:15:09.443760 kubelet[2668]: I0527 18:15:09.443726 2668 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 18:15:09.447433 kubelet[2668]: I0527 18:15:09.447401 2668 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 18:15:09.447877 kubelet[2668]: I0527 18:15:09.447814 2668 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 18:15:09.448158 kubelet[2668]: I0527 18:15:09.447872 2668 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-6-bb492ec913","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 18:15:09.448158 kubelet[2668]: I0527 18:15:09.448161 2668 topology_manager.go:138] "Creating topology manager with none policy" May 27 
18:15:09.448530 kubelet[2668]: I0527 18:15:09.448177 2668 container_manager_linux.go:303] "Creating device plugin manager" May 27 18:15:09.448530 kubelet[2668]: I0527 18:15:09.448250 2668 state_mem.go:36] "Initialized new in-memory state store" May 27 18:15:09.448530 kubelet[2668]: I0527 18:15:09.448462 2668 kubelet.go:480] "Attempting to sync node with API server" May 27 18:15:09.448530 kubelet[2668]: I0527 18:15:09.448481 2668 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 18:15:09.448530 kubelet[2668]: I0527 18:15:09.448510 2668 kubelet.go:386] "Adding apiserver pod source" May 27 18:15:09.448530 kubelet[2668]: I0527 18:15:09.448527 2668 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 18:15:09.464255 kubelet[2668]: I0527 18:15:09.462500 2668 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 18:15:09.465256 kubelet[2668]: I0527 18:15:09.464899 2668 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" May 27 18:15:09.476335 kubelet[2668]: I0527 18:15:09.475928 2668 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 18:15:09.478953 kubelet[2668]: I0527 18:15:09.476831 2668 server.go:1289] "Started kubelet" May 27 18:15:09.479939 kubelet[2668]: I0527 18:15:09.479533 2668 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 May 27 18:15:09.484890 kubelet[2668]: I0527 18:15:09.484675 2668 server.go:317] "Adding debug handlers to kubelet server" May 27 18:15:09.485664 kubelet[2668]: I0527 18:15:09.485446 2668 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 18:15:09.497373 kubelet[2668]: I0527 18:15:09.497231 2668 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 18:15:09.501810 kubelet[2668]: I0527 18:15:09.499694 2668 
dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 18:15:09.502131 kubelet[2668]: I0527 18:15:09.502103 2668 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 18:15:09.505498 kubelet[2668]: I0527 18:15:09.504822 2668 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 18:15:09.505498 kubelet[2668]: I0527 18:15:09.505499 2668 reconciler.go:26] "Reconciler: start to sync state" May 27 18:15:09.506985 kubelet[2668]: I0527 18:15:09.506544 2668 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 18:15:09.515699 kubelet[2668]: I0527 18:15:09.515480 2668 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 18:15:09.519930 kubelet[2668]: I0527 18:15:09.519868 2668 factory.go:223] Registration of the containerd container factory successfully May 27 18:15:09.519930 kubelet[2668]: I0527 18:15:09.519891 2668 factory.go:223] Registration of the systemd container factory successfully May 27 18:15:09.557839 kubelet[2668]: I0527 18:15:09.557386 2668 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 27 18:15:09.571685 kubelet[2668]: I0527 18:15:09.571639 2668 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" May 27 18:15:09.571884 kubelet[2668]: I0527 18:15:09.571866 2668 status_manager.go:230] "Starting to sync pod status with apiserver" May 27 18:15:09.571942 kubelet[2668]: I0527 18:15:09.571904 2668 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 27 18:15:09.572123 kubelet[2668]: I0527 18:15:09.572106 2668 kubelet.go:2436] "Starting kubelet main sync loop" May 27 18:15:09.572658 kubelet[2668]: E0527 18:15:09.572280 2668 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.636824 2668 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.636847 2668 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.636878 2668 state_mem.go:36] "Initialized new in-memory state store" May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.637650 2668 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.637673 2668 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.637820 2668 policy_none.go:49] "None policy: Start" May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.637836 2668 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 18:15:09.638014 kubelet[2668]: I0527 18:15:09.637964 2668 state_mem.go:35] "Initializing new in-memory state store" May 27 18:15:09.638600 kubelet[2668]: I0527 18:15:09.638403 2668 state_mem.go:75] "Updated machine memory state" May 27 18:15:09.649112 kubelet[2668]: E0527 18:15:09.649048 2668 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" May 27 18:15:09.649294 kubelet[2668]: I0527 18:15:09.649269 2668 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 18:15:09.649344 kubelet[2668]: I0527 18:15:09.649289 2668 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 18:15:09.650333 kubelet[2668]: I0527 18:15:09.649998 2668 
plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 18:15:09.657415 kubelet[2668]: E0527 18:15:09.656737 2668 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 18:15:09.676275 kubelet[2668]: I0527 18:15:09.674863 2668 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.676275 kubelet[2668]: I0527 18:15:09.675915 2668 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.676275 kubelet[2668]: I0527 18:15:09.676167 2668 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.688214 kubelet[2668]: I0527 18:15:09.688184 2668 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 27 18:15:09.688406 kubelet[2668]: I0527 18:15:09.688351 2668 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 27 18:15:09.688759 kubelet[2668]: I0527 18:15:09.688740 2668 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" May 27 18:15:09.688824 kubelet[2668]: E0527 18:15:09.688797 2668 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" already exists" pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.707261 kubelet[2668]: I0527 18:15:09.707176 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/02ebf679fa3beccd0a711ce449103f3a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" (UID: \"02ebf679fa3beccd0a711ce449103f3a\") " pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.707493 kubelet[2668]: I0527 18:15:09.707319 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.707493 kubelet[2668]: I0527 18:15:09.707378 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.708612 kubelet[2668]: I0527 18:15:09.707408 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.708612 kubelet[2668]: I0527 18:15:09.708028 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/02ebf679fa3beccd0a711ce449103f3a-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" (UID: \"02ebf679fa3beccd0a711ce449103f3a\") " pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.708612 kubelet[2668]: I0527 
18:15:09.708060 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/02ebf679fa3beccd0a711ce449103f3a-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-6-bb492ec913\" (UID: \"02ebf679fa3beccd0a711ce449103f3a\") " pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.757528 kubelet[2668]: I0527 18:15:09.756857 2668 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:09.771944 kubelet[2668]: I0527 18:15:09.771901 2668 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:09.772159 kubelet[2668]: I0527 18:15:09.772149 2668 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-6-bb492ec913" May 27 18:15:09.809080 kubelet[2668]: I0527 18:15:09.808933 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.809080 kubelet[2668]: I0527 18:15:09.808991 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1a9fcab34006630df3d68b33eb586413-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-6-bb492ec913\" (UID: \"1a9fcab34006630df3d68b33eb586413\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.809080 kubelet[2668]: I0527 18:15:09.809056 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/620dda864dab66edd85c02b6385df7bb-kubeconfig\") pod 
\"kube-scheduler-ci-4344.0.0-6-bb492ec913\" (UID: \"620dda864dab66edd85c02b6385df7bb\") " pod="kube-system/kube-scheduler-ci-4344.0.0-6-bb492ec913" May 27 18:15:09.991955 kubelet[2668]: E0527 18:15:09.989509 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:09.991955 kubelet[2668]: E0527 18:15:09.990098 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:09.993291 kubelet[2668]: E0527 18:15:09.993161 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:10.470202 kubelet[2668]: I0527 18:15:10.468220 2668 apiserver.go:52] "Watching apiserver" May 27 18:15:10.505961 kubelet[2668]: I0527 18:15:10.505901 2668 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 18:15:10.629326 kubelet[2668]: I0527 18:15:10.628520 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344.0.0-6-bb492ec913" podStartSLOduration=1.628501241 podStartE2EDuration="1.628501241s" podCreationTimestamp="2025-05-27 18:15:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:15:10.62754373 +0000 UTC m=+1.293569147" watchObservedRunningTime="2025-05-27 18:15:10.628501241 +0000 UTC m=+1.294526653" May 27 18:15:10.632559 kubelet[2668]: E0527 18:15:10.630178 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 
18:15:10.632559 kubelet[2668]: E0527 18:15:10.630509 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:10.634341 kubelet[2668]: E0527 18:15:10.634297 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:10.680500 kubelet[2668]: I0527 18:15:10.680427 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344.0.0-6-bb492ec913" podStartSLOduration=3.6803856169999998 podStartE2EDuration="3.680385617s" podCreationTimestamp="2025-05-27 18:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:15:10.661264057 +0000 UTC m=+1.327289464" watchObservedRunningTime="2025-05-27 18:15:10.680385617 +0000 UTC m=+1.346411005" May 27 18:15:10.695254 kubelet[2668]: I0527 18:15:10.694393 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344.0.0-6-bb492ec913" podStartSLOduration=1.694375465 podStartE2EDuration="1.694375465s" podCreationTimestamp="2025-05-27 18:15:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:15:10.681217401 +0000 UTC m=+1.347242810" watchObservedRunningTime="2025-05-27 18:15:10.694375465 +0000 UTC m=+1.360400874" May 27 18:15:11.631875 kubelet[2668]: E0527 18:15:11.631799 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:11.632626 kubelet[2668]: E0527 18:15:11.631811 2668 dns.go:153] "Nameserver limits 
exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:12.633950 kubelet[2668]: E0527 18:15:12.633902 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:12.923398 systemd-resolved[1403]: Using degraded feature set TCP instead of UDP for DNS server 67.207.67.2. May 27 18:15:13.581615 systemd-resolved[1403]: Clock change detected. Flushing caches. May 27 18:15:13.582143 systemd-timesyncd[1417]: Contacted time server 204.2.134.172:123 (2.flatcar.pool.ntp.org). May 27 18:15:13.582229 systemd-timesyncd[1417]: Initial clock synchronization to Tue 2025-05-27 18:15:13.581397 UTC. May 27 18:15:13.942403 kubelet[2668]: I0527 18:15:13.941712 2668 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 18:15:13.943462 containerd[1543]: time="2025-05-27T18:15:13.943300016Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 18:15:13.944505 kubelet[2668]: I0527 18:15:13.944468 2668 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 18:15:14.991611 systemd[1]: Created slice kubepods-besteffort-podc0dcb2dd_8d05_4d12_b291_b39c0d599a42.slice - libcontainer container kubepods-besteffort-podc0dcb2dd_8d05_4d12_b291_b39c0d599a42.slice. 
May 27 18:15:15.077522 kubelet[2668]: I0527 18:15:15.077460 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c0dcb2dd-8d05-4d12-b291-b39c0d599a42-kube-proxy\") pod \"kube-proxy-7fcgn\" (UID: \"c0dcb2dd-8d05-4d12-b291-b39c0d599a42\") " pod="kube-system/kube-proxy-7fcgn" May 27 18:15:15.077978 kubelet[2668]: I0527 18:15:15.077557 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0dcb2dd-8d05-4d12-b291-b39c0d599a42-lib-modules\") pod \"kube-proxy-7fcgn\" (UID: \"c0dcb2dd-8d05-4d12-b291-b39c0d599a42\") " pod="kube-system/kube-proxy-7fcgn" May 27 18:15:15.077978 kubelet[2668]: I0527 18:15:15.077611 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c0dcb2dd-8d05-4d12-b291-b39c0d599a42-xtables-lock\") pod \"kube-proxy-7fcgn\" (UID: \"c0dcb2dd-8d05-4d12-b291-b39c0d599a42\") " pod="kube-system/kube-proxy-7fcgn" May 27 18:15:15.077978 kubelet[2668]: I0527 18:15:15.077636 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62q6r\" (UniqueName: \"kubernetes.io/projected/c0dcb2dd-8d05-4d12-b291-b39c0d599a42-kube-api-access-62q6r\") pod \"kube-proxy-7fcgn\" (UID: \"c0dcb2dd-8d05-4d12-b291-b39c0d599a42\") " pod="kube-system/kube-proxy-7fcgn" May 27 18:15:15.161083 systemd[1]: Created slice kubepods-besteffort-pod7e74e826_74ab_4aa1_947d_469ee3625c07.slice - libcontainer container kubepods-besteffort-pod7e74e826_74ab_4aa1_947d_469ee3625c07.slice. 
May 27 18:15:15.178555 kubelet[2668]: I0527 18:15:15.178499 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cj44\" (UniqueName: \"kubernetes.io/projected/7e74e826-74ab-4aa1-947d-469ee3625c07-kube-api-access-5cj44\") pod \"tigera-operator-844669ff44-c559p\" (UID: \"7e74e826-74ab-4aa1-947d-469ee3625c07\") " pod="tigera-operator/tigera-operator-844669ff44-c559p" May 27 18:15:15.178743 kubelet[2668]: I0527 18:15:15.178657 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7e74e826-74ab-4aa1-947d-469ee3625c07-var-lib-calico\") pod \"tigera-operator-844669ff44-c559p\" (UID: \"7e74e826-74ab-4aa1-947d-469ee3625c07\") " pod="tigera-operator/tigera-operator-844669ff44-c559p" May 27 18:15:15.189037 kubelet[2668]: E0527 18:15:15.188715 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:15.303128 kubelet[2668]: E0527 18:15:15.302799 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:15.306492 containerd[1543]: time="2025-05-27T18:15:15.306208063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7fcgn,Uid:c0dcb2dd-8d05-4d12-b291-b39c0d599a42,Namespace:kube-system,Attempt:0,}" May 27 18:15:15.332561 containerd[1543]: time="2025-05-27T18:15:15.332504884Z" level=info msg="connecting to shim 5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0" address="unix:///run/containerd/s/b484c9f8766af7732135c690cfc67910193a8f4305fffe69b3c53da8655083ce" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:15.366825 systemd[1]: Started 
cri-containerd-5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0.scope - libcontainer container 5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0. May 27 18:15:15.407883 containerd[1543]: time="2025-05-27T18:15:15.407847473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7fcgn,Uid:c0dcb2dd-8d05-4d12-b291-b39c0d599a42,Namespace:kube-system,Attempt:0,} returns sandbox id \"5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0\"" May 27 18:15:15.410377 kubelet[2668]: E0527 18:15:15.409477 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:15.416124 containerd[1543]: time="2025-05-27T18:15:15.416083942Z" level=info msg="CreateContainer within sandbox \"5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 18:15:15.427565 containerd[1543]: time="2025-05-27T18:15:15.427507000Z" level=info msg="Container 1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:15.437603 containerd[1543]: time="2025-05-27T18:15:15.437548284Z" level=info msg="CreateContainer within sandbox \"5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8\"" May 27 18:15:15.439707 containerd[1543]: time="2025-05-27T18:15:15.439654331Z" level=info msg="StartContainer for \"1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8\"" May 27 18:15:15.442922 containerd[1543]: time="2025-05-27T18:15:15.442770031Z" level=info msg="connecting to shim 1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8" 
address="unix:///run/containerd/s/b484c9f8766af7732135c690cfc67910193a8f4305fffe69b3c53da8655083ce" protocol=ttrpc version=3 May 27 18:15:15.467719 containerd[1543]: time="2025-05-27T18:15:15.467614200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-c559p,Uid:7e74e826-74ab-4aa1-947d-469ee3625c07,Namespace:tigera-operator,Attempt:0,}" May 27 18:15:15.473955 systemd[1]: Started cri-containerd-1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8.scope - libcontainer container 1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8. May 27 18:15:15.492145 containerd[1543]: time="2025-05-27T18:15:15.492082729Z" level=info msg="connecting to shim 46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673" address="unix:///run/containerd/s/bc5f8bde32c4e372d838a8542f69848564fcb1a433da04bbe3622e24236b42a6" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:15.537827 systemd[1]: Started cri-containerd-46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673.scope - libcontainer container 46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673. 
May 27 18:15:15.551147 containerd[1543]: time="2025-05-27T18:15:15.549282239Z" level=info msg="StartContainer for \"1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8\" returns successfully" May 27 18:15:15.617051 containerd[1543]: time="2025-05-27T18:15:15.616918906Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-c559p,Uid:7e74e826-74ab-4aa1-947d-469ee3625c07,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673\"" May 27 18:15:15.621707 containerd[1543]: time="2025-05-27T18:15:15.621669264Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 18:15:16.185407 kubelet[2668]: E0527 18:15:16.183706 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:16.185853 kubelet[2668]: E0527 18:15:16.185753 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:16.209066 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount719969917.mount: Deactivated successfully. 
May 27 18:15:16.431156 kubelet[2668]: E0527 18:15:16.431096 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:16.451586 kubelet[2668]: I0527 18:15:16.451010 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7fcgn" podStartSLOduration=2.450982276 podStartE2EDuration="2.450982276s" podCreationTimestamp="2025-05-27 18:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:15:16.224746404 +0000 UTC m=+6.352575728" watchObservedRunningTime="2025-05-27 18:15:16.450982276 +0000 UTC m=+6.578811605" May 27 18:15:17.195468 kubelet[2668]: E0527 18:15:17.194085 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:17.248789 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2443154176.mount: Deactivated successfully. 
May 27 18:15:18.166631 containerd[1543]: time="2025-05-27T18:15:18.166560971Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:18.167791 containerd[1543]: time="2025-05-27T18:15:18.167739750Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 27 18:15:18.168246 containerd[1543]: time="2025-05-27T18:15:18.168206662Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:18.171072 containerd[1543]: time="2025-05-27T18:15:18.171011365Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:18.171510 containerd[1543]: time="2025-05-27T18:15:18.171364031Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 2.549657898s" May 27 18:15:18.171510 containerd[1543]: time="2025-05-27T18:15:18.171392968Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 27 18:15:18.177053 containerd[1543]: time="2025-05-27T18:15:18.176997453Z" level=info msg="CreateContainer within sandbox \"46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 18:15:18.192342 containerd[1543]: time="2025-05-27T18:15:18.190770244Z" level=info msg="Container 
9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:18.195583 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3663265413.mount: Deactivated successfully. May 27 18:15:18.207323 containerd[1543]: time="2025-05-27T18:15:18.207275099Z" level=info msg="CreateContainer within sandbox \"46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba\"" May 27 18:15:18.208598 containerd[1543]: time="2025-05-27T18:15:18.208555358Z" level=info msg="StartContainer for \"9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba\"" May 27 18:15:18.209698 containerd[1543]: time="2025-05-27T18:15:18.209639992Z" level=info msg="connecting to shim 9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba" address="unix:///run/containerd/s/bc5f8bde32c4e372d838a8542f69848564fcb1a433da04bbe3622e24236b42a6" protocol=ttrpc version=3 May 27 18:15:18.245003 systemd[1]: Started cri-containerd-9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba.scope - libcontainer container 9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba. 
May 27 18:15:18.309466 containerd[1543]: time="2025-05-27T18:15:18.308831757Z" level=info msg="StartContainer for \"9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba\" returns successfully" May 27 18:15:19.220362 kubelet[2668]: I0527 18:15:19.220186 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-c559p" podStartSLOduration=1.667754717 podStartE2EDuration="4.22015094s" podCreationTimestamp="2025-05-27 18:15:15 +0000 UTC" firstStartedPulling="2025-05-27 18:15:15.620380503 +0000 UTC m=+5.748209822" lastFinishedPulling="2025-05-27 18:15:18.172776718 +0000 UTC m=+8.300606045" observedRunningTime="2025-05-27 18:15:19.219712926 +0000 UTC m=+9.347542249" watchObservedRunningTime="2025-05-27 18:15:19.22015094 +0000 UTC m=+9.347980265" May 27 18:15:21.783237 update_engine[1522]: I20250527 18:15:21.782506 1522 update_attempter.cc:509] Updating boot flags... May 27 18:15:22.375236 kubelet[2668]: E0527 18:15:22.375198 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:23.228047 kubelet[2668]: E0527 18:15:23.227989 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:25.728709 sudo[1775]: pam_unix(sudo:session): session closed for user root May 27 18:15:25.732460 sshd[1774]: Connection closed by 139.178.68.195 port 40112 May 27 18:15:25.734427 sshd-session[1772]: pam_unix(sshd:session): session closed for user core May 27 18:15:25.741277 systemd[1]: sshd@6-146.190.127.126:22-139.178.68.195:40112.service: Deactivated successfully. May 27 18:15:25.741954 systemd-logind[1520]: Session 7 logged out. Waiting for processes to exit. 
May 27 18:15:25.747206 systemd[1]: session-7.scope: Deactivated successfully. May 27 18:15:25.747902 systemd[1]: session-7.scope: Consumed 8.244s CPU time, 167M memory peak. May 27 18:15:25.755686 systemd-logind[1520]: Removed session 7. May 27 18:15:29.320510 systemd[1]: Created slice kubepods-besteffort-pod307bce77_5299_4f02_bd38_0cdd2c71bcad.slice - libcontainer container kubepods-besteffort-pod307bce77_5299_4f02_bd38_0cdd2c71bcad.slice. May 27 18:15:29.379909 kubelet[2668]: I0527 18:15:29.379849 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/307bce77-5299-4f02-bd38-0cdd2c71bcad-typha-certs\") pod \"calico-typha-9cfb99589-swjxg\" (UID: \"307bce77-5299-4f02-bd38-0cdd2c71bcad\") " pod="calico-system/calico-typha-9cfb99589-swjxg" May 27 18:15:29.379909 kubelet[2668]: I0527 18:15:29.379909 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrlhk\" (UniqueName: \"kubernetes.io/projected/307bce77-5299-4f02-bd38-0cdd2c71bcad-kube-api-access-vrlhk\") pod \"calico-typha-9cfb99589-swjxg\" (UID: \"307bce77-5299-4f02-bd38-0cdd2c71bcad\") " pod="calico-system/calico-typha-9cfb99589-swjxg" May 27 18:15:29.380551 kubelet[2668]: I0527 18:15:29.379939 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/307bce77-5299-4f02-bd38-0cdd2c71bcad-tigera-ca-bundle\") pod \"calico-typha-9cfb99589-swjxg\" (UID: \"307bce77-5299-4f02-bd38-0cdd2c71bcad\") " pod="calico-system/calico-typha-9cfb99589-swjxg" May 27 18:15:29.615559 systemd[1]: Created slice kubepods-besteffort-pod08ec9af9_b618_456a_89b8_e6f5946c7fdf.slice - libcontainer container kubepods-besteffort-pod08ec9af9_b618_456a_89b8_e6f5946c7fdf.slice. 
May 27 18:15:29.624529 kubelet[2668]: E0527 18:15:29.624481 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:29.625824 containerd[1543]: time="2025-05-27T18:15:29.625782587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9cfb99589-swjxg,Uid:307bce77-5299-4f02-bd38-0cdd2c71bcad,Namespace:calico-system,Attempt:0,}" May 27 18:15:29.668375 containerd[1543]: time="2025-05-27T18:15:29.668315779Z" level=info msg="connecting to shim 889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e" address="unix:///run/containerd/s/18bd7b16dc6defe7aa99bf0183b6e5a56d9a0e3bdedf1b21c3ac66140249aa01" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:29.682154 kubelet[2668]: I0527 18:15:29.682098 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-cni-log-dir\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682154 kubelet[2668]: I0527 18:15:29.682155 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/08ec9af9-b618-456a-89b8-e6f5946c7fdf-tigera-ca-bundle\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682343 kubelet[2668]: I0527 18:15:29.682202 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-xtables-lock\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682343 
kubelet[2668]: I0527 18:15:29.682232 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-cni-bin-dir\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682343 kubelet[2668]: I0527 18:15:29.682255 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-policysync\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682343 kubelet[2668]: I0527 18:15:29.682280 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l67wp\" (UniqueName: \"kubernetes.io/projected/08ec9af9-b618-456a-89b8-e6f5946c7fdf-kube-api-access-l67wp\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682343 kubelet[2668]: I0527 18:15:29.682308 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-cni-net-dir\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682548 kubelet[2668]: I0527 18:15:29.682332 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-flexvol-driver-host\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682548 kubelet[2668]: I0527 18:15:29.682357 2668 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-lib-modules\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682548 kubelet[2668]: I0527 18:15:29.682383 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/08ec9af9-b618-456a-89b8-e6f5946c7fdf-node-certs\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682548 kubelet[2668]: I0527 18:15:29.682406 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-var-lib-calico\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.682548 kubelet[2668]: I0527 18:15:29.682461 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/08ec9af9-b618-456a-89b8-e6f5946c7fdf-var-run-calico\") pod \"calico-node-hnlwc\" (UID: \"08ec9af9-b618-456a-89b8-e6f5946c7fdf\") " pod="calico-system/calico-node-hnlwc" May 27 18:15:29.722365 systemd[1]: Started cri-containerd-889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e.scope - libcontainer container 889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e. 
May 27 18:15:29.788856 kubelet[2668]: E0527 18:15:29.788806 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.788856 kubelet[2668]: W0527 18:15:29.788838 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.789389 kubelet[2668]: E0527 18:15:29.788871 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.799002 kubelet[2668]: E0527 18:15:29.798971 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.799477 kubelet[2668]: W0527 18:15:29.799176 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.799477 kubelet[2668]: E0527 18:15:29.799222 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.827686 kubelet[2668]: E0527 18:15:29.827626 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.827686 kubelet[2668]: W0527 18:15:29.827651 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.828033 kubelet[2668]: E0527 18:15:29.827887 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 27 18:15:29.891737 kubelet[2668]: E0527 18:15:29.891416 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bglbs" podUID="201ff555-e16a-488d-8fbf-728dbbf651e9"
May 27 18:15:29.924734 containerd[1543]: time="2025-05-27T18:15:29.924664106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hnlwc,Uid:08ec9af9-b618-456a-89b8-e6f5946c7fdf,Namespace:calico-system,Attempt:0,}"
May 27 18:15:29.954381 containerd[1543]: time="2025-05-27T18:15:29.954242336Z" level=info msg="connecting to shim e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102" address="unix:///run/containerd/s/578e2f108ff4de2555abab7f48ba40144e4f7d4aa92f5317fb1ffef9bfd47beb" namespace=k8s.io protocol=ttrpc version=3
May 27 18:15:29.962818 kubelet[2668]: E0527 18:15:29.962691 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:29.963231 kubelet[2668]: W0527 18:15:29.963088 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:29.963231 kubelet[2668]: E0527 18:15:29.963124 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.964444 kubelet[2668]: E0527 18:15:29.964365 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.965468 kubelet[2668]: W0527 18:15:29.964561 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.965468 kubelet[2668]: E0527 18:15:29.964736 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.966490 kubelet[2668]: E0527 18:15:29.966311 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.966490 kubelet[2668]: W0527 18:15:29.966379 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.966490 kubelet[2668]: E0527 18:15:29.966410 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.967380 kubelet[2668]: E0527 18:15:29.967215 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.967380 kubelet[2668]: W0527 18:15:29.967232 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.967740 kubelet[2668]: E0527 18:15:29.967252 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.970776 kubelet[2668]: E0527 18:15:29.969754 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.970776 kubelet[2668]: W0527 18:15:29.969822 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.970776 kubelet[2668]: E0527 18:15:29.969843 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.970776 kubelet[2668]: E0527 18:15:29.970722 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.971164 kubelet[2668]: W0527 18:15:29.970738 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.971164 kubelet[2668]: E0527 18:15:29.971055 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.971418 kubelet[2668]: E0527 18:15:29.971405 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.971507 kubelet[2668]: W0527 18:15:29.971496 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.971602 kubelet[2668]: E0527 18:15:29.971593 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.972108 kubelet[2668]: E0527 18:15:29.972091 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.972456 kubelet[2668]: W0527 18:15:29.972189 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.972456 kubelet[2668]: E0527 18:15:29.972342 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.973787 kubelet[2668]: E0527 18:15:29.973384 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.973787 kubelet[2668]: W0527 18:15:29.973397 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.973787 kubelet[2668]: E0527 18:15:29.973409 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.973787 kubelet[2668]: E0527 18:15:29.973596 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.973787 kubelet[2668]: W0527 18:15:29.973607 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.973787 kubelet[2668]: E0527 18:15:29.973617 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.974198 kubelet[2668]: E0527 18:15:29.974056 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.974198 kubelet[2668]: W0527 18:15:29.974084 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.974198 kubelet[2668]: E0527 18:15:29.974096 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.974742 kubelet[2668]: E0527 18:15:29.974656 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.974742 kubelet[2668]: W0527 18:15:29.974668 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.974742 kubelet[2668]: E0527 18:15:29.974679 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.975859 kubelet[2668]: E0527 18:15:29.975636 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.975859 kubelet[2668]: W0527 18:15:29.975650 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.975859 kubelet[2668]: E0527 18:15:29.975661 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.976193 kubelet[2668]: E0527 18:15:29.976072 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.976193 kubelet[2668]: W0527 18:15:29.976084 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.976193 kubelet[2668]: E0527 18:15:29.976098 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.976756 kubelet[2668]: E0527 18:15:29.976741 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.977790 kubelet[2668]: W0527 18:15:29.977652 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.977790 kubelet[2668]: E0527 18:15:29.977677 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.978416 kubelet[2668]: E0527 18:15:29.977991 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.978416 kubelet[2668]: W0527 18:15:29.978003 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.978416 kubelet[2668]: E0527 18:15:29.978014 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.979072 kubelet[2668]: E0527 18:15:29.978888 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.979072 kubelet[2668]: W0527 18:15:29.978902 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.979072 kubelet[2668]: E0527 18:15:29.978913 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.979699 kubelet[2668]: E0527 18:15:29.979497 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.979699 kubelet[2668]: W0527 18:15:29.979514 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.979699 kubelet[2668]: E0527 18:15:29.979525 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.981758 kubelet[2668]: E0527 18:15:29.980629 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.981758 kubelet[2668]: W0527 18:15:29.980644 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.981758 kubelet[2668]: E0527 18:15:29.980656 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.981758 kubelet[2668]: E0527 18:15:29.981643 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.981758 kubelet[2668]: W0527 18:15:29.981655 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.981758 kubelet[2668]: E0527 18:15:29.981667 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.985902 kubelet[2668]: E0527 18:15:29.985874 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.986089 kubelet[2668]: W0527 18:15:29.986063 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.986217 kubelet[2668]: E0527 18:15:29.986199 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.987570 kubelet[2668]: I0527 18:15:29.987532 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/201ff555-e16a-488d-8fbf-728dbbf651e9-kubelet-dir\") pod \"csi-node-driver-bglbs\" (UID: \"201ff555-e16a-488d-8fbf-728dbbf651e9\") " pod="calico-system/csi-node-driver-bglbs" May 27 18:15:29.989622 kubelet[2668]: E0527 18:15:29.988729 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.989622 kubelet[2668]: W0527 18:15:29.988754 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.989622 kubelet[2668]: E0527 18:15:29.988777 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.989622 kubelet[2668]: I0527 18:15:29.988811 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/201ff555-e16a-488d-8fbf-728dbbf651e9-socket-dir\") pod \"csi-node-driver-bglbs\" (UID: \"201ff555-e16a-488d-8fbf-728dbbf651e9\") " pod="calico-system/csi-node-driver-bglbs" May 27 18:15:29.991585 kubelet[2668]: E0527 18:15:29.991540 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.991585 kubelet[2668]: W0527 18:15:29.991579 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.991731 kubelet[2668]: E0527 18:15:29.991607 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.991731 kubelet[2668]: I0527 18:15:29.991644 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z68jh\" (UniqueName: \"kubernetes.io/projected/201ff555-e16a-488d-8fbf-728dbbf651e9-kube-api-access-z68jh\") pod \"csi-node-driver-bglbs\" (UID: \"201ff555-e16a-488d-8fbf-728dbbf651e9\") " pod="calico-system/csi-node-driver-bglbs" May 27 18:15:29.992760 kubelet[2668]: E0527 18:15:29.992713 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.992760 kubelet[2668]: W0527 18:15:29.992740 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.993225 kubelet[2668]: E0527 18:15:29.992768 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.993225 kubelet[2668]: I0527 18:15:29.992804 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/201ff555-e16a-488d-8fbf-728dbbf651e9-registration-dir\") pod \"csi-node-driver-bglbs\" (UID: \"201ff555-e16a-488d-8fbf-728dbbf651e9\") " pod="calico-system/csi-node-driver-bglbs" May 27 18:15:29.994529 kubelet[2668]: E0527 18:15:29.994415 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.994529 kubelet[2668]: W0527 18:15:29.994457 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.994529 kubelet[2668]: E0527 18:15:29.994481 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.994529 kubelet[2668]: I0527 18:15:29.994516 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/201ff555-e16a-488d-8fbf-728dbbf651e9-varrun\") pod \"csi-node-driver-bglbs\" (UID: \"201ff555-e16a-488d-8fbf-728dbbf651e9\") " pod="calico-system/csi-node-driver-bglbs" May 27 18:15:29.996943 kubelet[2668]: E0527 18:15:29.996135 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.996943 kubelet[2668]: W0527 18:15:29.996165 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.996943 kubelet[2668]: E0527 18:15:29.996188 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.997461 kubelet[2668]: E0527 18:15:29.997322 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.997461 kubelet[2668]: W0527 18:15:29.997348 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.998027 kubelet[2668]: E0527 18:15:29.997984 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:29.999668 kubelet[2668]: E0527 18:15:29.999037 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.999668 kubelet[2668]: W0527 18:15:29.999058 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.999668 kubelet[2668]: E0527 18:15:29.999078 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:29.999921 kubelet[2668]: E0527 18:15:29.999902 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:29.999965 kubelet[2668]: W0527 18:15:29.999921 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:29.999965 kubelet[2668]: E0527 18:15:29.999939 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.002297 kubelet[2668]: E0527 18:15:30.002262 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.002297 kubelet[2668]: W0527 18:15:30.002288 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.002502 kubelet[2668]: E0527 18:15:30.002326 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.004141 kubelet[2668]: E0527 18:15:30.003961 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.004141 kubelet[2668]: W0527 18:15:30.003995 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.004141 kubelet[2668]: E0527 18:15:30.004025 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.005653 kubelet[2668]: E0527 18:15:30.005614 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.005653 kubelet[2668]: W0527 18:15:30.005645 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.005863 kubelet[2668]: E0527 18:15:30.005677 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.007550 kubelet[2668]: E0527 18:15:30.007506 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.007550 kubelet[2668]: W0527 18:15:30.007536 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.007772 kubelet[2668]: E0527 18:15:30.007564 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 27 18:15:30.009122 kubelet[2668]: E0527 18:15:30.008995 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:30.009122 kubelet[2668]: W0527 18:15:30.009039 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:30.009122 kubelet[2668]: E0527 18:15:30.009068 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:30.011345 kubelet[2668]: E0527 18:15:30.011138 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:30.011345 kubelet[2668]: W0527 18:15:30.011167 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:30.011345 kubelet[2668]: E0527 18:15:30.011239 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:30.025998 systemd[1]: Started cri-containerd-e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102.scope - libcontainer container e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102.
May 27 18:15:30.096631 kubelet[2668]: E0527 18:15:30.096588 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.096631 kubelet[2668]: W0527 18:15:30.096619 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.097304 kubelet[2668]: E0527 18:15:30.096651 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.098343 kubelet[2668]: E0527 18:15:30.098308 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.098343 kubelet[2668]: W0527 18:15:30.098339 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.098984 kubelet[2668]: E0527 18:15:30.098370 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.100961 kubelet[2668]: E0527 18:15:30.099672 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.100961 kubelet[2668]: W0527 18:15:30.100624 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.100961 kubelet[2668]: E0527 18:15:30.100673 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 27 18:15:30.101631 kubelet[2668]: E0527 18:15:30.101606 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:30.101851 kubelet[2668]: W0527 18:15:30.101826 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:30.102419 kubelet[2668]: E0527 18:15:30.102018 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:30.103214 containerd[1543]: time="2025-05-27T18:15:30.103164756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9cfb99589-swjxg,Uid:307bce77-5299-4f02-bd38-0cdd2c71bcad,Namespace:calico-system,Attempt:0,} returns sandbox id \"889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e\""
May 27 18:15:30.103578 kubelet[2668]: E0527 18:15:30.103547 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:30.103968 kubelet[2668]: W0527 18:15:30.103710 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:30.103968 kubelet[2668]: E0527 18:15:30.103794 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.104402 kubelet[2668]: E0527 18:15:30.104385 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.104768 kubelet[2668]: W0527 18:15:30.104522 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.104768 kubelet[2668]: E0527 18:15:30.104557 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.105219 kubelet[2668]: E0527 18:15:30.105061 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.105219 kubelet[2668]: W0527 18:15:30.105078 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.105219 kubelet[2668]: E0527 18:15:30.105098 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.105773 kubelet[2668]: E0527 18:15:30.105620 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.105773 kubelet[2668]: W0527 18:15:30.105638 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.105773 kubelet[2668]: E0527 18:15:30.105655 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.105991 kubelet[2668]: E0527 18:15:30.105976 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.106083 kubelet[2668]: W0527 18:15:30.106068 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.106320 kubelet[2668]: E0527 18:15:30.106181 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.106592 kubelet[2668]: E0527 18:15:30.106575 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.106681 kubelet[2668]: W0527 18:15:30.106666 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.106756 kubelet[2668]: E0527 18:15:30.106744 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.108763 kubelet[2668]: E0527 18:15:30.106082 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:30.109386 kubelet[2668]: E0527 18:15:30.109361 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.109624 kubelet[2668]: W0527 18:15:30.109559 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.109835 kubelet[2668]: E0527 18:15:30.109801 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.110560 kubelet[2668]: E0527 18:15:30.110529 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.110685 kubelet[2668]: W0527 18:15:30.110665 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.110777 kubelet[2668]: E0527 18:15:30.110763 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.111257 containerd[1543]: time="2025-05-27T18:15:30.111187982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 18:15:30.116022 kubelet[2668]: E0527 18:15:30.115948 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.116022 kubelet[2668]: W0527 18:15:30.115998 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.116518 kubelet[2668]: E0527 18:15:30.116032 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.116934 kubelet[2668]: E0527 18:15:30.116892 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.116934 kubelet[2668]: W0527 18:15:30.116918 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.117132 kubelet[2668]: E0527 18:15:30.116944 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.117712 kubelet[2668]: E0527 18:15:30.117663 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.117712 kubelet[2668]: W0527 18:15:30.117688 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.117712 kubelet[2668]: E0527 18:15:30.117712 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.119488 kubelet[2668]: E0527 18:15:30.119364 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.119592 kubelet[2668]: W0527 18:15:30.119493 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.119592 kubelet[2668]: E0527 18:15:30.119522 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.120288 kubelet[2668]: E0527 18:15:30.120259 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.120288 kubelet[2668]: W0527 18:15:30.120288 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.120580 kubelet[2668]: E0527 18:15:30.120310 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.129777 kubelet[2668]: E0527 18:15:30.128673 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.129777 kubelet[2668]: W0527 18:15:30.128710 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.129777 kubelet[2668]: E0527 18:15:30.128740 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.134245 kubelet[2668]: E0527 18:15:30.130929 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.134245 kubelet[2668]: W0527 18:15:30.130960 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.134245 kubelet[2668]: E0527 18:15:30.130996 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.134245 kubelet[2668]: E0527 18:15:30.131897 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.134245 kubelet[2668]: W0527 18:15:30.131918 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.134245 kubelet[2668]: E0527 18:15:30.131943 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.134245 kubelet[2668]: E0527 18:15:30.132629 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.134245 kubelet[2668]: W0527 18:15:30.132649 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.134245 kubelet[2668]: E0527 18:15:30.132867 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.134245 kubelet[2668]: E0527 18:15:30.133568 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.134800 kubelet[2668]: W0527 18:15:30.133587 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.134800 kubelet[2668]: E0527 18:15:30.133609 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.135751 kubelet[2668]: E0527 18:15:30.135236 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.135751 kubelet[2668]: W0527 18:15:30.135351 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.135751 kubelet[2668]: E0527 18:15:30.135372 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.136419 kubelet[2668]: E0527 18:15:30.136022 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.136419 kubelet[2668]: W0527 18:15:30.136368 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.136419 kubelet[2668]: E0527 18:15:30.136397 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.137574 kubelet[2668]: E0527 18:15:30.137550 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.137708 kubelet[2668]: W0527 18:15:30.137678 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.142901 kubelet[2668]: E0527 18:15:30.141866 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:30.186272 kubelet[2668]: E0527 18:15:30.186123 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:30.186272 kubelet[2668]: W0527 18:15:30.186170 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:30.186554 kubelet[2668]: E0527 18:15:30.186299 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:30.340223 containerd[1543]: time="2025-05-27T18:15:30.340166844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-hnlwc,Uid:08ec9af9-b618-456a-89b8-e6f5946c7fdf,Namespace:calico-system,Attempt:0,} returns sandbox id \"e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102\"" May 27 18:15:31.805559 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2874804419.mount: Deactivated successfully. 
May 27 18:15:32.111592 kubelet[2668]: E0527 18:15:32.111046    2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bglbs" podUID="201ff555-e16a-488d-8fbf-728dbbf651e9"
May 27 18:15:32.836509 containerd[1543]: time="2025-05-27T18:15:32.836420919Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:32.837751 containerd[1543]: time="2025-05-27T18:15:32.837710222Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669"
May 27 18:15:32.838296 containerd[1543]: time="2025-05-27T18:15:32.838263656Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:32.841263 containerd[1543]: time="2025-05-27T18:15:32.841207999Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:32.842708 containerd[1543]: time="2025-05-27T18:15:32.842646985Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 2.731409208s"
May 27 18:15:32.842708 containerd[1543]: time="2025-05-27T18:15:32.842690434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\""
May 27 18:15:32.844816 containerd[1543]: time="2025-05-27T18:15:32.844784437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\""
May 27 18:15:32.874418 containerd[1543]: time="2025-05-27T18:15:32.874346617Z" level=info msg="CreateContainer within sandbox \"889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
May 27 18:15:32.950749 containerd[1543]: time="2025-05-27T18:15:32.950686146Z" level=info msg="Container f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406: CDI devices from CRI Config.CDIDevices: []"
May 27 18:15:32.953047 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1863385436.mount: Deactivated successfully.
May 27 18:15:32.985623 containerd[1543]: time="2025-05-27T18:15:32.985539708Z" level=info msg="CreateContainer within sandbox \"889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406\""
May 27 18:15:32.987089 containerd[1543]: time="2025-05-27T18:15:32.986429674Z" level=info msg="StartContainer for \"f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406\""
May 27 18:15:32.988771 containerd[1543]: time="2025-05-27T18:15:32.988724841Z" level=info msg="connecting to shim f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406" address="unix:///run/containerd/s/18bd7b16dc6defe7aa99bf0183b6e5a56d9a0e3bdedf1b21c3ac66140249aa01" protocol=ttrpc version=3
May 27 18:15:33.026713 systemd[1]: Started cri-containerd-f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406.scope - libcontainer container f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406.
May 27 18:15:33.143176 containerd[1543]: time="2025-05-27T18:15:33.140021612Z" level=info msg="StartContainer for \"f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406\" returns successfully"
May 27 18:15:33.292051 kubelet[2668]: E0527 18:15:33.291994    2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:33.308178 kubelet[2668]: E0527 18:15:33.308139    2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:33.308442 kubelet[2668]: W0527 18:15:33.308397    2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:33.308556 kubelet[2668]: E0527 18:15:33.308429    2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:33.361546 kubelet[2668]: E0527 18:15:33.361476    2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:33.361546 kubelet[2668]: W0527 18:15:33.361498    2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:33.361546 kubelet[2668]: E0527 18:15:33.361521    2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:33.362251 kubelet[2668]: E0527 18:15:33.362194 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.362251 kubelet[2668]: W0527 18:15:33.362212 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.362251 kubelet[2668]: E0527 18:15:33.362232 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:33.364097 kubelet[2668]: E0527 18:15:33.363548 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.364097 kubelet[2668]: W0527 18:15:33.363569 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.364097 kubelet[2668]: E0527 18:15:33.363591 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:33.364731 kubelet[2668]: E0527 18:15:33.364709 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.364982 kubelet[2668]: W0527 18:15:33.364851 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.365213 kubelet[2668]: E0527 18:15:33.365071 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:33.366862 kubelet[2668]: E0527 18:15:33.366810 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.366862 kubelet[2668]: W0527 18:15:33.366826 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.366862 kubelet[2668]: E0527 18:15:33.366843 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:33.367973 kubelet[2668]: E0527 18:15:33.367748 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.367973 kubelet[2668]: W0527 18:15:33.367766 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.367973 kubelet[2668]: E0527 18:15:33.367818 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:33.369393 kubelet[2668]: E0527 18:15:33.368302 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.369724 kubelet[2668]: W0527 18:15:33.369610 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.369724 kubelet[2668]: E0527 18:15:33.369650 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:33.370182 kubelet[2668]: E0527 18:15:33.370164 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.370329 kubelet[2668]: W0527 18:15:33.370283 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.370329 kubelet[2668]: E0527 18:15:33.370306 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:33.370765 kubelet[2668]: E0527 18:15:33.370749 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.370938 kubelet[2668]: W0527 18:15:33.370875 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.370938 kubelet[2668]: E0527 18:15:33.370896 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:33.372784 kubelet[2668]: E0527 18:15:33.372720 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:33.372784 kubelet[2668]: W0527 18:15:33.372739 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:33.372784 kubelet[2668]: E0527 18:15:33.372757 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.111477 kubelet[2668]: E0527 18:15:34.111251 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bglbs" podUID="201ff555-e16a-488d-8fbf-728dbbf651e9" May 27 18:15:34.295130 kubelet[2668]: E0527 18:15:34.294744 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:34.318338 kubelet[2668]: I0527 18:15:34.318251 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-9cfb99589-swjxg" podStartSLOduration=2.584612169 podStartE2EDuration="5.318231332s" podCreationTimestamp="2025-05-27 18:15:29 +0000 UTC" firstStartedPulling="2025-05-27 18:15:30.110394536 +0000 UTC m=+20.238223854" lastFinishedPulling="2025-05-27 18:15:32.844013714 +0000 UTC m=+22.971843017" observedRunningTime="2025-05-27 18:15:33.385610652 +0000 UTC m=+23.513439977" watchObservedRunningTime="2025-05-27 18:15:34.318231332 +0000 UTC m=+24.446060652" May 27 18:15:34.324907 
kubelet[2668]: E0527 18:15:34.324846 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.327122 kubelet[2668]: W0527 18:15:34.326521 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.327122 kubelet[2668]: E0527 18:15:34.326989 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.327869 kubelet[2668]: E0527 18:15:34.327682 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.327869 kubelet[2668]: W0527 18:15:34.327705 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.327869 kubelet[2668]: E0527 18:15:34.327728 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.328336 kubelet[2668]: E0527 18:15:34.328281 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.328336 kubelet[2668]: W0527 18:15:34.328302 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.328672 kubelet[2668]: E0527 18:15:34.328518 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.329558 kubelet[2668]: E0527 18:15:34.329032 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.329558 kubelet[2668]: W0527 18:15:34.329500 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.329558 kubelet[2668]: E0527 18:15:34.329527 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.330654 kubelet[2668]: E0527 18:15:34.330572 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.330654 kubelet[2668]: W0527 18:15:34.330587 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.331098 kubelet[2668]: E0527 18:15:34.330875 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.331944 kubelet[2668]: E0527 18:15:34.331896 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.332211 kubelet[2668]: W0527 18:15:34.332085 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.332211 kubelet[2668]: E0527 18:15:34.332105 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.333830 kubelet[2668]: E0527 18:15:34.333756 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.333830 kubelet[2668]: W0527 18:15:34.333779 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.334059 kubelet[2668]: E0527 18:15:34.333979 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.334804 kubelet[2668]: E0527 18:15:34.334616 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.334804 kubelet[2668]: W0527 18:15:34.334642 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.334804 kubelet[2668]: E0527 18:15:34.334660 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.335240 kubelet[2668]: E0527 18:15:34.335219 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.335661 kubelet[2668]: W0527 18:15:34.335418 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.335661 kubelet[2668]: E0527 18:15:34.335471 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.336454 kubelet[2668]: E0527 18:15:34.336314 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.336454 kubelet[2668]: W0527 18:15:34.336329 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.336454 kubelet[2668]: E0527 18:15:34.336344 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.337001 kubelet[2668]: E0527 18:15:34.336872 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.337001 kubelet[2668]: W0527 18:15:34.336885 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.337001 kubelet[2668]: E0527 18:15:34.336902 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.338546 kubelet[2668]: E0527 18:15:34.338524 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.338866 kubelet[2668]: W0527 18:15:34.338650 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.338866 kubelet[2668]: E0527 18:15:34.338677 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.339296 kubelet[2668]: E0527 18:15:34.339189 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.339631 kubelet[2668]: W0527 18:15:34.339608 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.339728 kubelet[2668]: E0527 18:15:34.339701 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.340887 kubelet[2668]: E0527 18:15:34.340561 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.340887 kubelet[2668]: W0527 18:15:34.340580 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.340887 kubelet[2668]: E0527 18:15:34.340600 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.341462 kubelet[2668]: E0527 18:15:34.341402 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.341747 kubelet[2668]: W0527 18:15:34.341425 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.341747 kubelet[2668]: E0527 18:15:34.341573 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.362543 kubelet[2668]: E0527 18:15:34.362256 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.362543 kubelet[2668]: W0527 18:15:34.362296 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.362543 kubelet[2668]: E0527 18:15:34.362329 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.363651 kubelet[2668]: E0527 18:15:34.363578 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.363651 kubelet[2668]: W0527 18:15:34.363601 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.364548 kubelet[2668]: E0527 18:15:34.363773 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.365019 kubelet[2668]: E0527 18:15:34.364730 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.365019 kubelet[2668]: W0527 18:15:34.364754 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.365019 kubelet[2668]: E0527 18:15:34.364777 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.365242 kubelet[2668]: E0527 18:15:34.365227 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.365345 kubelet[2668]: W0527 18:15:34.365329 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.365406 kubelet[2668]: E0527 18:15:34.365393 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.365870 kubelet[2668]: E0527 18:15:34.365853 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.366147 kubelet[2668]: W0527 18:15:34.365959 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.366147 kubelet[2668]: E0527 18:15:34.365991 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.367259 kubelet[2668]: E0527 18:15:34.367058 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.367259 kubelet[2668]: W0527 18:15:34.367074 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.367259 kubelet[2668]: E0527 18:15:34.367089 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.368089 kubelet[2668]: E0527 18:15:34.367984 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.368790 kubelet[2668]: W0527 18:15:34.368340 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.368790 kubelet[2668]: E0527 18:15:34.368368 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.369526 kubelet[2668]: E0527 18:15:34.369509 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.369625 kubelet[2668]: W0527 18:15:34.369607 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.369886 kubelet[2668]: E0527 18:15:34.369860 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.370652 kubelet[2668]: E0527 18:15:34.370564 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.370652 kubelet[2668]: W0527 18:15:34.370581 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.370652 kubelet[2668]: E0527 18:15:34.370598 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.371268 kubelet[2668]: E0527 18:15:34.371143 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.371268 kubelet[2668]: W0527 18:15:34.371157 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.371268 kubelet[2668]: E0527 18:15:34.371171 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.371502 kubelet[2668]: E0527 18:15:34.371485 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.371816 kubelet[2668]: W0527 18:15:34.371554 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.371816 kubelet[2668]: E0527 18:15:34.371570 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.372320 kubelet[2668]: E0527 18:15:34.372304 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.373068 kubelet[2668]: W0527 18:15:34.372405 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.373068 kubelet[2668]: E0527 18:15:34.372424 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 18:15:34.373068 kubelet[2668]: E0527 18:15:34.372897 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.373068 kubelet[2668]: W0527 18:15:34.372916 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.373068 kubelet[2668]: E0527 18:15:34.372935 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 18:15:34.373329 kubelet[2668]: E0527 18:15:34.373312 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 18:15:34.373409 kubelet[2668]: W0527 18:15:34.373395 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 18:15:34.373503 kubelet[2668]: E0527 18:15:34.373487 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
May 27 18:15:34.373863 kubelet[2668]: E0527 18:15:34.373845 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:34.373967 kubelet[2668]: W0527 18:15:34.373950 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:34.374039 kubelet[2668]: E0527 18:15:34.374028 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:34.374734 kubelet[2668]: E0527 18:15:34.374663 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:34.374734 kubelet[2668]: W0527 18:15:34.374688 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:34.374734 kubelet[2668]: E0527 18:15:34.374708 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:34.375267 kubelet[2668]: E0527 18:15:34.375242 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:34.375267 kubelet[2668]: W0527 18:15:34.375264 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:34.375346 kubelet[2668]: E0527 18:15:34.375282 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:34.375668 kubelet[2668]: E0527 18:15:34.375648 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 27 18:15:34.375742 kubelet[2668]: W0527 18:15:34.375668 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 27 18:15:34.375742 kubelet[2668]: E0527 18:15:34.375685 2668 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 27 18:15:34.882016 containerd[1543]: time="2025-05-27T18:15:34.881877901Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:34.883263 containerd[1543]: time="2025-05-27T18:15:34.883080954Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619"
May 27 18:15:34.883911 containerd[1543]: time="2025-05-27T18:15:34.883872283Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:34.886012 containerd[1543]: time="2025-05-27T18:15:34.885963336Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:34.886867 containerd[1543]: time="2025-05-27T18:15:34.886821929Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 2.041999248s"
May 27 18:15:34.887031 containerd[1543]: time="2025-05-27T18:15:34.887008858Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\""
May 27 18:15:34.892935 containerd[1543]: time="2025-05-27T18:15:34.892890137Z" level=info msg="CreateContainer within sandbox \"e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
May 27 18:15:34.905694 containerd[1543]: time="2025-05-27T18:15:34.905633771Z" level=info msg="Container 2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18: CDI devices from CRI Config.CDIDevices: []"
May 27 18:15:34.914476 containerd[1543]: time="2025-05-27T18:15:34.914362082Z" level=info msg="CreateContainer within sandbox \"e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18\""
May 27 18:15:34.916042 containerd[1543]: time="2025-05-27T18:15:34.915248816Z" level=info msg="StartContainer for \"2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18\""
May 27 18:15:34.918542 containerd[1543]: time="2025-05-27T18:15:34.918416483Z" level=info msg="connecting to shim 2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18" address="unix:///run/containerd/s/578e2f108ff4de2555abab7f48ba40144e4f7d4aa92f5317fb1ffef9bfd47beb" protocol=ttrpc version=3
May 27 18:15:34.951695 systemd[1]: Started cri-containerd-2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18.scope - libcontainer container 2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18.
May 27 18:15:35.002718 containerd[1543]: time="2025-05-27T18:15:35.002679336Z" level=info msg="StartContainer for \"2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18\" returns successfully"
May 27 18:15:35.017949 systemd[1]: cri-containerd-2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18.scope: Deactivated successfully.
May 27 18:15:35.056988 containerd[1543]: time="2025-05-27T18:15:35.056862271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18\" id:\"2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18\" pid:3406 exited_at:{seconds:1748369735 nanos:21318450}"
May 27 18:15:35.057448 containerd[1543]: time="2025-05-27T18:15:35.057076768Z" level=info msg="received exit event container_id:\"2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18\" id:\"2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18\" pid:3406 exited_at:{seconds:1748369735 nanos:21318450}"
May 27 18:15:35.089235 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18-rootfs.mount: Deactivated successfully.
May 27 18:15:35.301537 kubelet[2668]: E0527 18:15:35.301484 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:35.305443 containerd[1543]: time="2025-05-27T18:15:35.305283678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\""
May 27 18:15:36.116011 kubelet[2668]: E0527 18:15:36.115928 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bglbs" podUID="201ff555-e16a-488d-8fbf-728dbbf651e9"
May 27 18:15:38.111134 kubelet[2668]: E0527 18:15:38.111086 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bglbs" podUID="201ff555-e16a-488d-8fbf-728dbbf651e9"
May 27 18:15:39.206970 containerd[1543]: time="2025-05-27T18:15:39.206900374Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:39.208586 containerd[1543]: time="2025-05-27T18:15:39.208541505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568"
May 27 18:15:39.210454 containerd[1543]: time="2025-05-27T18:15:39.210326693Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:39.212461 containerd[1543]: time="2025-05-27T18:15:39.211731675Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:15:39.212461 containerd[1543]: time="2025-05-27T18:15:39.212357886Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 3.90678174s"
May 27 18:15:39.212461 containerd[1543]: time="2025-05-27T18:15:39.212385425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\""
May 27 18:15:39.216784 containerd[1543]: time="2025-05-27T18:15:39.216733638Z" level=info msg="CreateContainer within sandbox \"e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
May 27 18:15:39.239793 containerd[1543]: time="2025-05-27T18:15:39.239749396Z" level=info msg="Container 11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3: CDI devices from CRI Config.CDIDevices: []"
May 27 18:15:39.245980 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1827651965.mount: Deactivated successfully.
May 27 18:15:39.255574 containerd[1543]: time="2025-05-27T18:15:39.255508798Z" level=info msg="CreateContainer within sandbox \"e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3\""
May 27 18:15:39.256427 containerd[1543]: time="2025-05-27T18:15:39.256326570Z" level=info msg="StartContainer for \"11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3\""
May 27 18:15:39.259366 containerd[1543]: time="2025-05-27T18:15:39.259310600Z" level=info msg="connecting to shim 11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3" address="unix:///run/containerd/s/578e2f108ff4de2555abab7f48ba40144e4f7d4aa92f5317fb1ffef9bfd47beb" protocol=ttrpc version=3
May 27 18:15:39.297727 systemd[1]: Started cri-containerd-11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3.scope - libcontainer container 11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3.
May 27 18:15:39.358801 containerd[1543]: time="2025-05-27T18:15:39.358706462Z" level=info msg="StartContainer for \"11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3\" returns successfully"
May 27 18:15:40.004592 systemd[1]: cri-containerd-11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3.scope: Deactivated successfully.
May 27 18:15:40.005521 systemd[1]: cri-containerd-11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3.scope: Consumed 674ms CPU time, 165.7M memory peak, 15.3M read from disk, 170.9M written to disk.
May 27 18:15:40.010339 containerd[1543]: time="2025-05-27T18:15:40.010212778Z" level=info msg="received exit event container_id:\"11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3\" id:\"11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3\" pid:3467 exited_at:{seconds:1748369740 nanos:7901884}"
May 27 18:15:40.014726 containerd[1543]: time="2025-05-27T18:15:40.014644292Z" level=info msg="TaskExit event in podsandbox handler container_id:\"11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3\" id:\"11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3\" pid:3467 exited_at:{seconds:1748369740 nanos:7901884}"
May 27 18:15:40.065377 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3-rootfs.mount: Deactivated successfully.
May 27 18:15:40.102583 kubelet[2668]: I0527 18:15:40.102256 2668 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
May 27 18:15:40.126316 systemd[1]: Created slice kubepods-besteffort-pod201ff555_e16a_488d_8fbf_728dbbf651e9.slice - libcontainer container kubepods-besteffort-pod201ff555_e16a_488d_8fbf_728dbbf651e9.slice.
May 27 18:15:40.134640 containerd[1543]: time="2025-05-27T18:15:40.134532914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bglbs,Uid:201ff555-e16a-488d-8fbf-728dbbf651e9,Namespace:calico-system,Attempt:0,}"
May 27 18:15:40.296258 kubelet[2668]: I0527 18:15:40.296094 2668 status_manager.go:895] "Failed to get status for pod" podUID="63bb5b15-0701-4fe1-85dc-ecf91b71d70e" pod="calico-system/whisker-d898d594d-j6fkv" err="pods \"whisker-d898d594d-j6fkv\" is forbidden: User \"system:node:ci-4344.0.0-6-bb492ec913\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4344.0.0-6-bb492ec913' and this object"
May 27 18:15:40.300162 kubelet[2668]: E0527 18:15:40.296915 2668 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"whisker-ca-bundle\" is forbidden: User \"system:node:ci-4344.0.0-6-bb492ec913\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4344.0.0-6-bb492ec913' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"whisker-ca-bundle\"" type="*v1.ConfigMap"
May 27 18:15:40.300591 kubelet[2668]: E0527 18:15:40.300413 2668 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"whisker-backend-key-pair\" is forbidden: User \"system:node:ci-4344.0.0-6-bb492ec913\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4344.0.0-6-bb492ec913' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"whisker-backend-key-pair\"" type="*v1.Secret"
May 27 18:15:40.317600 systemd[1]: Created slice kubepods-besteffort-pod63bb5b15_0701_4fe1_85dc_ecf91b71d70e.slice - libcontainer container kubepods-besteffort-pod63bb5b15_0701_4fe1_85dc_ecf91b71d70e.slice.
May 27 18:15:40.326020 kubelet[2668]: I0527 18:15:40.323932 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7-config-volume\") pod \"coredns-674b8bbfcf-wj82f\" (UID: \"ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7\") " pod="kube-system/coredns-674b8bbfcf-wj82f"
May 27 18:15:40.326020 kubelet[2668]: I0527 18:15:40.324006 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-ca-bundle\") pod \"whisker-d898d594d-j6fkv\" (UID: \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\") " pod="calico-system/whisker-d898d594d-j6fkv"
May 27 18:15:40.326020 kubelet[2668]: I0527 18:15:40.324035 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4tbh\" (UniqueName: \"kubernetes.io/projected/01eaa575-f0aa-4af1-a87a-d7de0da8f15b-kube-api-access-v4tbh\") pod \"coredns-674b8bbfcf-wl8mq\" (UID: \"01eaa575-f0aa-4af1-a87a-d7de0da8f15b\") " pod="kube-system/coredns-674b8bbfcf-wl8mq"
May 27 18:15:40.326020 kubelet[2668]: I0527 18:15:40.324062 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-backend-key-pair\") pod \"whisker-d898d594d-j6fkv\" (UID: \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\") " pod="calico-system/whisker-d898d594d-j6fkv"
May 27 18:15:40.326020 kubelet[2668]: I0527 18:15:40.324101 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2v84\" (UniqueName: \"kubernetes.io/projected/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-kube-api-access-j2v84\") pod \"whisker-d898d594d-j6fkv\" (UID: \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\") " pod="calico-system/whisker-d898d594d-j6fkv"
May 27 18:15:40.326363 kubelet[2668]: I0527 18:15:40.324126 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8dxk\" (UniqueName: \"kubernetes.io/projected/ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7-kube-api-access-v8dxk\") pod \"coredns-674b8bbfcf-wj82f\" (UID: \"ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7\") " pod="kube-system/coredns-674b8bbfcf-wj82f"
May 27 18:15:40.326363 kubelet[2668]: I0527 18:15:40.324170 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01eaa575-f0aa-4af1-a87a-d7de0da8f15b-config-volume\") pod \"coredns-674b8bbfcf-wl8mq\" (UID: \"01eaa575-f0aa-4af1-a87a-d7de0da8f15b\") " pod="kube-system/coredns-674b8bbfcf-wl8mq"
May 27 18:15:40.341478 systemd[1]: Created slice kubepods-burstable-pod01eaa575_f0aa_4af1_a87a_d7de0da8f15b.slice - libcontainer container kubepods-burstable-pod01eaa575_f0aa_4af1_a87a_d7de0da8f15b.slice.
May 27 18:15:40.373170 systemd[1]: Created slice kubepods-burstable-podce66028f_5ce9_4b5e_94a0_7c40a79bb9e7.slice - libcontainer container kubepods-burstable-podce66028f_5ce9_4b5e_94a0_7c40a79bb9e7.slice.
May 27 18:15:40.401220 systemd[1]: Created slice kubepods-besteffort-pod0cb2dcfc_ccf3_42e3_add2_e6cfa71bb149.slice - libcontainer container kubepods-besteffort-pod0cb2dcfc_ccf3_42e3_add2_e6cfa71bb149.slice.
May 27 18:15:40.419802 containerd[1543]: time="2025-05-27T18:15:40.419109424Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\""
May 27 18:15:40.421411 systemd[1]: Created slice kubepods-besteffort-pod74b5366e_b0c8_4081_9c1e_92ebd54aa8ff.slice - libcontainer container kubepods-besteffort-pod74b5366e_b0c8_4081_9c1e_92ebd54aa8ff.slice.
May 27 18:15:40.431912 kubelet[2668]: I0527 18:15:40.428895 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/74b5366e-b0c8-4081-9c1e-92ebd54aa8ff-calico-apiserver-certs\") pod \"calico-apiserver-59958685d4-slstk\" (UID: \"74b5366e-b0c8-4081-9c1e-92ebd54aa8ff\") " pod="calico-apiserver/calico-apiserver-59958685d4-slstk"
May 27 18:15:40.431912 kubelet[2668]: I0527 18:15:40.428996 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/312bbe06-8b56-48c8-bd2e-4f07049cb4ed-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-hqs5q\" (UID: \"312bbe06-8b56-48c8-bd2e-4f07049cb4ed\") " pod="calico-system/goldmane-78d55f7ddc-hqs5q"
May 27 18:15:40.431912 kubelet[2668]: I0527 18:15:40.429057 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/312bbe06-8b56-48c8-bd2e-4f07049cb4ed-config\") pod \"goldmane-78d55f7ddc-hqs5q\" (UID: \"312bbe06-8b56-48c8-bd2e-4f07049cb4ed\") " pod="calico-system/goldmane-78d55f7ddc-hqs5q"
May 27 18:15:40.431912 kubelet[2668]: I0527 18:15:40.429085 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctn5\" (UniqueName: \"kubernetes.io/projected/312bbe06-8b56-48c8-bd2e-4f07049cb4ed-kube-api-access-dctn5\") pod \"goldmane-78d55f7ddc-hqs5q\" (UID: \"312bbe06-8b56-48c8-bd2e-4f07049cb4ed\") " pod="calico-system/goldmane-78d55f7ddc-hqs5q"
May 27 18:15:40.431912 kubelet[2668]: I0527 18:15:40.429127 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dj4x\" (UniqueName: \"kubernetes.io/projected/74b5366e-b0c8-4081-9c1e-92ebd54aa8ff-kube-api-access-7dj4x\") pod \"calico-apiserver-59958685d4-slstk\" (UID: \"74b5366e-b0c8-4081-9c1e-92ebd54aa8ff\") " pod="calico-apiserver/calico-apiserver-59958685d4-slstk"
May 27 18:15:40.432261 kubelet[2668]: I0527 18:15:40.429157 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6kdx\" (UniqueName: \"kubernetes.io/projected/0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149-kube-api-access-g6kdx\") pod \"calico-kube-controllers-78bf7cd7cb-xmffc\" (UID: \"0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149\") " pod="calico-system/calico-kube-controllers-78bf7cd7cb-xmffc"
May 27 18:15:40.432261 kubelet[2668]: I0527 18:15:40.429190 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxdl9\" (UniqueName: \"kubernetes.io/projected/03e37baf-9d1d-4e3b-b155-232ddbbdbb3b-kube-api-access-qxdl9\") pod \"calico-apiserver-59958685d4-qt6zw\" (UID: \"03e37baf-9d1d-4e3b-b155-232ddbbdbb3b\") " pod="calico-apiserver/calico-apiserver-59958685d4-qt6zw"
May 27 18:15:40.432261 kubelet[2668]: I0527 18:15:40.429222 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/312bbe06-8b56-48c8-bd2e-4f07049cb4ed-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-hqs5q\" (UID: \"312bbe06-8b56-48c8-bd2e-4f07049cb4ed\") " pod="calico-system/goldmane-78d55f7ddc-hqs5q"
May 27 18:15:40.432261 kubelet[2668]: I0527 18:15:40.429247 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/03e37baf-9d1d-4e3b-b155-232ddbbdbb3b-calico-apiserver-certs\") pod \"calico-apiserver-59958685d4-qt6zw\" (UID: \"03e37baf-9d1d-4e3b-b155-232ddbbdbb3b\") " pod="calico-apiserver/calico-apiserver-59958685d4-qt6zw"
May 27 18:15:40.432261 kubelet[2668]: I0527 18:15:40.429300 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149-tigera-ca-bundle\") pod \"calico-kube-controllers-78bf7cd7cb-xmffc\" (UID: \"0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149\") " pod="calico-system/calico-kube-controllers-78bf7cd7cb-xmffc"
May 27 18:15:40.446733 systemd[1]: Created slice kubepods-besteffort-pod312bbe06_8b56_48c8_bd2e_4f07049cb4ed.slice - libcontainer container kubepods-besteffort-pod312bbe06_8b56_48c8_bd2e_4f07049cb4ed.slice.
May 27 18:15:40.472859 systemd[1]: Created slice kubepods-besteffort-pod03e37baf_9d1d_4e3b_b155_232ddbbdbb3b.slice - libcontainer container kubepods-besteffort-pod03e37baf_9d1d_4e3b_b155_232ddbbdbb3b.slice.
May 27 18:15:40.647249 containerd[1543]: time="2025-05-27T18:15:40.647000206Z" level=error msg="Failed to destroy network for sandbox \"142c6f542ebf65ac5ae71a07f73182056556a47841ff8565d1117a5b64acd308\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.659423 kubelet[2668]: E0527 18:15:40.657484 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:40.669719 containerd[1543]: time="2025-05-27T18:15:40.648830396Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bglbs,Uid:201ff555-e16a-488d-8fbf-728dbbf651e9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"142c6f542ebf65ac5ae71a07f73182056556a47841ff8565d1117a5b64acd308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.670986 containerd[1543]: time="2025-05-27T18:15:40.660208708Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wl8mq,Uid:01eaa575-f0aa-4af1-a87a-d7de0da8f15b,Namespace:kube-system,Attempt:0,}"
May 27 18:15:40.671773 kubelet[2668]: E0527 18:15:40.671615 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"142c6f542ebf65ac5ae71a07f73182056556a47841ff8565d1117a5b64acd308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.671773 kubelet[2668]: E0527 18:15:40.671724 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"142c6f542ebf65ac5ae71a07f73182056556a47841ff8565d1117a5b64acd308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bglbs"
May 27 18:15:40.672411 kubelet[2668]: E0527 18:15:40.672068 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"142c6f542ebf65ac5ae71a07f73182056556a47841ff8565d1117a5b64acd308\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bglbs"
May 27 18:15:40.672642 kubelet[2668]: E0527 18:15:40.672584 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bglbs_calico-system(201ff555-e16a-488d-8fbf-728dbbf651e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bglbs_calico-system(201ff555-e16a-488d-8fbf-728dbbf651e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"142c6f542ebf65ac5ae71a07f73182056556a47841ff8565d1117a5b64acd308\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bglbs" podUID="201ff555-e16a-488d-8fbf-728dbbf651e9"
May 27 18:15:40.682509 kubelet[2668]: E0527 18:15:40.682297 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:40.685774 containerd[1543]: time="2025-05-27T18:15:40.685717341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wj82f,Uid:ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7,Namespace:kube-system,Attempt:0,}"
May 27 18:15:40.717055 containerd[1543]: time="2025-05-27T18:15:40.717004951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78bf7cd7cb-xmffc,Uid:0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149,Namespace:calico-system,Attempt:0,}"
May 27 18:15:40.731658 containerd[1543]: time="2025-05-27T18:15:40.731417382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59958685d4-slstk,Uid:74b5366e-b0c8-4081-9c1e-92ebd54aa8ff,Namespace:calico-apiserver,Attempt:0,}"
May 27 18:15:40.766105 containerd[1543]: time="2025-05-27T18:15:40.765983470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hqs5q,Uid:312bbe06-8b56-48c8-bd2e-4f07049cb4ed,Namespace:calico-system,Attempt:0,}"
May 27 18:15:40.814846 containerd[1543]: time="2025-05-27T18:15:40.813956612Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59958685d4-qt6zw,Uid:03e37baf-9d1d-4e3b-b155-232ddbbdbb3b,Namespace:calico-apiserver,Attempt:0,}"
May 27 18:15:40.969618 containerd[1543]: time="2025-05-27T18:15:40.968856740Z" level=error msg="Failed to destroy network for sandbox \"76d16f5b8bef971be366d24e53227ce82b8521f7b2cb08efc5d2c711180ddb2d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.972784 containerd[1543]: time="2025-05-27T18:15:40.972698376Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wl8mq,Uid:01eaa575-f0aa-4af1-a87a-d7de0da8f15b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"76d16f5b8bef971be366d24e53227ce82b8521f7b2cb08efc5d2c711180ddb2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.974073 kubelet[2668]: E0527 18:15:40.973357 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76d16f5b8bef971be366d24e53227ce82b8521f7b2cb08efc5d2c711180ddb2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.974073 kubelet[2668]: E0527 18:15:40.973461 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76d16f5b8bef971be366d24e53227ce82b8521f7b2cb08efc5d2c711180ddb2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-wl8mq"
May 27 18:15:40.974073 kubelet[2668]: E0527 18:15:40.973497 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"76d16f5b8bef971be366d24e53227ce82b8521f7b2cb08efc5d2c711180ddb2d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-wl8mq"
May 27 18:15:40.974445 kubelet[2668]: E0527 18:15:40.974386 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-wl8mq_kube-system(01eaa575-f0aa-4af1-a87a-d7de0da8f15b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-wl8mq_kube-system(01eaa575-f0aa-4af1-a87a-d7de0da8f15b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"76d16f5b8bef971be366d24e53227ce82b8521f7b2cb08efc5d2c711180ddb2d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-wl8mq" podUID="01eaa575-f0aa-4af1-a87a-d7de0da8f15b"
May 27 18:15:40.997192 containerd[1543]: time="2025-05-27T18:15:40.997115441Z" level=error msg="Failed to destroy network for sandbox \"971f355adefd9a9e6082456b0436eb03ff5b3e7bec0da193de895a457d7c3aaa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.998677 containerd[1543]: time="2025-05-27T18:15:40.998567182Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59958685d4-slstk,Uid:74b5366e-b0c8-4081-9c1e-92ebd54aa8ff,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"971f355adefd9a9e6082456b0436eb03ff5b3e7bec0da193de895a457d7c3aaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.999204 kubelet[2668]: E0527 18:15:40.999147 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"971f355adefd9a9e6082456b0436eb03ff5b3e7bec0da193de895a457d7c3aaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:40.999341 kubelet[2668]: E0527 18:15:40.999238 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"971f355adefd9a9e6082456b0436eb03ff5b3e7bec0da193de895a457d7c3aaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59958685d4-slstk"
May 27 18:15:40.999341 kubelet[2668]: E0527 18:15:40.999268 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"971f355adefd9a9e6082456b0436eb03ff5b3e7bec0da193de895a457d7c3aaa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59958685d4-slstk"
May 27 18:15:40.999590 kubelet[2668]: E0527 18:15:40.999358 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59958685d4-slstk_calico-apiserver(74b5366e-b0c8-4081-9c1e-92ebd54aa8ff)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59958685d4-slstk_calico-apiserver(74b5366e-b0c8-4081-9c1e-92ebd54aa8ff)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"971f355adefd9a9e6082456b0436eb03ff5b3e7bec0da193de895a457d7c3aaa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59958685d4-slstk" podUID="74b5366e-b0c8-4081-9c1e-92ebd54aa8ff"
May 27 18:15:41.016665 containerd[1543]: time="2025-05-27T18:15:41.016601943Z" level=error msg="Failed to destroy network for sandbox \"7e7caf44ad61165f77b5b7e48732db189a9c6a0b247f04ea07c1b29758df1ec2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:41.019885 containerd[1543]: time="2025-05-27T18:15:41.019815284Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wj82f,Uid:ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7caf44ad61165f77b5b7e48732db189a9c6a0b247f04ea07c1b29758df1ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:41.021076 kubelet[2668]: E0527 18:15:41.021025 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7caf44ad61165f77b5b7e48732db189a9c6a0b247f04ea07c1b29758df1ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:41.021603 kubelet[2668]: E0527 18:15:41.021507 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7caf44ad61165f77b5b7e48732db189a9c6a0b247f04ea07c1b29758df1ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-wj82f"
May 27 18:15:41.021603 kubelet[2668]: E0527 18:15:41.021564 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e7caf44ad61165f77b5b7e48732db189a9c6a0b247f04ea07c1b29758df1ec2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-wj82f"
May 27 18:15:41.022484 kubelet[2668]: E0527 18:15:41.021946 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-wj82f_kube-system(ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-wj82f_kube-system(ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7e7caf44ad61165f77b5b7e48732db189a9c6a0b247f04ea07c1b29758df1ec2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-wj82f" podUID="ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7"
May 27 18:15:41.058217 containerd[1543]: time="2025-05-27T18:15:41.057999751Z" level=error msg="Failed to destroy network for sandbox \"f1a3c54d49ea3bd47c2fcb851a26bb58a2d984834d4e9193972020aa273ef6e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
May 27 18:15:41.060243
containerd[1543]: time="2025-05-27T18:15:41.059964691Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78bf7cd7cb-xmffc,Uid:0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a3c54d49ea3bd47c2fcb851a26bb58a2d984834d4e9193972020aa273ef6e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.061107 kubelet[2668]: E0527 18:15:41.060587 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a3c54d49ea3bd47c2fcb851a26bb58a2d984834d4e9193972020aa273ef6e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.061107 kubelet[2668]: E0527 18:15:41.060662 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a3c54d49ea3bd47c2fcb851a26bb58a2d984834d4e9193972020aa273ef6e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-78bf7cd7cb-xmffc" May 27 18:15:41.061107 kubelet[2668]: E0527 18:15:41.060695 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f1a3c54d49ea3bd47c2fcb851a26bb58a2d984834d4e9193972020aa273ef6e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-78bf7cd7cb-xmffc" May 27 18:15:41.061517 kubelet[2668]: E0527 18:15:41.060760 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-78bf7cd7cb-xmffc_calico-system(0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-78bf7cd7cb-xmffc_calico-system(0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f1a3c54d49ea3bd47c2fcb851a26bb58a2d984834d4e9193972020aa273ef6e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-78bf7cd7cb-xmffc" podUID="0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149" May 27 18:15:41.071605 containerd[1543]: time="2025-05-27T18:15:41.071540425Z" level=error msg="Failed to destroy network for sandbox \"f8c5db06eef8f799ec06fb15bfb355bd610cbd7cc63b5fb1a673cf0e2ae7ecf5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.073918 containerd[1543]: time="2025-05-27T18:15:41.073059109Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hqs5q,Uid:312bbe06-8b56-48c8-bd2e-4f07049cb4ed,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c5db06eef8f799ec06fb15bfb355bd610cbd7cc63b5fb1a673cf0e2ae7ecf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.075056 kubelet[2668]: E0527 18:15:41.074991 2668 log.go:32] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c5db06eef8f799ec06fb15bfb355bd610cbd7cc63b5fb1a673cf0e2ae7ecf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.075429 kubelet[2668]: E0527 18:15:41.075084 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c5db06eef8f799ec06fb15bfb355bd610cbd7cc63b5fb1a673cf0e2ae7ecf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-hqs5q" May 27 18:15:41.075429 kubelet[2668]: E0527 18:15:41.075114 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f8c5db06eef8f799ec06fb15bfb355bd610cbd7cc63b5fb1a673cf0e2ae7ecf5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-hqs5q" May 27 18:15:41.075429 kubelet[2668]: E0527 18:15:41.075213 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f8c5db06eef8f799ec06fb15bfb355bd610cbd7cc63b5fb1a673cf0e2ae7ecf5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:15:41.079192 containerd[1543]: time="2025-05-27T18:15:41.079093266Z" level=error msg="Failed to destroy network for sandbox \"5710adab22acfdad295635fcd50bfdb40f984bf70779e0c285383ba97018124b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.080771 containerd[1543]: time="2025-05-27T18:15:41.080560844Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59958685d4-qt6zw,Uid:03e37baf-9d1d-4e3b-b155-232ddbbdbb3b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5710adab22acfdad295635fcd50bfdb40f984bf70779e0c285383ba97018124b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.080962 kubelet[2668]: E0527 18:15:41.080903 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5710adab22acfdad295635fcd50bfdb40f984bf70779e0c285383ba97018124b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:41.081033 kubelet[2668]: E0527 18:15:41.081001 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5710adab22acfdad295635fcd50bfdb40f984bf70779e0c285383ba97018124b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-59958685d4-qt6zw" May 27 18:15:41.081088 kubelet[2668]: E0527 18:15:41.081030 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5710adab22acfdad295635fcd50bfdb40f984bf70779e0c285383ba97018124b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59958685d4-qt6zw" May 27 18:15:41.081791 kubelet[2668]: E0527 18:15:41.081427 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59958685d4-qt6zw_calico-apiserver(03e37baf-9d1d-4e3b-b155-232ddbbdbb3b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59958685d4-qt6zw_calico-apiserver(03e37baf-9d1d-4e3b-b155-232ddbbdbb3b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5710adab22acfdad295635fcd50bfdb40f984bf70779e0c285383ba97018124b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59958685d4-qt6zw" podUID="03e37baf-9d1d-4e3b-b155-232ddbbdbb3b" May 27 18:15:41.434556 kubelet[2668]: E0527 18:15:41.434097 2668 configmap.go:193] Couldn't get configMap calico-system/whisker-ca-bundle: failed to sync configmap cache: timed out waiting for the condition May 27 18:15:41.434556 kubelet[2668]: E0527 18:15:41.434230 2668 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-ca-bundle podName:63bb5b15-0701-4fe1-85dc-ecf91b71d70e nodeName:}" failed. No retries permitted until 2025-05-27 18:15:41.934204867 +0000 UTC m=+32.062034183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "whisker-ca-bundle" (UniqueName: "kubernetes.io/configmap/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-ca-bundle") pod "whisker-d898d594d-j6fkv" (UID: "63bb5b15-0701-4fe1-85dc-ecf91b71d70e") : failed to sync configmap cache: timed out waiting for the condition May 27 18:15:41.436712 kubelet[2668]: E0527 18:15:41.434095 2668 secret.go:189] Couldn't get secret calico-system/whisker-backend-key-pair: failed to sync secret cache: timed out waiting for the condition May 27 18:15:41.436712 kubelet[2668]: E0527 18:15:41.435262 2668 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-backend-key-pair podName:63bb5b15-0701-4fe1-85dc-ecf91b71d70e nodeName:}" failed. No retries permitted until 2025-05-27 18:15:41.935148654 +0000 UTC m=+32.062977972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whisker-backend-key-pair" (UniqueName: "kubernetes.io/secret/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-backend-key-pair") pod "whisker-d898d594d-j6fkv" (UID: "63bb5b15-0701-4fe1-85dc-ecf91b71d70e") : failed to sync secret cache: timed out waiting for the condition May 27 18:15:41.496911 systemd[1]: run-netns-cni\x2dd05abc11\x2dd2df\x2d595c\x2d1937\x2d10a76086baa5.mount: Deactivated successfully. 
May 27 18:15:42.136890 containerd[1543]: time="2025-05-27T18:15:42.136814505Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d898d594d-j6fkv,Uid:63bb5b15-0701-4fe1-85dc-ecf91b71d70e,Namespace:calico-system,Attempt:0,}" May 27 18:15:42.212629 containerd[1543]: time="2025-05-27T18:15:42.212555888Z" level=error msg="Failed to destroy network for sandbox \"7d3626a651ef5186834c95a80e63f99c8cc5755240e4ad365d78039dfba43a88\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:42.216330 containerd[1543]: time="2025-05-27T18:15:42.216192431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-d898d594d-j6fkv,Uid:63bb5b15-0701-4fe1-85dc-ecf91b71d70e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d3626a651ef5186834c95a80e63f99c8cc5755240e4ad365d78039dfba43a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:42.216626 kubelet[2668]: E0527 18:15:42.216482 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d3626a651ef5186834c95a80e63f99c8cc5755240e4ad365d78039dfba43a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 18:15:42.216626 kubelet[2668]: E0527 18:15:42.216547 2668 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d3626a651ef5186834c95a80e63f99c8cc5755240e4ad365d78039dfba43a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d898d594d-j6fkv" May 27 18:15:42.216626 kubelet[2668]: E0527 18:15:42.216581 2668 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d3626a651ef5186834c95a80e63f99c8cc5755240e4ad365d78039dfba43a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-d898d594d-j6fkv" May 27 18:15:42.216769 kubelet[2668]: E0527 18:15:42.216680 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-d898d594d-j6fkv_calico-system(63bb5b15-0701-4fe1-85dc-ecf91b71d70e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-d898d594d-j6fkv_calico-system(63bb5b15-0701-4fe1-85dc-ecf91b71d70e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d3626a651ef5186834c95a80e63f99c8cc5755240e4ad365d78039dfba43a88\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-d898d594d-j6fkv" podUID="63bb5b15-0701-4fe1-85dc-ecf91b71d70e" May 27 18:15:42.219617 systemd[1]: run-netns-cni\x2d5c9fa751\x2de284\x2db153\x2d36d3\x2d454a1fcd4998.mount: Deactivated successfully. May 27 18:15:48.002398 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1618150698.mount: Deactivated successfully. 
May 27 18:15:48.024304 containerd[1543]: time="2025-05-27T18:15:48.024226641Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:48.025551 containerd[1543]: time="2025-05-27T18:15:48.025070761Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 27 18:15:48.027462 containerd[1543]: time="2025-05-27T18:15:48.027138906Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:48.029476 containerd[1543]: time="2025-05-27T18:15:48.029387009Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:48.030533 containerd[1543]: time="2025-05-27T18:15:48.030493332Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 7.61132978s" May 27 18:15:48.030671 containerd[1543]: time="2025-05-27T18:15:48.030654751Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 27 18:15:48.061235 containerd[1543]: time="2025-05-27T18:15:48.061013812Z" level=info msg="CreateContainer within sandbox \"e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 18:15:48.073547 containerd[1543]: time="2025-05-27T18:15:48.073494775Z" level=info msg="Container 
cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:48.079033 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount938084334.mount: Deactivated successfully. May 27 18:15:48.147829 containerd[1543]: time="2025-05-27T18:15:48.147777895Z" level=info msg="CreateContainer within sandbox \"e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\"" May 27 18:15:48.149716 containerd[1543]: time="2025-05-27T18:15:48.149626427Z" level=info msg="StartContainer for \"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\"" May 27 18:15:48.155136 containerd[1543]: time="2025-05-27T18:15:48.155080056Z" level=info msg="connecting to shim cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06" address="unix:///run/containerd/s/578e2f108ff4de2555abab7f48ba40144e4f7d4aa92f5317fb1ffef9bfd47beb" protocol=ttrpc version=3 May 27 18:15:48.321622 systemd[1]: Started cri-containerd-cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06.scope - libcontainer container cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06. May 27 18:15:48.461711 containerd[1543]: time="2025-05-27T18:15:48.461662256Z" level=info msg="StartContainer for \"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" returns successfully" May 27 18:15:48.767905 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 18:15:48.768116 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 27 18:15:49.099526 kubelet[2668]: I0527 18:15:49.098249 2668 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-backend-key-pair\") pod \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\" (UID: \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\") " May 27 18:15:49.102318 kubelet[2668]: I0527 18:15:49.100592 2668 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-ca-bundle\") pod \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\" (UID: \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\") " May 27 18:15:49.102318 kubelet[2668]: I0527 18:15:49.100660 2668 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2v84\" (UniqueName: \"kubernetes.io/projected/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-kube-api-access-j2v84\") pod \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\" (UID: \"63bb5b15-0701-4fe1-85dc-ecf91b71d70e\") " May 27 18:15:49.107758 kubelet[2668]: I0527 18:15:49.107600 2668 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "63bb5b15-0701-4fe1-85dc-ecf91b71d70e" (UID: "63bb5b15-0701-4fe1-85dc-ecf91b71d70e"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 18:15:49.109974 kubelet[2668]: I0527 18:15:49.108578 2668 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "63bb5b15-0701-4fe1-85dc-ecf91b71d70e" (UID: "63bb5b15-0701-4fe1-85dc-ecf91b71d70e"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 18:15:49.110744 systemd[1]: var-lib-kubelet-pods-63bb5b15\x2d0701\x2d4fe1\x2d85dc\x2decf91b71d70e-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 18:15:49.121109 systemd[1]: var-lib-kubelet-pods-63bb5b15\x2d0701\x2d4fe1\x2d85dc\x2decf91b71d70e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dj2v84.mount: Deactivated successfully. May 27 18:15:49.122765 kubelet[2668]: I0527 18:15:49.122700 2668 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-kube-api-access-j2v84" (OuterVolumeSpecName: "kube-api-access-j2v84") pod "63bb5b15-0701-4fe1-85dc-ecf91b71d70e" (UID: "63bb5b15-0701-4fe1-85dc-ecf91b71d70e"). InnerVolumeSpecName "kube-api-access-j2v84". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 18:15:49.201611 kubelet[2668]: I0527 18:15:49.201469 2668 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-ca-bundle\") on node \"ci-4344.0.0-6-bb492ec913\" DevicePath \"\"" May 27 18:15:49.201611 kubelet[2668]: I0527 18:15:49.201529 2668 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2v84\" (UniqueName: \"kubernetes.io/projected/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-kube-api-access-j2v84\") on node \"ci-4344.0.0-6-bb492ec913\" DevicePath \"\"" May 27 18:15:49.201611 kubelet[2668]: I0527 18:15:49.201549 2668 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/63bb5b15-0701-4fe1-85dc-ecf91b71d70e-whisker-backend-key-pair\") on node \"ci-4344.0.0-6-bb492ec913\" DevicePath \"\"" May 27 18:15:49.444974 systemd[1]: Removed slice kubepods-besteffort-pod63bb5b15_0701_4fe1_85dc_ecf91b71d70e.slice - libcontainer container 
kubepods-besteffort-pod63bb5b15_0701_4fe1_85dc_ecf91b71d70e.slice. May 27 18:15:49.462460 kubelet[2668]: I0527 18:15:49.461591 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-hnlwc" podStartSLOduration=2.778676822 podStartE2EDuration="20.461570038s" podCreationTimestamp="2025-05-27 18:15:29 +0000 UTC" firstStartedPulling="2025-05-27 18:15:30.348819897 +0000 UTC m=+20.476649217" lastFinishedPulling="2025-05-27 18:15:48.031713101 +0000 UTC m=+38.159542433" observedRunningTime="2025-05-27 18:15:49.459812623 +0000 UTC m=+39.587641939" watchObservedRunningTime="2025-05-27 18:15:49.461570038 +0000 UTC m=+39.589399362" May 27 18:15:49.547126 systemd[1]: Created slice kubepods-besteffort-pod461386c4_2f9e_4c80_a2ab_7b0260259077.slice - libcontainer container kubepods-besteffort-pod461386c4_2f9e_4c80_a2ab_7b0260259077.slice. May 27 18:15:49.607589 kubelet[2668]: I0527 18:15:49.607450 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/461386c4-2f9e-4c80-a2ab-7b0260259077-whisker-backend-key-pair\") pod \"whisker-7989d9b698-55wz2\" (UID: \"461386c4-2f9e-4c80-a2ab-7b0260259077\") " pod="calico-system/whisker-7989d9b698-55wz2" May 27 18:15:49.607589 kubelet[2668]: I0527 18:15:49.607507 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/461386c4-2f9e-4c80-a2ab-7b0260259077-whisker-ca-bundle\") pod \"whisker-7989d9b698-55wz2\" (UID: \"461386c4-2f9e-4c80-a2ab-7b0260259077\") " pod="calico-system/whisker-7989d9b698-55wz2" May 27 18:15:49.607589 kubelet[2668]: I0527 18:15:49.607524 2668 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4mn6\" (UniqueName: \"kubernetes.io/projected/461386c4-2f9e-4c80-a2ab-7b0260259077-kube-api-access-l4mn6\") pod 
\"whisker-7989d9b698-55wz2\" (UID: \"461386c4-2f9e-4c80-a2ab-7b0260259077\") " pod="calico-system/whisker-7989d9b698-55wz2" May 27 18:15:49.856148 containerd[1543]: time="2025-05-27T18:15:49.856082844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7989d9b698-55wz2,Uid:461386c4-2f9e-4c80-a2ab-7b0260259077,Namespace:calico-system,Attempt:0,}" May 27 18:15:50.123315 kubelet[2668]: I0527 18:15:50.122809 2668 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63bb5b15-0701-4fe1-85dc-ecf91b71d70e" path="/var/lib/kubelet/pods/63bb5b15-0701-4fe1-85dc-ecf91b71d70e/volumes" May 27 18:15:50.248575 systemd-networkd[1436]: cali1f2daeb2fde: Link UP May 27 18:15:50.248754 systemd-networkd[1436]: cali1f2daeb2fde: Gained carrier May 27 18:15:50.268193 containerd[1543]: 2025-05-27 18:15:49.915 [INFO][3792] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 18:15:50.268193 containerd[1543]: 2025-05-27 18:15:49.951 [INFO][3792] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0 whisker-7989d9b698- calico-system 461386c4-2f9e-4c80-a2ab-7b0260259077 917 0 2025-05-27 18:15:49 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7989d9b698 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 whisker-7989d9b698-55wz2 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1f2daeb2fde [] [] }} ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-" May 27 18:15:50.268193 containerd[1543]: 2025-05-27 18:15:49.951 [INFO][3792] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" May 27 18:15:50.268193 containerd[1543]: 2025-05-27 18:15:50.150 [INFO][3800] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" HandleID="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Workload="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.152 [INFO][3800] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" HandleID="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Workload="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000384290), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-6-bb492ec913", "pod":"whisker-7989d9b698-55wz2", "timestamp":"2025-05-27 18:15:50.150249612 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.152 [INFO][3800] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.152 [INFO][3800] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.152 [INFO][3800] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.179 [INFO][3800] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.192 [INFO][3800] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.202 [INFO][3800] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.205 [INFO][3800] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.268906 containerd[1543]: 2025-05-27 18:15:50.210 [INFO][3800] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.271574 containerd[1543]: 2025-05-27 18:15:50.210 [INFO][3800] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.271574 containerd[1543]: 2025-05-27 18:15:50.214 [INFO][3800] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc May 27 18:15:50.271574 containerd[1543]: 2025-05-27 18:15:50.222 [INFO][3800] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.271574 containerd[1543]: 2025-05-27 18:15:50.231 [INFO][3800] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.193/26] block=192.168.10.192/26 handle="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.271574 containerd[1543]: 2025-05-27 18:15:50.231 [INFO][3800] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.193/26] handle="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:50.271574 containerd[1543]: 2025-05-27 18:15:50.231 [INFO][3800] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:50.271574 containerd[1543]: 2025-05-27 18:15:50.231 [INFO][3800] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.193/26] IPv6=[] ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" HandleID="k8s-pod-network.a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Workload="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" May 27 18:15:50.271856 containerd[1543]: 2025-05-27 18:15:50.234 [INFO][3792] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0", GenerateName:"whisker-7989d9b698-", Namespace:"calico-system", SelfLink:"", UID:"461386c4-2f9e-4c80-a2ab-7b0260259077", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7989d9b698", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"whisker-7989d9b698-55wz2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.10.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1f2daeb2fde", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:50.271856 containerd[1543]: 2025-05-27 18:15:50.234 [INFO][3792] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.193/32] ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" May 27 18:15:50.271998 containerd[1543]: 2025-05-27 18:15:50.234 [INFO][3792] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1f2daeb2fde ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" May 27 18:15:50.271998 containerd[1543]: 2025-05-27 18:15:50.249 [INFO][3792] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" May 27 18:15:50.272070 containerd[1543]: 2025-05-27 18:15:50.250 [INFO][3792] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0", GenerateName:"whisker-7989d9b698-", Namespace:"calico-system", SelfLink:"", UID:"461386c4-2f9e-4c80-a2ab-7b0260259077", ResourceVersion:"917", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7989d9b698", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc", Pod:"whisker-7989d9b698-55wz2", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.10.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1f2daeb2fde", MAC:"3e:61:bc:3e:a9:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:50.272151 containerd[1543]: 2025-05-27 18:15:50.261 [INFO][3792] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" Namespace="calico-system" Pod="whisker-7989d9b698-55wz2" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-whisker--7989d9b698--55wz2-eth0" May 27 18:15:50.342562 containerd[1543]: time="2025-05-27T18:15:50.342478295Z" level=info msg="connecting to shim a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc" address="unix:///run/containerd/s/4ecf800c9e9f5d0d76f0dfed4753bc79063354111005a9201b42e73f4797be0c" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:50.417601 systemd[1]: Started cri-containerd-a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc.scope - libcontainer container a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc. May 27 18:15:50.450979 kubelet[2668]: I0527 18:15:50.450828 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:15:50.588540 containerd[1543]: time="2025-05-27T18:15:50.588459183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7989d9b698-55wz2,Uid:461386c4-2f9e-4c80-a2ab-7b0260259077,Namespace:calico-system,Attempt:0,} returns sandbox id \"a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc\"" May 27 18:15:50.597314 containerd[1543]: time="2025-05-27T18:15:50.597076116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:15:50.832468 containerd[1543]: time="2025-05-27T18:15:50.828414094Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:15:50.832468 containerd[1543]: time="2025-05-27T18:15:50.830451829Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve 
reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:15:50.832468 containerd[1543]: time="2025-05-27T18:15:50.830601747Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:15:50.832845 kubelet[2668]: E0527 18:15:50.832680 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:15:50.832845 kubelet[2668]: E0527 18:15:50.832765 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:15:50.840859 kubelet[2668]: E0527 18:15:50.837347 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b58c9c6d236b46b38836eca7a8a2bb57,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:15:50.847470 containerd[1543]: 
time="2025-05-27T18:15:50.842650437Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:15:51.111809 containerd[1543]: time="2025-05-27T18:15:51.111654814Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:15:51.113886 containerd[1543]: time="2025-05-27T18:15:51.113834425Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:15:51.114119 containerd[1543]: time="2025-05-27T18:15:51.114079225Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:15:51.114797 kubelet[2668]: E0527 18:15:51.114739 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:15:51.115024 kubelet[2668]: E0527 18:15:51.114998 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:15:51.115375 kubelet[2668]: E0527 18:15:51.115307 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:15:51.117496 kubelet[2668]: E0527 18:15:51.116759 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:15:51.222774 containerd[1543]: 
time="2025-05-27T18:15:51.222736151Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"3e8ef7d4002397e187878348e90deb213db71e5501cb4511cb468e64e56a0a02\" pid:3963 exit_status:1 exited_at:{seconds:1748369751 nanos:222341610}" May 27 18:15:51.342191 containerd[1543]: time="2025-05-27T18:15:51.342137470Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"0d10a32e8ae6d5379fa14c0f4090ec04142f87793899831245b935c293112d15\" pid:4018 exit_status:1 exited_at:{seconds:1748369751 nanos:341701535}" May 27 18:15:51.453639 kubelet[2668]: E0527 18:15:51.453491 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:15:51.592490 containerd[1543]: 
time="2025-05-27T18:15:51.591938695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"aaa6fd4e4b96f2339837e1baca89298169d72c81136f535523a440a10f37f6ac\" pid:4044 exit_status:1 exited_at:{seconds:1748369751 nanos:591412571}" May 27 18:15:51.625143 systemd-networkd[1436]: vxlan.calico: Link UP May 27 18:15:51.625778 systemd-networkd[1436]: vxlan.calico: Gained carrier May 27 18:15:52.113068 containerd[1543]: time="2025-05-27T18:15:52.112707600Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59958685d4-qt6zw,Uid:03e37baf-9d1d-4e3b-b155-232ddbbdbb3b,Namespace:calico-apiserver,Attempt:0,}" May 27 18:15:52.271123 systemd-networkd[1436]: cali739ac5e79da: Link UP May 27 18:15:52.271893 systemd-networkd[1436]: cali739ac5e79da: Gained carrier May 27 18:15:52.296644 containerd[1543]: 2025-05-27 18:15:52.167 [INFO][4126] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0 calico-apiserver-59958685d4- calico-apiserver 03e37baf-9d1d-4e3b-b155-232ddbbdbb3b 837 0 2025-05-27 18:15:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59958685d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 calico-apiserver-59958685d4-qt6zw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali739ac5e79da [] [] }} ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-" May 27 18:15:52.296644 containerd[1543]: 2025-05-27 18:15:52.167 [INFO][4126] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" May 27 18:15:52.296644 containerd[1543]: 2025-05-27 18:15:52.206 [INFO][4137] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" HandleID="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.206 [INFO][4137] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" HandleID="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9700), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-6-bb492ec913", "pod":"calico-apiserver-59958685d4-qt6zw", "timestamp":"2025-05-27 18:15:52.206718848 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.207 [INFO][4137] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.207 [INFO][4137] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.207 [INFO][4137] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.219 [INFO][4137] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.228 [INFO][4137] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.236 [INFO][4137] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.240 [INFO][4137] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.296935 containerd[1543]: 2025-05-27 18:15:52.243 [INFO][4137] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.297229 containerd[1543]: 2025-05-27 18:15:52.243 [INFO][4137] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.297229 containerd[1543]: 2025-05-27 18:15:52.246 [INFO][4137] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c May 27 18:15:52.297229 containerd[1543]: 2025-05-27 18:15:52.254 [INFO][4137] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.297229 containerd[1543]: 2025-05-27 18:15:52.264 [INFO][4137] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.194/26] block=192.168.10.192/26 handle="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.297229 containerd[1543]: 2025-05-27 18:15:52.264 [INFO][4137] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.194/26] handle="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:52.297229 containerd[1543]: 2025-05-27 18:15:52.264 [INFO][4137] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:52.297229 containerd[1543]: 2025-05-27 18:15:52.264 [INFO][4137] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.194/26] IPv6=[] ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" HandleID="k8s-pod-network.43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" May 27 18:15:52.298097 containerd[1543]: 2025-05-27 18:15:52.268 [INFO][4126] cni-plugin/k8s.go 418: Populated endpoint ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0", GenerateName:"calico-apiserver-59958685d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"03e37baf-9d1d-4e3b-b155-232ddbbdbb3b", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"59958685d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"calico-apiserver-59958685d4-qt6zw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali739ac5e79da", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:52.298523 containerd[1543]: 2025-05-27 18:15:52.268 [INFO][4126] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.194/32] ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" May 27 18:15:52.298523 containerd[1543]: 2025-05-27 18:15:52.268 [INFO][4126] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali739ac5e79da ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" May 27 18:15:52.298523 containerd[1543]: 2025-05-27 18:15:52.272 [INFO][4126] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" 
WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" May 27 18:15:52.299639 containerd[1543]: 2025-05-27 18:15:52.274 [INFO][4126] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0", GenerateName:"calico-apiserver-59958685d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"03e37baf-9d1d-4e3b-b155-232ddbbdbb3b", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59958685d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c", Pod:"calico-apiserver-59958685d4-qt6zw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali739ac5e79da", MAC:"8a:8b:67:5d:e8:0c", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:52.299742 containerd[1543]: 2025-05-27 18:15:52.293 [INFO][4126] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-qt6zw" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--qt6zw-eth0" May 27 18:15:52.316596 systemd-networkd[1436]: cali1f2daeb2fde: Gained IPv6LL May 27 18:15:52.348996 containerd[1543]: time="2025-05-27T18:15:52.348932758Z" level=info msg="connecting to shim 43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c" address="unix:///run/containerd/s/52ce5e5e5de53297f67cb7a705a01717bcce4d4213259895422b3e7cf9b6566d" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:52.427832 systemd[1]: Started cri-containerd-43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c.scope - libcontainer container 43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c. 
May 27 18:15:52.451359 kubelet[2668]: E0527 18:15:52.451298 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:15:52.528605 containerd[1543]: time="2025-05-27T18:15:52.528553979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59958685d4-qt6zw,Uid:03e37baf-9d1d-4e3b-b155-232ddbbdbb3b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c\"" May 27 18:15:52.531885 containerd[1543]: time="2025-05-27T18:15:52.531827269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 18:15:52.763669 systemd-networkd[1436]: vxlan.calico: Gained IPv6LL May 27 18:15:53.112184 containerd[1543]: time="2025-05-27T18:15:53.111922677Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-59958685d4-slstk,Uid:74b5366e-b0c8-4081-9c1e-92ebd54aa8ff,Namespace:calico-apiserver,Attempt:0,}" May 27 18:15:53.112414 containerd[1543]: time="2025-05-27T18:15:53.112381642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bglbs,Uid:201ff555-e16a-488d-8fbf-728dbbf651e9,Namespace:calico-system,Attempt:0,}" May 27 18:15:53.315599 systemd-networkd[1436]: cali544136dd0f3: Link UP May 27 18:15:53.319225 systemd-networkd[1436]: cali544136dd0f3: Gained carrier May 27 18:15:53.339534 containerd[1543]: 2025-05-27 18:15:53.198 [INFO][4196] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0 calico-apiserver-59958685d4- calico-apiserver 74b5366e-b0c8-4081-9c1e-92ebd54aa8ff 845 0 2025-05-27 18:15:26 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59958685d4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 calico-apiserver-59958685d4-slstk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali544136dd0f3 [] [] }} ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-" May 27 18:15:53.339534 containerd[1543]: 2025-05-27 18:15:53.199 [INFO][4196] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" May 27 18:15:53.339534 containerd[1543]: 
2025-05-27 18:15:53.238 [INFO][4224] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" HandleID="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.239 [INFO][4224] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" HandleID="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000233060), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-6-bb492ec913", "pod":"calico-apiserver-59958685d4-slstk", "timestamp":"2025-05-27 18:15:53.238922531 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.239 [INFO][4224] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.239 [INFO][4224] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.239 [INFO][4224] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.252 [INFO][4224] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.265 [INFO][4224] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.272 [INFO][4224] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.276 [INFO][4224] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.340380 containerd[1543]: 2025-05-27 18:15:53.280 [INFO][4224] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.342713 containerd[1543]: 2025-05-27 18:15:53.281 [INFO][4224] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.342713 containerd[1543]: 2025-05-27 18:15:53.283 [INFO][4224] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522 May 27 18:15:53.342713 containerd[1543]: 2025-05-27 18:15:53.290 [INFO][4224] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.342713 containerd[1543]: 2025-05-27 18:15:53.300 [INFO][4224] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.195/26] block=192.168.10.192/26 handle="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.342713 containerd[1543]: 2025-05-27 18:15:53.301 [INFO][4224] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.195/26] handle="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.342713 containerd[1543]: 2025-05-27 18:15:53.302 [INFO][4224] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:53.342713 containerd[1543]: 2025-05-27 18:15:53.303 [INFO][4224] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.195/26] IPv6=[] ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" HandleID="k8s-pod-network.8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" May 27 18:15:53.343724 containerd[1543]: 2025-05-27 18:15:53.307 [INFO][4196] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0", GenerateName:"calico-apiserver-59958685d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"74b5366e-b0c8-4081-9c1e-92ebd54aa8ff", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"59958685d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"calico-apiserver-59958685d4-slstk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali544136dd0f3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:53.343868 containerd[1543]: 2025-05-27 18:15:53.308 [INFO][4196] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.195/32] ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" May 27 18:15:53.343868 containerd[1543]: 2025-05-27 18:15:53.308 [INFO][4196] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali544136dd0f3 ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" May 27 18:15:53.343868 containerd[1543]: 2025-05-27 18:15:53.320 [INFO][4196] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" 
WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" May 27 18:15:53.344002 containerd[1543]: 2025-05-27 18:15:53.320 [INFO][4196] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0", GenerateName:"calico-apiserver-59958685d4-", Namespace:"calico-apiserver", SelfLink:"", UID:"74b5366e-b0c8-4081-9c1e-92ebd54aa8ff", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59958685d4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522", Pod:"calico-apiserver-59958685d4-slstk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.10.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali544136dd0f3", MAC:"1e:9f:6e:a0:16:e6", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:53.344102 containerd[1543]: 2025-05-27 18:15:53.336 [INFO][4196] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" Namespace="calico-apiserver" Pod="calico-apiserver-59958685d4-slstk" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--apiserver--59958685d4--slstk-eth0" May 27 18:15:53.403052 containerd[1543]: time="2025-05-27T18:15:53.402844037Z" level=info msg="connecting to shim 8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522" address="unix:///run/containerd/s/bfa00951cb93c0ae54b26169bbf47bb9ea8635d7122d4f716f51f5867c93a435" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:53.458949 systemd[1]: Started cri-containerd-8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522.scope - libcontainer container 8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522. 
May 27 18:15:53.495797 systemd-networkd[1436]: calid890f6faf08: Link UP May 27 18:15:53.496882 systemd-networkd[1436]: calid890f6faf08: Gained carrier May 27 18:15:53.528925 containerd[1543]: 2025-05-27 18:15:53.195 [INFO][4199] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0 csi-node-driver- calico-system 201ff555-e16a-488d-8fbf-728dbbf651e9 718 0 2025-05-27 18:15:29 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 csi-node-driver-bglbs eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid890f6faf08 [] [] }} ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-" May 27 18:15:53.528925 containerd[1543]: 2025-05-27 18:15:53.196 [INFO][4199] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" May 27 18:15:53.528925 containerd[1543]: 2025-05-27 18:15:53.260 [INFO][4222] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" HandleID="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Workload="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.261 [INFO][4222] ipam/ipam_plugin.go 
265: Auto assigning IP ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" HandleID="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Workload="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000333040), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-6-bb492ec913", "pod":"csi-node-driver-bglbs", "timestamp":"2025-05-27 18:15:53.260827928 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.261 [INFO][4222] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.302 [INFO][4222] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.302 [INFO][4222] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.352 [INFO][4222] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.380 [INFO][4222] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.398 [INFO][4222] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.404 [INFO][4222] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529268 containerd[1543]: 2025-05-27 18:15:53.414 [INFO][4222] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529687 containerd[1543]: 2025-05-27 18:15:53.414 [INFO][4222] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529687 containerd[1543]: 2025-05-27 18:15:53.447 [INFO][4222] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d May 27 18:15:53.529687 containerd[1543]: 2025-05-27 18:15:53.470 [INFO][4222] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529687 containerd[1543]: 2025-05-27 18:15:53.485 [INFO][4222] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.196/26] block=192.168.10.192/26 handle="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529687 containerd[1543]: 2025-05-27 18:15:53.485 [INFO][4222] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.196/26] handle="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:53.529687 containerd[1543]: 2025-05-27 18:15:53.485 [INFO][4222] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:53.529687 containerd[1543]: 2025-05-27 18:15:53.485 [INFO][4222] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.196/26] IPv6=[] ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" HandleID="k8s-pod-network.9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Workload="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" May 27 18:15:53.532062 containerd[1543]: 2025-05-27 18:15:53.488 [INFO][4199] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"201ff555-e16a-488d-8fbf-728dbbf651e9", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"csi-node-driver-bglbs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.10.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid890f6faf08", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:53.532195 containerd[1543]: 2025-05-27 18:15:53.489 [INFO][4199] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.196/32] ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" May 27 18:15:53.532195 containerd[1543]: 2025-05-27 18:15:53.489 [INFO][4199] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid890f6faf08 ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" May 27 18:15:53.532195 containerd[1543]: 2025-05-27 18:15:53.497 [INFO][4199] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" May 27 18:15:53.532333 
containerd[1543]: 2025-05-27 18:15:53.497 [INFO][4199] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"201ff555-e16a-488d-8fbf-728dbbf651e9", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d", Pod:"csi-node-driver-bglbs", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.10.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid890f6faf08", MAC:"a6:23:1e:b6:f1:73", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:53.532454 containerd[1543]: 
2025-05-27 18:15:53.523 [INFO][4199] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" Namespace="calico-system" Pod="csi-node-driver-bglbs" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-csi--node--driver--bglbs-eth0" May 27 18:15:53.564082 containerd[1543]: time="2025-05-27T18:15:53.563985503Z" level=info msg="connecting to shim 9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d" address="unix:///run/containerd/s/68c68446feb7302700ce0dd500e31b73c0a791c03fd07e81dc61176925c3f6b9" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:53.615819 systemd[1]: Started cri-containerd-9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d.scope - libcontainer container 9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d. May 27 18:15:53.637478 containerd[1543]: time="2025-05-27T18:15:53.637227399Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59958685d4-slstk,Uid:74b5366e-b0c8-4081-9c1e-92ebd54aa8ff,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522\"" May 27 18:15:53.682168 containerd[1543]: time="2025-05-27T18:15:53.681959595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bglbs,Uid:201ff555-e16a-488d-8fbf-728dbbf651e9,Namespace:calico-system,Attempt:0,} returns sandbox id \"9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d\"" May 27 18:15:53.788272 systemd-networkd[1436]: cali739ac5e79da: Gained IPv6LL May 27 18:15:54.817272 systemd-networkd[1436]: cali544136dd0f3: Gained IPv6LL May 27 18:15:55.112632 kubelet[2668]: E0527 18:15:55.111817 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:55.112632 kubelet[2668]: E0527 18:15:55.111897 2668 
dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:55.114962 containerd[1543]: time="2025-05-27T18:15:55.113079203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wj82f,Uid:ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7,Namespace:kube-system,Attempt:0,}" May 27 18:15:55.114962 containerd[1543]: time="2025-05-27T18:15:55.114597793Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wl8mq,Uid:01eaa575-f0aa-4af1-a87a-d7de0da8f15b,Namespace:kube-system,Attempt:0,}" May 27 18:15:55.388362 systemd-networkd[1436]: calid890f6faf08: Gained IPv6LL May 27 18:15:55.444514 systemd-networkd[1436]: cali9fa3211248e: Link UP May 27 18:15:55.444901 systemd-networkd[1436]: cali9fa3211248e: Gained carrier May 27 18:15:55.555272 containerd[1543]: 2025-05-27 18:15:55.225 [INFO][4350] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0 coredns-674b8bbfcf- kube-system ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7 835 0 2025-05-27 18:15:15 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 coredns-674b8bbfcf-wj82f eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9fa3211248e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-" May 27 18:15:55.555272 containerd[1543]: 2025-05-27 18:15:55.225 [INFO][4350] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" May 27 18:15:55.555272 containerd[1543]: 2025-05-27 18:15:55.316 [INFO][4375] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" HandleID="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Workload="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.316 [INFO][4375] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" HandleID="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Workload="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f770), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-6-bb492ec913", "pod":"coredns-674b8bbfcf-wj82f", "timestamp":"2025-05-27 18:15:55.316698685 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.316 [INFO][4375] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.316 [INFO][4375] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.317 [INFO][4375] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.338 [INFO][4375] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.361 [INFO][4375] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.372 [INFO][4375] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.377 [INFO][4375] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.556280 containerd[1543]: 2025-05-27 18:15:55.386 [INFO][4375] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.558262 containerd[1543]: 2025-05-27 18:15:55.386 [INFO][4375] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.558262 containerd[1543]: 2025-05-27 18:15:55.398 [INFO][4375] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27 May 27 18:15:55.558262 containerd[1543]: 2025-05-27 18:15:55.409 [INFO][4375] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.558262 containerd[1543]: 2025-05-27 18:15:55.422 [INFO][4375] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.197/26] block=192.168.10.192/26 handle="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.558262 containerd[1543]: 2025-05-27 18:15:55.422 [INFO][4375] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.197/26] handle="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.558262 containerd[1543]: 2025-05-27 18:15:55.423 [INFO][4375] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:55.558262 containerd[1543]: 2025-05-27 18:15:55.423 [INFO][4375] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.197/26] IPv6=[] ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" HandleID="k8s-pod-network.43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Workload="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" May 27 18:15:55.558603 containerd[1543]: 2025-05-27 18:15:55.426 [INFO][4350] cni-plugin/k8s.go 418: Populated endpoint ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"coredns-674b8bbfcf-wj82f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9fa3211248e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:55.558603 containerd[1543]: 2025-05-27 18:15:55.430 [INFO][4350] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.197/32] ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" May 27 18:15:55.558603 containerd[1543]: 2025-05-27 18:15:55.430 [INFO][4350] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9fa3211248e ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" May 27 18:15:55.558603 containerd[1543]: 2025-05-27 18:15:55.443 [INFO][4350] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" May 27 18:15:55.558603 containerd[1543]: 2025-05-27 18:15:55.449 [INFO][4350] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27", Pod:"coredns-674b8bbfcf-wj82f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9fa3211248e", MAC:"d6:c4:28:35:81:3d", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:55.558603 containerd[1543]: 2025-05-27 18:15:55.486 [INFO][4350] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" Namespace="kube-system" Pod="coredns-674b8bbfcf-wj82f" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wj82f-eth0" May 27 18:15:55.614831 systemd-networkd[1436]: calicfe7f3bf969: Link UP May 27 18:15:55.619777 systemd-networkd[1436]: calicfe7f3bf969: Gained carrier May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.264 [INFO][4349] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0 coredns-674b8bbfcf- kube-system 01eaa575-f0aa-4af1-a87a-d7de0da8f15b 836 0 2025-05-27 18:15:15 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 coredns-674b8bbfcf-wl8mq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicfe7f3bf969 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-" May 27 
18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.265 [INFO][4349] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.350 [INFO][4381] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" HandleID="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Workload="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.351 [INFO][4381] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" HandleID="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Workload="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9900), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-6-bb492ec913", "pod":"coredns-674b8bbfcf-wl8mq", "timestamp":"2025-05-27 18:15:55.350564653 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.351 [INFO][4381] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.424 [INFO][4381] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.425 [INFO][4381] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.448 [INFO][4381] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.501 [INFO][4381] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.522 [INFO][4381] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.533 [INFO][4381] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.543 [INFO][4381] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.544 [INFO][4381] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.549 [INFO][4381] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.569 [INFO][4381] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.589 [INFO][4381] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.198/26] block=192.168.10.192/26 handle="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.589 [INFO][4381] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.198/26] handle="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.589 [INFO][4381] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:55.699568 containerd[1543]: 2025-05-27 18:15:55.589 [INFO][4381] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.198/26] IPv6=[] ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" HandleID="k8s-pod-network.f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Workload="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" May 27 18:15:55.702225 containerd[1543]: 2025-05-27 18:15:55.599 [INFO][4349] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"01eaa575-f0aa-4af1-a87a-d7de0da8f15b", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"coredns-674b8bbfcf-wl8mq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfe7f3bf969", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:55.702225 containerd[1543]: 2025-05-27 18:15:55.600 [INFO][4349] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.198/32] ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" May 27 18:15:55.702225 containerd[1543]: 2025-05-27 18:15:55.600 [INFO][4349] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicfe7f3bf969 ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" May 27 18:15:55.702225 containerd[1543]: 2025-05-27 18:15:55.639 [INFO][4349] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" May 27 18:15:55.702225 containerd[1543]: 2025-05-27 18:15:55.646 [INFO][4349] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"01eaa575-f0aa-4af1-a87a-d7de0da8f15b", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a", Pod:"coredns-674b8bbfcf-wl8mq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.10.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfe7f3bf969", MAC:"52:5d:f7:0e:72:8c", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:55.702225 containerd[1543]: 2025-05-27 18:15:55.681 [INFO][4349] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" Namespace="kube-system" Pod="coredns-674b8bbfcf-wl8mq" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-coredns--674b8bbfcf--wl8mq-eth0" May 27 18:15:55.799590 containerd[1543]: time="2025-05-27T18:15:55.798633162Z" level=info msg="connecting to shim 43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27" address="unix:///run/containerd/s/53e7404de52e8f105ff13c4b6b2c90f0708817c44f137fb77fa930f29ab0793c" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:55.800921 containerd[1543]: time="2025-05-27T18:15:55.800858417Z" level=info msg="connecting to shim f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a" address="unix:///run/containerd/s/3cc2c1a8867aa701b53d572ed05837c4a8d9eb71672e2f3312ccf4ed918273f5" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:55.878778 systemd[1]: Started cri-containerd-f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a.scope - libcontainer container f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a. May 27 18:15:55.903758 systemd[1]: Started cri-containerd-43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27.scope - libcontainer container 43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27. 
May 27 18:15:56.049071 containerd[1543]: time="2025-05-27T18:15:56.049000242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wl8mq,Uid:01eaa575-f0aa-4af1-a87a-d7de0da8f15b,Namespace:kube-system,Attempt:0,} returns sandbox id \"f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a\"" May 27 18:15:56.052870 kubelet[2668]: E0527 18:15:56.052419 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:56.092733 containerd[1543]: time="2025-05-27T18:15:56.092651792Z" level=info msg="CreateContainer within sandbox \"f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 18:15:56.171182 containerd[1543]: time="2025-05-27T18:15:56.171051165Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-wj82f,Uid:ce66028f-5ce9-4b5e-94a0-7c40a79bb9e7,Namespace:kube-system,Attempt:0,} returns sandbox id \"43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27\"" May 27 18:15:56.173845 containerd[1543]: time="2025-05-27T18:15:56.173683568Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hqs5q,Uid:312bbe06-8b56-48c8-bd2e-4f07049cb4ed,Namespace:calico-system,Attempt:0,}" May 27 18:15:56.198760 containerd[1543]: time="2025-05-27T18:15:56.198706834Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78bf7cd7cb-xmffc,Uid:0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149,Namespace:calico-system,Attempt:0,}" May 27 18:15:56.204077 kubelet[2668]: E0527 18:15:56.200233 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:56.214824 containerd[1543]: time="2025-05-27T18:15:56.212930328Z" level=info 
msg="CreateContainer within sandbox \"43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 18:15:56.341648 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2426538038.mount: Deactivated successfully. May 27 18:15:56.374427 containerd[1543]: time="2025-05-27T18:15:56.374357628Z" level=info msg="Container 9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:56.374900 containerd[1543]: time="2025-05-27T18:15:56.374697054Z" level=info msg="Container 8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:56.506471 containerd[1543]: time="2025-05-27T18:15:56.506041811Z" level=info msg="CreateContainer within sandbox \"f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea\"" May 27 18:15:56.514631 containerd[1543]: time="2025-05-27T18:15:56.514581292Z" level=info msg="StartContainer for \"9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea\"" May 27 18:15:56.520935 containerd[1543]: time="2025-05-27T18:15:56.520169155Z" level=info msg="CreateContainer within sandbox \"43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0\"" May 27 18:15:56.526782 containerd[1543]: time="2025-05-27T18:15:56.526320751Z" level=info msg="StartContainer for \"8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0\"" May 27 18:15:56.535689 containerd[1543]: time="2025-05-27T18:15:56.534594529Z" level=info msg="connecting to shim 8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0" 
address="unix:///run/containerd/s/53e7404de52e8f105ff13c4b6b2c90f0708817c44f137fb77fa930f29ab0793c" protocol=ttrpc version=3 May 27 18:15:56.543108 containerd[1543]: time="2025-05-27T18:15:56.543050064Z" level=info msg="connecting to shim 9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea" address="unix:///run/containerd/s/3cc2c1a8867aa701b53d572ed05837c4a8d9eb71672e2f3312ccf4ed918273f5" protocol=ttrpc version=3 May 27 18:15:56.551293 systemd-networkd[1436]: calib08907b2482: Link UP May 27 18:15:56.553251 systemd-networkd[1436]: calib08907b2482: Gained carrier May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.291 [INFO][4503] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0 goldmane-78d55f7ddc- calico-system 312bbe06-8b56-48c8-bd2e-4f07049cb4ed 842 0 2025-05-27 18:15:29 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 goldmane-78d55f7ddc-hqs5q eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calib08907b2482 [] [] }} ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.292 [INFO][4503] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.411 [INFO][4526] ipam/ipam_plugin.go 225: Calico CNI IPAM 
request count IPv4=1 IPv6=0 ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" HandleID="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Workload="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.412 [INFO][4526] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" HandleID="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Workload="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-6-bb492ec913", "pod":"goldmane-78d55f7ddc-hqs5q", "timestamp":"2025-05-27 18:15:56.411572779 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.412 [INFO][4526] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.412 [INFO][4526] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.412 [INFO][4526] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.438 [INFO][4526] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.459 [INFO][4526] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.476 [INFO][4526] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.480 [INFO][4526] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.485 [INFO][4526] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.485 [INFO][4526] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.488 [INFO][4526] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.498 [INFO][4526] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 handle="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.527 [INFO][4526] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.10.199/26] block=192.168.10.192/26 handle="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.527 [INFO][4526] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.199/26] handle="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.527 [INFO][4526] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:56.609718 containerd[1543]: 2025-05-27 18:15:56.527 [INFO][4526] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.199/26] IPv6=[] ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" HandleID="k8s-pod-network.1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Workload="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" May 27 18:15:56.612112 containerd[1543]: 2025-05-27 18:15:56.543 [INFO][4503] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"312bbe06-8b56-48c8-bd2e-4f07049cb4ed", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"goldmane-78d55f7ddc-hqs5q", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.10.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib08907b2482", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:56.612112 containerd[1543]: 2025-05-27 18:15:56.544 [INFO][4503] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.199/32] ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" May 27 18:15:56.612112 containerd[1543]: 2025-05-27 18:15:56.545 [INFO][4503] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib08907b2482 ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" May 27 18:15:56.612112 containerd[1543]: 2025-05-27 18:15:56.551 [INFO][4503] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" May 27 18:15:56.612112 containerd[1543]: 2025-05-27 18:15:56.553 [INFO][4503] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"312bbe06-8b56-48c8-bd2e-4f07049cb4ed", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c", Pod:"goldmane-78d55f7ddc-hqs5q", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.10.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calib08907b2482", MAC:"2a:c3:8a:d1:9f:c7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:56.612112 containerd[1543]: 2025-05-27 18:15:56.593 [INFO][4503] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" Namespace="calico-system" Pod="goldmane-78d55f7ddc-hqs5q" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-goldmane--78d55f7ddc--hqs5q-eth0" May 27 18:15:56.636702 systemd[1]: Started cri-containerd-9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea.scope - libcontainer container 9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea. May 27 18:15:56.671779 systemd[1]: Started cri-containerd-8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0.scope - libcontainer container 8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0. May 27 18:15:56.739509 containerd[1543]: time="2025-05-27T18:15:56.738387897Z" level=info msg="connecting to shim 1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c" address="unix:///run/containerd/s/01f8a89e40e32b4bfb5aef17169e7cbea700ecebbdaaac5aea2c466522cf8c7a" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:56.770971 systemd-networkd[1436]: cali4e7b5a0eebc: Link UP May 27 18:15:56.773156 systemd-networkd[1436]: cali4e7b5a0eebc: Gained carrier May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.440 [INFO][4530] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0 calico-kube-controllers-78bf7cd7cb- calico-system 0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149 839 0 2025-05-27 18:15:30 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:78bf7cd7cb projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344.0.0-6-bb492ec913 calico-kube-controllers-78bf7cd7cb-xmffc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4e7b5a0eebc [] [] }} 
ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.440 [INFO][4530] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.587 [INFO][4539] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" HandleID="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.588 [INFO][4539] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" HandleID="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000371a30), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-6-bb492ec913", "pod":"calico-kube-controllers-78bf7cd7cb-xmffc", "timestamp":"2025-05-27 18:15:56.587350274 +0000 UTC"}, Hostname:"ci-4344.0.0-6-bb492ec913", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 18:15:56.811600 containerd[1543]: 2025-05-27 
18:15:56.589 [INFO][4539] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.590 [INFO][4539] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.590 [INFO][4539] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-6-bb492ec913' May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.613 [INFO][4539] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.641 [INFO][4539] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.658 [INFO][4539] ipam/ipam.go 511: Trying affinity for 192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.670 [INFO][4539] ipam/ipam.go 158: Attempting to load block cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.679 [INFO][4539] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.10.192/26 host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.680 [INFO][4539] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.10.192/26 handle="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.685 [INFO][4539] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2 May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.704 [INFO][4539] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.10.192/26 
handle="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.744 [INFO][4539] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.10.200/26] block=192.168.10.192/26 handle="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.744 [INFO][4539] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.10.200/26] handle="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" host="ci-4344.0.0-6-bb492ec913" May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.744 [INFO][4539] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 18:15:56.811600 containerd[1543]: 2025-05-27 18:15:56.744 [INFO][4539] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.10.200/26] IPv6=[] ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" HandleID="k8s-pod-network.43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Workload="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" May 27 18:15:56.813354 containerd[1543]: 2025-05-27 18:15:56.765 [INFO][4530] cni-plugin/k8s.go 418: Populated endpoint ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0", GenerateName:"calico-kube-controllers-78bf7cd7cb-", Namespace:"calico-system", SelfLink:"", UID:"0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149", 
ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"78bf7cd7cb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"", Pod:"calico-kube-controllers-78bf7cd7cb-xmffc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.10.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4e7b5a0eebc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:56.813354 containerd[1543]: 2025-05-27 18:15:56.765 [INFO][4530] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.10.200/32] ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" May 27 18:15:56.813354 containerd[1543]: 2025-05-27 18:15:56.765 [INFO][4530] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4e7b5a0eebc ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" 
May 27 18:15:56.813354 containerd[1543]: 2025-05-27 18:15:56.774 [INFO][4530] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" May 27 18:15:56.813354 containerd[1543]: 2025-05-27 18:15:56.775 [INFO][4530] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0", GenerateName:"calico-kube-controllers-78bf7cd7cb-", Namespace:"calico-system", SelfLink:"", UID:"0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 18, 15, 30, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"78bf7cd7cb", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-6-bb492ec913", ContainerID:"43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2", 
Pod:"calico-kube-controllers-78bf7cd7cb-xmffc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.10.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4e7b5a0eebc", MAC:"e6:82:c7:f9:6b:dc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 18:15:56.813354 containerd[1543]: 2025-05-27 18:15:56.801 [INFO][4530] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" Namespace="calico-system" Pod="calico-kube-controllers-78bf7cd7cb-xmffc" WorkloadEndpoint="ci--4344.0.0--6--bb492ec913-k8s-calico--kube--controllers--78bf7cd7cb--xmffc-eth0" May 27 18:15:56.884704 containerd[1543]: time="2025-05-27T18:15:56.884345587Z" level=info msg="StartContainer for \"8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0\" returns successfully" May 27 18:15:56.894253 systemd[1]: Started cri-containerd-1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c.scope - libcontainer container 1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c. 
May 27 18:15:56.909927 containerd[1543]: time="2025-05-27T18:15:56.909733608Z" level=info msg="StartContainer for \"9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea\" returns successfully" May 27 18:15:56.942576 containerd[1543]: time="2025-05-27T18:15:56.942341472Z" level=info msg="connecting to shim 43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2" address="unix:///run/containerd/s/8a536baff0bb379482091ac1ff26c853f39f5a7ee78f9ca87e26d6f58da9cebe" namespace=k8s.io protocol=ttrpc version=3 May 27 18:15:57.051755 systemd-networkd[1436]: cali9fa3211248e: Gained IPv6LL May 27 18:15:57.052802 systemd[1]: Started cri-containerd-43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2.scope - libcontainer container 43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2. May 27 18:15:57.435664 systemd-networkd[1436]: calicfe7f3bf969: Gained IPv6LL May 27 18:15:57.608484 containerd[1543]: time="2025-05-27T18:15:57.607983245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-hqs5q,Uid:312bbe06-8b56-48c8-bd2e-4f07049cb4ed,Namespace:calico-system,Attempt:0,} returns sandbox id \"1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c\"" May 27 18:15:57.616369 kubelet[2668]: E0527 18:15:57.615291 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:57.635636 containerd[1543]: time="2025-05-27T18:15:57.635580333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-78bf7cd7cb-xmffc,Uid:0cb2dcfc-ccf3-42e3-add2-e6cfa71bb149,Namespace:calico-system,Attempt:0,} returns sandbox id \"43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2\"" May 27 18:15:57.646343 kubelet[2668]: E0527 18:15:57.645764 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been 
omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:57.693285 kubelet[2668]: I0527 18:15:57.688945 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-wl8mq" podStartSLOduration=42.68429544 podStartE2EDuration="42.68429544s" podCreationTimestamp="2025-05-27 18:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:15:57.682888941 +0000 UTC m=+47.810718266" watchObservedRunningTime="2025-05-27 18:15:57.68429544 +0000 UTC m=+47.812124764" May 27 18:15:57.795112 containerd[1543]: time="2025-05-27T18:15:57.795041810Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:57.797909 kubelet[2668]: I0527 18:15:57.796561 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-wj82f" podStartSLOduration=42.796510234 podStartE2EDuration="42.796510234s" podCreationTimestamp="2025-05-27 18:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 18:15:57.788766471 +0000 UTC m=+47.916595795" watchObservedRunningTime="2025-05-27 18:15:57.796510234 +0000 UTC m=+47.924339560" May 27 18:15:57.804455 containerd[1543]: time="2025-05-27T18:15:57.803992227Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431" May 27 18:15:57.808781 containerd[1543]: time="2025-05-27T18:15:57.808616374Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:57.816562 containerd[1543]: time="2025-05-27T18:15:57.816505930Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:57.818120 containerd[1543]: time="2025-05-27T18:15:57.817891229Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 5.286000609s" May 27 18:15:57.818394 containerd[1543]: time="2025-05-27T18:15:57.818292362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 18:15:57.820118 systemd-networkd[1436]: calib08907b2482: Gained IPv6LL May 27 18:15:57.832652 containerd[1543]: time="2025-05-27T18:15:57.832474093Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 18:15:57.838958 containerd[1543]: time="2025-05-27T18:15:57.837567818Z" level=info msg="CreateContainer within sandbox \"43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 18:15:57.872918 containerd[1543]: time="2025-05-27T18:15:57.872708782Z" level=info msg="Container cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:57.912724 containerd[1543]: time="2025-05-27T18:15:57.912429848Z" level=info msg="CreateContainer within sandbox \"43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f\"" May 27 18:15:57.914774 containerd[1543]: 
time="2025-05-27T18:15:57.914728412Z" level=info msg="StartContainer for \"cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f\"" May 27 18:15:57.917672 containerd[1543]: time="2025-05-27T18:15:57.917605070Z" level=info msg="connecting to shim cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f" address="unix:///run/containerd/s/52ce5e5e5de53297f67cb7a705a01717bcce4d4213259895422b3e7cf9b6566d" protocol=ttrpc version=3 May 27 18:15:57.978846 systemd[1]: Started cri-containerd-cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f.scope - libcontainer container cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f. May 27 18:15:58.093883 containerd[1543]: time="2025-05-27T18:15:58.093839872Z" level=info msg="StartContainer for \"cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f\" returns successfully" May 27 18:15:58.300428 containerd[1543]: time="2025-05-27T18:15:58.300196707Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 18:15:58.302310 containerd[1543]: time="2025-05-27T18:15:58.302251221Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77" May 27 18:15:58.311365 containerd[1543]: time="2025-05-27T18:15:58.311245832Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 478.712058ms" May 27 18:15:58.311365 containerd[1543]: time="2025-05-27T18:15:58.311350901Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 
18:15:58.314218 containerd[1543]: time="2025-05-27T18:15:58.313833417Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 18:15:58.319483 containerd[1543]: time="2025-05-27T18:15:58.319386876Z" level=info msg="CreateContainer within sandbox \"8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 18:15:58.331486 containerd[1543]: time="2025-05-27T18:15:58.328597958Z" level=info msg="Container 82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73: CDI devices from CRI Config.CDIDevices: []" May 27 18:15:58.370220 containerd[1543]: time="2025-05-27T18:15:58.369516490Z" level=info msg="CreateContainer within sandbox \"8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73\"" May 27 18:15:58.371864 containerd[1543]: time="2025-05-27T18:15:58.371817890Z" level=info msg="StartContainer for \"82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73\"" May 27 18:15:58.374300 containerd[1543]: time="2025-05-27T18:15:58.374236919Z" level=info msg="connecting to shim 82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73" address="unix:///run/containerd/s/bfa00951cb93c0ae54b26169bbf47bb9ea8635d7122d4f716f51f5867c93a435" protocol=ttrpc version=3 May 27 18:15:58.432687 systemd[1]: Started cri-containerd-82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73.scope - libcontainer container 82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73. 
May 27 18:15:58.522187 containerd[1543]: time="2025-05-27T18:15:58.522101491Z" level=info msg="StartContainer for \"82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73\" returns successfully" May 27 18:15:58.524138 systemd-networkd[1436]: cali4e7b5a0eebc: Gained IPv6LL May 27 18:15:58.685686 kubelet[2668]: I0527 18:15:58.684809 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59958685d4-qt6zw" podStartSLOduration=27.389022572000002 podStartE2EDuration="32.68478852s" podCreationTimestamp="2025-05-27 18:15:26 +0000 UTC" firstStartedPulling="2025-05-27 18:15:52.531511738 +0000 UTC m=+42.659341053" lastFinishedPulling="2025-05-27 18:15:57.827277677 +0000 UTC m=+47.955107001" observedRunningTime="2025-05-27 18:15:58.684321345 +0000 UTC m=+48.812150665" watchObservedRunningTime="2025-05-27 18:15:58.68478852 +0000 UTC m=+48.812617845" May 27 18:15:58.692700 kubelet[2668]: E0527 18:15:58.692393 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:58.692700 kubelet[2668]: E0527 18:15:58.692411 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:15:58.711838 kubelet[2668]: I0527 18:15:58.711657 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59958685d4-slstk" podStartSLOduration=28.038861364 podStartE2EDuration="32.711564953s" podCreationTimestamp="2025-05-27 18:15:26 +0000 UTC" firstStartedPulling="2025-05-27 18:15:53.640648865 +0000 UTC m=+43.768478184" lastFinishedPulling="2025-05-27 18:15:58.313352455 +0000 UTC m=+48.441181773" observedRunningTime="2025-05-27 18:15:58.710973729 +0000 UTC m=+48.838803053" watchObservedRunningTime="2025-05-27 
18:15:58.711564953 +0000 UTC m=+48.839394277"
May 27 18:15:59.696413 kubelet[2668]: E0527 18:15:59.696371 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:59.698843 kubelet[2668]: E0527 18:15:59.698803 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:15:59.704977 kubelet[2668]: I0527 18:15:59.699009 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 18:15:59.707541 kubelet[2668]: I0527 18:15:59.695358 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 18:16:00.553118 containerd[1543]: time="2025-05-27T18:16:00.552965395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:00.554463 containerd[1543]: time="2025-05-27T18:16:00.553937761Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390"
May 27 18:16:00.555314 containerd[1543]: time="2025-05-27T18:16:00.554877128Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:00.557265 containerd[1543]: time="2025-05-27T18:16:00.557222432Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:00.557990 containerd[1543]: time="2025-05-27T18:16:00.557859908Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 2.243990323s"
May 27 18:16:00.557990 containerd[1543]: time="2025-05-27T18:16:00.557907681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\""
May 27 18:16:00.561278 containerd[1543]: time="2025-05-27T18:16:00.561088256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 18:16:00.567031 containerd[1543]: time="2025-05-27T18:16:00.566872565Z" level=info msg="CreateContainer within sandbox \"9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
May 27 18:16:00.586465 containerd[1543]: time="2025-05-27T18:16:00.582216360Z" level=info msg="Container 5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef: CDI devices from CRI Config.CDIDevices: []"
May 27 18:16:00.603266 containerd[1543]: time="2025-05-27T18:16:00.603206591Z" level=info msg="CreateContainer within sandbox \"9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef\""
May 27 18:16:00.605043 containerd[1543]: time="2025-05-27T18:16:00.604919894Z" level=info msg="StartContainer for \"5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef\""
May 27 18:16:00.609353 containerd[1543]: time="2025-05-27T18:16:00.609286967Z" level=info msg="connecting to shim 5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef" address="unix:///run/containerd/s/68c68446feb7302700ce0dd500e31b73c0a791c03fd07e81dc61176925c3f6b9" protocol=ttrpc version=3
May 27 18:16:00.670830 systemd[1]: Started cri-containerd-5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef.scope - libcontainer container 5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef.
May 27 18:16:00.758561 containerd[1543]: time="2025-05-27T18:16:00.758371117Z" level=info msg="StartContainer for \"5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef\" returns successfully"
May 27 18:16:00.819819 containerd[1543]: time="2025-05-27T18:16:00.819589175Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:16:00.830925 containerd[1543]: time="2025-05-27T18:16:00.828365204Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:16:00.843692 containerd[1543]: time="2025-05-27T18:16:00.831372729Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 18:16:00.856629 kubelet[2668]: E0527 18:16:00.851909 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:16:00.856629 kubelet[2668]: E0527 18:16:00.856330 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:16:00.858517 containerd[1543]: time="2025-05-27T18:16:00.857762072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\""
May 27 18:16:00.890392 kubelet[2668]: E0527 18:16:00.890256 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dctn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:16:00.905557 kubelet[2668]: E0527 18:16:00.905425 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:16:01.721429 kubelet[2668]: E0527 18:16:01.721172 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:16:04.330109 containerd[1543]: time="2025-05-27T18:16:04.329836496Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:04.331606 containerd[1543]: time="2025-05-27T18:16:04.331558392Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512"
May 27 18:16:04.332938 containerd[1543]: time="2025-05-27T18:16:04.332873036Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:04.336321 containerd[1543]: time="2025-05-27T18:16:04.336249185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:04.337327 containerd[1543]: time="2025-05-27T18:16:04.337226400Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 3.479392828s"
May 27 18:16:04.337327 containerd[1543]: time="2025-05-27T18:16:04.337281114Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\""
May 27 18:16:04.340715 containerd[1543]: time="2025-05-27T18:16:04.339732373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\""
May 27 18:16:04.431369 containerd[1543]: time="2025-05-27T18:16:04.431300690Z" level=info msg="CreateContainer within sandbox \"43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
May 27 18:16:04.449513 containerd[1543]: time="2025-05-27T18:16:04.448056075Z" level=info msg="Container 42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a: CDI devices from CRI Config.CDIDevices: []"
May 27 18:16:04.462791 containerd[1543]: time="2025-05-27T18:16:04.462655904Z" level=info msg="CreateContainer within sandbox \"43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\""
May 27 18:16:04.463642 containerd[1543]: time="2025-05-27T18:16:04.463524193Z" level=info msg="StartContainer for \"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\""
May 27 18:16:04.468768 containerd[1543]: time="2025-05-27T18:16:04.468711078Z" level=info msg="connecting to shim 42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a" address="unix:///run/containerd/s/8a536baff0bb379482091ac1ff26c853f39f5a7ee78f9ca87e26d6f58da9cebe" protocol=ttrpc version=3
May 27 18:16:04.550799 systemd[1]: Started cri-containerd-42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a.scope - libcontainer container 42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a.
May 27 18:16:04.623214 containerd[1543]: time="2025-05-27T18:16:04.623001088Z" level=info msg="StartContainer for \"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" returns successfully"
May 27 18:16:04.954820 containerd[1543]: time="2025-05-27T18:16:04.954670233Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"fdb6ccbb0e83b42068012a2319255207fc8adf946417f29fde133fbd04f305df\" pid:4920 exited_at:{seconds:1748369764 nanos:917956104}"
May 27 18:16:04.986004 kubelet[2668]: I0527 18:16:04.985904 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-78bf7cd7cb-xmffc" podStartSLOduration=28.288154293 podStartE2EDuration="34.985879649s" podCreationTimestamp="2025-05-27 18:15:30 +0000 UTC" firstStartedPulling="2025-05-27 18:15:57.64153331 +0000 UTC m=+47.769362626" lastFinishedPulling="2025-05-27 18:16:04.339258671 +0000 UTC m=+54.467087982" observedRunningTime="2025-05-27 18:16:04.78072967 +0000 UTC m=+54.908558994" watchObservedRunningTime="2025-05-27 18:16:04.985879649 +0000 UTC m=+55.113708967"
May 27 18:16:06.266769 containerd[1543]: time="2025-05-27T18:16:06.266707439Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:06.268199 containerd[1543]: time="2025-05-27T18:16:06.267798715Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639"
May 27 18:16:06.268199 containerd[1543]: time="2025-05-27T18:16:06.268130144Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:06.275027 containerd[1543]: time="2025-05-27T18:16:06.274957398Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 18:16:06.276203 containerd[1543]: time="2025-05-27T18:16:06.276160628Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 1.93543691s"
May 27 18:16:06.276388 containerd[1543]: time="2025-05-27T18:16:06.276368742Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\""
May 27 18:16:06.278900 containerd[1543]: time="2025-05-27T18:16:06.278700165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\""
May 27 18:16:06.284948 containerd[1543]: time="2025-05-27T18:16:06.284886871Z" level=info msg="CreateContainer within sandbox \"9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 27 18:16:06.303378 containerd[1543]: time="2025-05-27T18:16:06.303305655Z" level=info msg="Container 0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f: CDI devices from CRI Config.CDIDevices: []"
May 27 18:16:06.320314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2522413934.mount: Deactivated successfully.
May 27 18:16:06.334341 containerd[1543]: time="2025-05-27T18:16:06.334231869Z" level=info msg="CreateContainer within sandbox \"9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f\""
May 27 18:16:06.336682 containerd[1543]: time="2025-05-27T18:16:06.335588398Z" level=info msg="StartContainer for \"0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f\""
May 27 18:16:06.341197 containerd[1543]: time="2025-05-27T18:16:06.341138287Z" level=info msg="connecting to shim 0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f" address="unix:///run/containerd/s/68c68446feb7302700ce0dd500e31b73c0a791c03fd07e81dc61176925c3f6b9" protocol=ttrpc version=3
May 27 18:16:06.381909 systemd[1]: Started cri-containerd-0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f.scope - libcontainer container 0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f.
May 27 18:16:06.481807 containerd[1543]: time="2025-05-27T18:16:06.481747757Z" level=info msg="StartContainer for \"0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f\" returns successfully"
May 27 18:16:06.539180 containerd[1543]: time="2025-05-27T18:16:06.539006727Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:16:06.541360 containerd[1543]: time="2025-05-27T18:16:06.541250414Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:16:06.541824 containerd[1543]: time="2025-05-27T18:16:06.541289946Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 27 18:16:06.542590 kubelet[2668]: E0527 18:16:06.542466 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 18:16:06.542590 kubelet[2668]: E0527 18:16:06.542553 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 18:16:06.557463 kubelet[2668]: E0527 18:16:06.556979 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b58c9c6d236b46b38836eca7a8a2bb57,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:16:06.560134 containerd[1543]: time="2025-05-27T18:16:06.560093926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 27 18:16:06.795569 containerd[1543]: time="2025-05-27T18:16:06.795181134Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:16:06.798691 containerd[1543]: time="2025-05-27T18:16:06.798582054Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:16:06.800290 containerd[1543]: time="2025-05-27T18:16:06.798615990Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 27 18:16:06.800661 kubelet[2668]: E0527 18:16:06.800607 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 18:16:06.801189 kubelet[2668]: E0527 18:16:06.800887 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 18:16:06.801189 kubelet[2668]: E0527 18:16:06.801111 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:16:06.802751 kubelet[2668]: E0527 18:16:06.802644 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:16:07.397346 kubelet[2668]: I0527 18:16:07.397264 2668 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 27 18:16:07.403242 kubelet[2668]: I0527 18:16:07.403183 2668 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 27 18:16:07.559768 systemd[1]: Started sshd@7-146.190.127.126:22-139.178.68.195:55346.service - OpenSSH per-connection server daemon (139.178.68.195:55346).
May 27 18:16:07.704556 sshd[4971]: Accepted publickey for core from 139.178.68.195 port 55346 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:16:07.707863 sshd-session[4971]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:16:07.719420 systemd-logind[1520]: New session 8 of user core.
May 27 18:16:07.726231 systemd[1]: Started session-8.scope - Session 8 of User core.
May 27 18:16:08.569029 sshd[4973]: Connection closed by 139.178.68.195 port 55346
May 27 18:16:08.569664 sshd-session[4971]: pam_unix(sshd:session): session closed for user core
May 27 18:16:08.580083 systemd-logind[1520]: Session 8 logged out. Waiting for processes to exit.
May 27 18:16:08.581544 systemd[1]: sshd@7-146.190.127.126:22-139.178.68.195:55346.service: Deactivated successfully.
May 27 18:16:08.586810 systemd[1]: session-8.scope: Deactivated successfully.
May 27 18:16:08.591288 systemd-logind[1520]: Removed session 8.
May 27 18:16:12.114711 containerd[1543]: time="2025-05-27T18:16:12.114656079Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 18:16:12.131630 kubelet[2668]: I0527 18:16:12.131486 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-bglbs" podStartSLOduration=30.540408638 podStartE2EDuration="43.131461285s" podCreationTimestamp="2025-05-27 18:15:29 +0000 UTC" firstStartedPulling="2025-05-27 18:15:53.68707485 +0000 UTC m=+43.814904154" lastFinishedPulling="2025-05-27 18:16:06.27812747 +0000 UTC m=+56.405956801" observedRunningTime="2025-05-27 18:16:06.834649095 +0000 UTC m=+56.962478421" watchObservedRunningTime="2025-05-27 18:16:12.131461285 +0000 UTC m=+62.259290609"
May 27 18:16:12.358636 containerd[1543]: time="2025-05-27T18:16:12.358528718Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:16:12.359314 containerd[1543]: time="2025-05-27T18:16:12.359268392Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:16:12.359428 containerd[1543]: time="2025-05-27T18:16:12.359369843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 18:16:12.359746 kubelet[2668]: E0527 18:16:12.359665 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:16:12.359839 kubelet[2668]: E0527 18:16:12.359739 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:16:12.361726 kubelet[2668]: E0527 18:16:12.361609 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dctn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:16:12.363152 kubelet[2668]: E0527 18:16:12.363086 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:16:13.588150 systemd[1]: Started sshd@8-146.190.127.126:22-139.178.68.195:45624.service - OpenSSH per-connection server daemon (139.178.68.195:45624).
May 27 18:16:13.680478 sshd[4993]: Accepted publickey for core from 139.178.68.195 port 45624 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:16:13.682570 sshd-session[4993]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:16:13.690932 systemd-logind[1520]: New session 9 of user core.
May 27 18:16:13.700786 systemd[1]: Started session-9.scope - Session 9 of User core.
May 27 18:16:13.877558 sshd[4995]: Connection closed by 139.178.68.195 port 45624
May 27 18:16:13.876491 sshd-session[4993]: pam_unix(sshd:session): session closed for user core
May 27 18:16:13.882675 systemd[1]: sshd@8-146.190.127.126:22-139.178.68.195:45624.service: Deactivated successfully.
May 27 18:16:13.886573 systemd[1]: session-9.scope: Deactivated successfully.
May 27 18:16:13.888272 systemd-logind[1520]: Session 9 logged out. Waiting for processes to exit.
May 27 18:16:13.890370 systemd-logind[1520]: Removed session 9.
May 27 18:16:14.027531 kubelet[2668]: I0527 18:16:14.027310 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 18:16:18.893137 systemd[1]: Started sshd@9-146.190.127.126:22-139.178.68.195:45632.service - OpenSSH per-connection server daemon (139.178.68.195:45632).
May 27 18:16:18.984498 sshd[5014]: Accepted publickey for core from 139.178.68.195 port 45632 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:16:18.986324 sshd-session[5014]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:16:18.992796 systemd-logind[1520]: New session 10 of user core.
May 27 18:16:19.005032 systemd[1]: Started session-10.scope - Session 10 of User core.
May 27 18:16:19.207873 sshd[5016]: Connection closed by 139.178.68.195 port 45632 May 27 18:16:19.209151 sshd-session[5014]: pam_unix(sshd:session): session closed for user core May 27 18:16:19.223295 systemd[1]: sshd@9-146.190.127.126:22-139.178.68.195:45632.service: Deactivated successfully. May 27 18:16:19.227135 systemd[1]: session-10.scope: Deactivated successfully. May 27 18:16:19.230735 systemd-logind[1520]: Session 10 logged out. Waiting for processes to exit. May 27 18:16:19.235673 systemd[1]: Started sshd@10-146.190.127.126:22-139.178.68.195:45646.service - OpenSSH per-connection server daemon (139.178.68.195:45646). May 27 18:16:19.237456 systemd-logind[1520]: Removed session 10. May 27 18:16:19.326837 sshd[5029]: Accepted publickey for core from 139.178.68.195 port 45646 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:16:19.329380 sshd-session[5029]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:19.336704 systemd-logind[1520]: New session 11 of user core. May 27 18:16:19.341762 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 18:16:19.572783 sshd[5031]: Connection closed by 139.178.68.195 port 45646 May 27 18:16:19.577267 sshd-session[5029]: pam_unix(sshd:session): session closed for user core May 27 18:16:19.588796 systemd[1]: sshd@10-146.190.127.126:22-139.178.68.195:45646.service: Deactivated successfully. May 27 18:16:19.591950 systemd[1]: session-11.scope: Deactivated successfully. May 27 18:16:19.594530 systemd-logind[1520]: Session 11 logged out. Waiting for processes to exit. May 27 18:16:19.605447 systemd-logind[1520]: Removed session 11. May 27 18:16:19.611218 systemd[1]: Started sshd@11-146.190.127.126:22-139.178.68.195:45652.service - OpenSSH per-connection server daemon (139.178.68.195:45652). 
May 27 18:16:19.696248 sshd[5041]: Accepted publickey for core from 139.178.68.195 port 45652 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:16:19.698506 sshd-session[5041]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:19.706717 systemd-logind[1520]: New session 12 of user core. May 27 18:16:19.717736 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 18:16:19.871975 sshd[5043]: Connection closed by 139.178.68.195 port 45652 May 27 18:16:19.873721 sshd-session[5041]: pam_unix(sshd:session): session closed for user core May 27 18:16:19.879220 systemd-logind[1520]: Session 12 logged out. Waiting for processes to exit. May 27 18:16:19.880185 systemd[1]: sshd@11-146.190.127.126:22-139.178.68.195:45652.service: Deactivated successfully. May 27 18:16:19.882960 systemd[1]: session-12.scope: Deactivated successfully. May 27 18:16:19.885261 systemd-logind[1520]: Removed session 12. May 27 18:16:21.670532 containerd[1543]: time="2025-05-27T18:16:21.670471298Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"0a968fe5317dd45a50ecf3f331d1ca380223c9339892a99b38090a10dbaf15d4\" pid:5068 exited_at:{seconds:1748369781 nanos:669515532}" May 27 18:16:22.114469 kubelet[2668]: E0527 18:16:22.114334 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for 
\"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:16:24.902905 systemd[1]: Started sshd@12-146.190.127.126:22-139.178.68.195:43432.service - OpenSSH per-connection server daemon (139.178.68.195:43432). May 27 18:16:25.045011 sshd[5086]: Accepted publickey for core from 139.178.68.195 port 43432 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:16:25.048225 sshd-session[5086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:25.055107 systemd-logind[1520]: New session 13 of user core. May 27 18:16:25.061753 systemd[1]: Started session-13.scope - Session 13 of User core. 
May 27 18:16:25.116526 kubelet[2668]: E0527 18:16:25.115084 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:16:25.360525 sshd[5088]: Connection closed by 139.178.68.195 port 43432 May 27 18:16:25.361665 sshd-session[5086]: pam_unix(sshd:session): session closed for user core May 27 18:16:25.373252 systemd[1]: sshd@12-146.190.127.126:22-139.178.68.195:43432.service: Deactivated successfully. May 27 18:16:25.378940 systemd[1]: session-13.scope: Deactivated successfully. May 27 18:16:25.381010 systemd-logind[1520]: Session 13 logged out. Waiting for processes to exit. May 27 18:16:25.385173 systemd-logind[1520]: Removed session 13. May 27 18:16:30.376973 systemd[1]: Started sshd@13-146.190.127.126:22-139.178.68.195:43442.service - OpenSSH per-connection server daemon (139.178.68.195:43442). May 27 18:16:30.488373 sshd[5100]: Accepted publickey for core from 139.178.68.195 port 43442 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:16:30.492332 sshd-session[5100]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:30.503008 systemd-logind[1520]: New session 14 of user core. May 27 18:16:30.510583 systemd[1]: Started session-14.scope - Session 14 of User core. 
May 27 18:16:30.727283 sshd[5102]: Connection closed by 139.178.68.195 port 43442 May 27 18:16:30.728296 sshd-session[5100]: pam_unix(sshd:session): session closed for user core May 27 18:16:30.736907 systemd[1]: sshd@13-146.190.127.126:22-139.178.68.195:43442.service: Deactivated successfully. May 27 18:16:30.739958 systemd[1]: session-14.scope: Deactivated successfully. May 27 18:16:30.742119 systemd-logind[1520]: Session 14 logged out. Waiting for processes to exit. May 27 18:16:30.744145 systemd-logind[1520]: Removed session 14. May 27 18:16:31.913640 kubelet[2668]: I0527 18:16:31.913586 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 18:16:32.086263 containerd[1543]: time="2025-05-27T18:16:32.085774568Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"6b18423ac0f1d66b18807e76bbf4de8e42e203ca5ecdd2dabe8b7f517e743646\" pid:5125 exited_at:{seconds:1748369792 nanos:83080988}" May 27 18:16:33.119465 kubelet[2668]: E0527 18:16:33.117545 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:16:34.802049 containerd[1543]: time="2025-05-27T18:16:34.801968561Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"edc26fb4b553d090287098bdc067e7bf9808fa1347b32c9109efe48997745722\" pid:5155 exited_at:{seconds:1748369794 nanos:801095599}" May 27 18:16:35.743632 systemd[1]: Started sshd@14-146.190.127.126:22-139.178.68.195:60864.service - OpenSSH per-connection server daemon (139.178.68.195:60864). 
May 27 18:16:35.894718 sshd[5166]: Accepted publickey for core from 139.178.68.195 port 60864 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:16:35.896608 sshd-session[5166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:35.905181 systemd-logind[1520]: New session 15 of user core. May 27 18:16:35.908754 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 18:16:36.125357 containerd[1543]: time="2025-05-27T18:16:36.122310601Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:16:36.246515 sshd[5168]: Connection closed by 139.178.68.195 port 60864 May 27 18:16:36.248381 sshd-session[5166]: pam_unix(sshd:session): session closed for user core May 27 18:16:36.256403 systemd-logind[1520]: Session 15 logged out. Waiting for processes to exit. May 27 18:16:36.257259 systemd[1]: sshd@14-146.190.127.126:22-139.178.68.195:60864.service: Deactivated successfully. May 27 18:16:36.260070 systemd[1]: session-15.scope: Deactivated successfully. May 27 18:16:36.264606 systemd-logind[1520]: Removed session 15. 
May 27 18:16:36.375565 containerd[1543]: time="2025-05-27T18:16:36.375275952Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:16:36.376765 containerd[1543]: time="2025-05-27T18:16:36.376622225Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:16:36.376765 containerd[1543]: time="2025-05-27T18:16:36.376689257Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:16:36.385987 kubelet[2668]: E0527 18:16:36.385821 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:16:36.387173 kubelet[2668]: E0527 18:16:36.386018 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:16:36.390604 kubelet[2668]: E0527 18:16:36.390504 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b58c9c6d236b46b38836eca7a8a2bb57,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:16:36.393696 containerd[1543]: time="2025-05-27T18:16:36.393510763Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:16:36.646856 containerd[1543]: time="2025-05-27T18:16:36.646202175Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:16:36.647996 containerd[1543]: time="2025-05-27T18:16:36.647905611Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:16:36.649086 containerd[1543]: time="2025-05-27T18:16:36.647952747Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:16:36.649227 kubelet[2668]: E0527 18:16:36.648390 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 
Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:16:36.649227 kubelet[2668]: E0527 18:16:36.648476 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:16:36.649227 kubelet[2668]: E0527 18:16:36.648634 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&Secu
rityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:16:36.649927 kubelet[2668]: E0527 18:16:36.649860 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:16:39.113193 kubelet[2668]: E0527 18:16:39.112027 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:16:39.117823 containerd[1543]: time="2025-05-27T18:16:39.117382914Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:16:39.424511 containerd[1543]: time="2025-05-27T18:16:39.423971992Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:16:39.425377 containerd[1543]: time="2025-05-27T18:16:39.425268056Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:16:39.425377 containerd[1543]: time="2025-05-27T18:16:39.425321936Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:16:39.426139 kubelet[2668]: E0527 18:16:39.425752 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:16:39.426139 kubelet[2668]: E0527 18:16:39.425815 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:16:39.426139 kubelet[2668]: E0527 18:16:39.425968 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Nam
e:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dctn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:16:39.427890 kubelet[2668]: 
E0527 18:16:39.427770 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:16:41.112295 kubelet[2668]: E0527 18:16:41.112244 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:16:41.260778 systemd[1]: Started sshd@15-146.190.127.126:22-139.178.68.195:60874.service - OpenSSH per-connection server daemon (139.178.68.195:60874). May 27 18:16:41.331662 sshd[5181]: Accepted publickey for core from 139.178.68.195 port 60874 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:16:41.333626 sshd-session[5181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:41.339469 systemd-logind[1520]: New session 16 of user core. May 27 18:16:41.348764 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 18:16:41.498480 sshd[5183]: Connection closed by 139.178.68.195 port 60874 May 27 18:16:41.499428 sshd-session[5181]: pam_unix(sshd:session): session closed for user core May 27 18:16:41.505642 systemd[1]: sshd@15-146.190.127.126:22-139.178.68.195:60874.service: Deactivated successfully. May 27 18:16:41.509730 systemd[1]: session-16.scope: Deactivated successfully. May 27 18:16:41.512360 systemd-logind[1520]: Session 16 logged out. Waiting for processes to exit. May 27 18:16:41.515128 systemd-logind[1520]: Removed session 16. 
May 27 18:16:42.111511 kubelet[2668]: E0527 18:16:42.111302 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:16:46.521375 systemd[1]: Started sshd@16-146.190.127.126:22-139.178.68.195:48278.service - OpenSSH per-connection server daemon (139.178.68.195:48278). May 27 18:16:46.635769 sshd[5198]: Accepted publickey for core from 139.178.68.195 port 48278 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:16:46.638977 sshd-session[5198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:16:46.647004 systemd-logind[1520]: New session 17 of user core. May 27 18:16:46.657828 systemd[1]: Started session-17.scope - Session 17 of User core. May 27 18:16:46.879190 sshd[5200]: Connection closed by 139.178.68.195 port 48278 May 27 18:16:46.880069 sshd-session[5198]: pam_unix(sshd:session): session closed for user core May 27 18:16:46.886798 systemd-logind[1520]: Session 17 logged out. Waiting for processes to exit. May 27 18:16:46.888309 systemd[1]: sshd@16-146.190.127.126:22-139.178.68.195:48278.service: Deactivated successfully. May 27 18:16:46.891398 systemd[1]: session-17.scope: Deactivated successfully. May 27 18:16:46.894933 systemd-logind[1520]: Removed session 17. 
May 27 18:16:48.115443 kubelet[2668]: E0527 18:16:48.115314 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:16:51.653606 containerd[1543]: time="2025-05-27T18:16:51.653551255Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"9b062c027d9eab5648047773e6d3100ced78b0e9842cfbb85c69e44e9b1a02de\" pid:5223 exited_at:{seconds:1748369811 nanos:652930611}"
May 27 18:16:51.897068 systemd[1]: Started sshd@17-146.190.127.126:22-139.178.68.195:48286.service - OpenSSH per-connection server daemon (139.178.68.195:48286).
May 27 18:16:52.003898 sshd[5236]: Accepted publickey for core from 139.178.68.195 port 48286 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:16:52.006480 sshd-session[5236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:16:52.016906 systemd-logind[1520]: New session 18 of user core.
May 27 18:16:52.025805 systemd[1]: Started session-18.scope - Session 18 of User core.
May 27 18:16:52.351583 sshd[5238]: Connection closed by 139.178.68.195 port 48286
May 27 18:16:52.352404 sshd-session[5236]: pam_unix(sshd:session): session closed for user core
May 27 18:16:52.361827 systemd-logind[1520]: Session 18 logged out. Waiting for processes to exit.
May 27 18:16:52.362544 systemd[1]: sshd@17-146.190.127.126:22-139.178.68.195:48286.service: Deactivated successfully.
May 27 18:16:52.365071 systemd[1]: session-18.scope: Deactivated successfully.
May 27 18:16:52.367360 systemd-logind[1520]: Removed session 18.
May 27 18:16:53.114074 kubelet[2668]: E0527 18:16:53.114006 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:16:57.364768 systemd[1]: Started sshd@18-146.190.127.126:22-139.178.68.195:33484.service - OpenSSH per-connection server daemon (139.178.68.195:33484).
May 27 18:16:57.454937 sshd[5251]: Accepted publickey for core from 139.178.68.195 port 33484 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:16:57.457665 sshd-session[5251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:16:57.468520 systemd-logind[1520]: New session 19 of user core.
May 27 18:16:57.471924 systemd[1]: Started session-19.scope - Session 19 of User core.
May 27 18:16:57.722511 sshd[5253]: Connection closed by 139.178.68.195 port 33484
May 27 18:16:57.724803 sshd-session[5251]: pam_unix(sshd:session): session closed for user core
May 27 18:16:57.732247 systemd[1]: sshd@18-146.190.127.126:22-139.178.68.195:33484.service: Deactivated successfully.
May 27 18:16:57.738262 systemd[1]: session-19.scope: Deactivated successfully.
May 27 18:16:57.740974 systemd-logind[1520]: Session 19 logged out. Waiting for processes to exit.
May 27 18:16:57.748533 systemd-logind[1520]: Removed session 19.
May 27 18:16:59.114571 kubelet[2668]: E0527 18:16:59.114425 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:17:02.745943 systemd[1]: Started sshd@19-146.190.127.126:22-139.178.68.195:33496.service - OpenSSH per-connection server daemon (139.178.68.195:33496).
May 27 18:17:02.908049 sshd[5264]: Accepted publickey for core from 139.178.68.195 port 33496 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:17:02.912970 sshd-session[5264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:17:02.924466 systemd-logind[1520]: New session 20 of user core.
May 27 18:17:02.932723 systemd[1]: Started session-20.scope - Session 20 of User core.
May 27 18:17:03.639424 sshd[5266]: Connection closed by 139.178.68.195 port 33496
May 27 18:17:03.642007 sshd-session[5264]: pam_unix(sshd:session): session closed for user core
May 27 18:17:03.650221 systemd[1]: sshd@19-146.190.127.126:22-139.178.68.195:33496.service: Deactivated successfully.
May 27 18:17:03.656231 systemd[1]: session-20.scope: Deactivated successfully.
May 27 18:17:03.659687 systemd-logind[1520]: Session 20 logged out. Waiting for processes to exit.
May 27 18:17:03.664790 systemd-logind[1520]: Removed session 20.
May 27 18:17:04.112204 kubelet[2668]: E0527 18:17:04.112157 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:17:04.857228 containerd[1543]: time="2025-05-27T18:17:04.857024331Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"ddc021fd79c983dd75c75a861f41790306b869523849c8ea1ecb3a5f7c15de54\" pid:5290 exited_at:{seconds:1748369824 nanos:856264775}"
May 27 18:17:05.112379 kubelet[2668]: E0527 18:17:05.112241 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:17:08.657916 systemd[1]: Started sshd@20-146.190.127.126:22-139.178.68.195:43390.service - OpenSSH per-connection server daemon (139.178.68.195:43390).
May 27 18:17:08.737584 sshd[5300]: Accepted publickey for core from 139.178.68.195 port 43390 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:17:08.741202 sshd-session[5300]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:17:08.750033 systemd-logind[1520]: New session 21 of user core.
May 27 18:17:08.756653 systemd[1]: Started session-21.scope - Session 21 of User core.
May 27 18:17:08.975102 sshd[5302]: Connection closed by 139.178.68.195 port 43390
May 27 18:17:08.976915 sshd-session[5300]: pam_unix(sshd:session): session closed for user core
May 27 18:17:08.982185 systemd[1]: sshd@20-146.190.127.126:22-139.178.68.195:43390.service: Deactivated successfully.
May 27 18:17:08.985863 systemd[1]: session-21.scope: Deactivated successfully.
May 27 18:17:08.987975 systemd-logind[1520]: Session 21 logged out. Waiting for processes to exit.
May 27 18:17:08.992584 systemd-logind[1520]: Removed session 21.
May 27 18:17:12.113368 kubelet[2668]: E0527 18:17:12.113245 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:17:12.120678 kubelet[2668]: E0527 18:17:12.120623 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:17:13.990836 systemd[1]: Started sshd@21-146.190.127.126:22-139.178.68.195:50846.service - OpenSSH per-connection server daemon (139.178.68.195:50846).
May 27 18:17:14.078807 sshd[5323]: Accepted publickey for core from 139.178.68.195 port 50846 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:17:14.080512 sshd-session[5323]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:17:14.090654 systemd-logind[1520]: New session 22 of user core.
May 27 18:17:14.097725 systemd[1]: Started session-22.scope - Session 22 of User core.
May 27 18:17:14.291851 sshd[5325]: Connection closed by 139.178.68.195 port 50846
May 27 18:17:14.293807 sshd-session[5323]: pam_unix(sshd:session): session closed for user core
May 27 18:17:14.298164 systemd[1]: sshd@21-146.190.127.126:22-139.178.68.195:50846.service: Deactivated successfully.
May 27 18:17:14.302584 systemd[1]: session-22.scope: Deactivated successfully.
May 27 18:17:14.307389 systemd-logind[1520]: Session 22 logged out. Waiting for processes to exit.
May 27 18:17:14.309893 systemd-logind[1520]: Removed session 22.
May 27 18:17:16.114685 kubelet[2668]: E0527 18:17:16.113174 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:17:19.311257 systemd[1]: Started sshd@22-146.190.127.126:22-139.178.68.195:50860.service - OpenSSH per-connection server daemon (139.178.68.195:50860).
May 27 18:17:19.398661 sshd[5342]: Accepted publickey for core from 139.178.68.195 port 50860 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:17:19.401507 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:17:19.407620 systemd-logind[1520]: New session 23 of user core.
May 27 18:17:19.415838 systemd[1]: Started session-23.scope - Session 23 of User core.
May 27 18:17:19.614631 sshd[5344]: Connection closed by 139.178.68.195 port 50860
May 27 18:17:19.615356 sshd-session[5342]: pam_unix(sshd:session): session closed for user core
May 27 18:17:19.619474 systemd-logind[1520]: Session 23 logged out. Waiting for processes to exit.
May 27 18:17:19.620476 systemd[1]: sshd@22-146.190.127.126:22-139.178.68.195:50860.service: Deactivated successfully.
May 27 18:17:19.623936 systemd[1]: session-23.scope: Deactivated successfully.
May 27 18:17:19.630750 systemd-logind[1520]: Removed session 23.
May 27 18:17:21.589477 containerd[1543]: time="2025-05-27T18:17:21.582042081Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"100be0bebb0674b9719800119d19439b166f7c5bcf9ed8a0a325167843f3cb94\" pid:5367 exited_at:{seconds:1748369841 nanos:581234448}"
May 27 18:17:23.113844 containerd[1543]: time="2025-05-27T18:17:23.112952416Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\""
May 27 18:17:23.444103 containerd[1543]: time="2025-05-27T18:17:23.443552281Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:17:23.445161 containerd[1543]: time="2025-05-27T18:17:23.445027911Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:17:23.445161 containerd[1543]: time="2025-05-27T18:17:23.445073064Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 27 18:17:23.446468 kubelet[2668]: E0527 18:17:23.445617 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 18:17:23.446468 kubelet[2668]: E0527 18:17:23.445676 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 27 18:17:23.446468 kubelet[2668]: E0527 18:17:23.445798 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b58c9c6d236b46b38836eca7a8a2bb57,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:17:23.450605 containerd[1543]: time="2025-05-27T18:17:23.450560289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 27 18:17:23.677862 containerd[1543]: time="2025-05-27T18:17:23.677784021Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:17:23.681380 containerd[1543]: time="2025-05-27T18:17:23.680943547Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 27 18:17:23.681380 containerd[1543]: time="2025-05-27T18:17:23.680959434Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:17:23.681890 kubelet[2668]: E0527 18:17:23.681788 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 18:17:23.682515 kubelet[2668]: E0527 18:17:23.681944 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 27 18:17:23.682515 kubelet[2668]: E0527 18:17:23.682198 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:17:23.683741 kubelet[2668]: E0527 18:17:23.683618 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:17:24.631813 systemd[1]: Started sshd@23-146.190.127.126:22-139.178.68.195:58686.service - OpenSSH per-connection server daemon (139.178.68.195:58686).
May 27 18:17:24.730306 sshd[5394]: Accepted publickey for core from 139.178.68.195 port 58686 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:17:24.732734 sshd-session[5394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:17:24.742272 systemd-logind[1520]: New session 24 of user core.
May 27 18:17:24.746054 systemd[1]: Started session-24.scope - Session 24 of User core.
May 27 18:17:24.998516 sshd[5396]: Connection closed by 139.178.68.195 port 58686
May 27 18:17:24.999338 sshd-session[5394]: pam_unix(sshd:session): session closed for user core
May 27 18:17:25.006198 systemd[1]: sshd@23-146.190.127.126:22-139.178.68.195:58686.service: Deactivated successfully.
May 27 18:17:25.011240 systemd[1]: session-24.scope: Deactivated successfully.
May 27 18:17:25.012926 systemd-logind[1520]: Session 24 logged out. Waiting for processes to exit.
May 27 18:17:25.016720 systemd-logind[1520]: Removed session 24.
May 27 18:17:25.111967 kubelet[2668]: E0527 18:17:25.111818 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:17:30.020003 systemd[1]: Started sshd@24-146.190.127.126:22-139.178.68.195:58690.service - OpenSSH per-connection server daemon (139.178.68.195:58690).
May 27 18:17:30.097127 sshd[5416]: Accepted publickey for core from 139.178.68.195 port 58690 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:17:30.100120 sshd-session[5416]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:17:30.110640 systemd-logind[1520]: New session 25 of user core.
May 27 18:17:30.120195 systemd[1]: Started session-25.scope - Session 25 of User core.
May 27 18:17:30.127395 containerd[1543]: time="2025-05-27T18:17:30.127340108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 18:17:30.368807 containerd[1543]: time="2025-05-27T18:17:30.368667274Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:17:30.370632 containerd[1543]: time="2025-05-27T18:17:30.370544545Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:17:30.370976 containerd[1543]: time="2025-05-27T18:17:30.370552643Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 18:17:30.371220 kubelet[2668]: E0527 18:17:30.370832 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:17:30.371220 kubelet[2668]: E0527 18:17:30.370884 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:17:30.371220 kubelet[2668]: E0527 18:17:30.371051 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dctn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:17:30.372234 kubelet[2668]: E0527 18:17:30.372180 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:17:30.419570 sshd[5418]: Connection closed by 139.178.68.195 port 58690
May 27 18:17:30.420719 sshd-session[5416]: pam_unix(sshd:session): session closed for user core
May 27 18:17:30.428200 systemd[1]: sshd@24-146.190.127.126:22-139.178.68.195:58690.service: Deactivated successfully.
May 27 18:17:30.433912 systemd[1]: session-25.scope: Deactivated successfully.
May 27 18:17:30.435349 systemd-logind[1520]: Session 25 logged out. Waiting for processes to exit.
May 27 18:17:30.437802 systemd-logind[1520]: Removed session 25.
May 27 18:17:32.016459 containerd[1543]: time="2025-05-27T18:17:32.016389212Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"056198c0453f3755f70a87697e8a6d70fff30382b6213d398975a474d42b521b\" pid:5441 exited_at:{seconds:1748369852 nanos:16008985}"
May 27 18:17:34.113518 kubelet[2668]: E0527 18:17:34.112093 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:17:34.821427 containerd[1543]: time="2025-05-27T18:17:34.821381232Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"f571eaed53db96861dc5a28b115ac286bc92dcd050a6f4262740524f21a808f0\" pid:5462 exited_at:{seconds:1748369854 nanos:821045872}"
May 27 18:17:35.440636 systemd[1]: Started sshd@25-146.190.127.126:22-139.178.68.195:57200.service - OpenSSH per-connection server daemon (139.178.68.195:57200).
May 27 18:17:35.534466 sshd[5473]: Accepted publickey for core from 139.178.68.195 port 57200 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:17:35.537914 sshd-session[5473]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:35.550049 systemd-logind[1520]: New session 26 of user core. May 27 18:17:35.554673 systemd[1]: Started session-26.scope - Session 26 of User core. May 27 18:17:35.750615 sshd[5475]: Connection closed by 139.178.68.195 port 57200 May 27 18:17:35.752299 sshd-session[5473]: pam_unix(sshd:session): session closed for user core May 27 18:17:35.759002 systemd-logind[1520]: Session 26 logged out. Waiting for processes to exit. May 27 18:17:35.759736 systemd[1]: sshd@25-146.190.127.126:22-139.178.68.195:57200.service: Deactivated successfully. May 27 18:17:35.763998 systemd[1]: session-26.scope: Deactivated successfully. May 27 18:17:35.769092 systemd-logind[1520]: Removed session 26. May 27 18:17:39.117691 kubelet[2668]: E0527 18:17:39.117246 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous 
token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:17:40.768607 systemd[1]: Started sshd@26-146.190.127.126:22-139.178.68.195:57204.service - OpenSSH per-connection server daemon (139.178.68.195:57204). May 27 18:17:40.847889 sshd[5487]: Accepted publickey for core from 139.178.68.195 port 57204 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:17:40.850234 sshd-session[5487]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:40.858083 systemd-logind[1520]: New session 27 of user core. May 27 18:17:40.865878 systemd[1]: Started session-27.scope - Session 27 of User core. May 27 18:17:41.023546 sshd[5489]: Connection closed by 139.178.68.195 port 57204 May 27 18:17:41.024349 sshd-session[5487]: pam_unix(sshd:session): session closed for user core May 27 18:17:41.030340 systemd[1]: sshd@26-146.190.127.126:22-139.178.68.195:57204.service: Deactivated successfully. May 27 18:17:41.033412 systemd[1]: session-27.scope: Deactivated successfully. May 27 18:17:41.034982 systemd-logind[1520]: Session 27 logged out. Waiting for processes to exit. May 27 18:17:41.037667 systemd-logind[1520]: Removed session 27. 
May 27 18:17:44.116480 kubelet[2668]: E0527 18:17:44.114893 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:17:46.044066 systemd[1]: Started sshd@27-146.190.127.126:22-139.178.68.195:52838.service - OpenSSH per-connection server daemon (139.178.68.195:52838). May 27 18:17:46.120863 sshd[5503]: Accepted publickey for core from 139.178.68.195 port 52838 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:17:46.123605 sshd-session[5503]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:46.131501 systemd-logind[1520]: New session 28 of user core. May 27 18:17:46.138807 systemd[1]: Started session-28.scope - Session 28 of User core. May 27 18:17:46.308857 sshd[5505]: Connection closed by 139.178.68.195 port 52838 May 27 18:17:46.309767 sshd-session[5503]: pam_unix(sshd:session): session closed for user core May 27 18:17:46.317670 systemd-logind[1520]: Session 28 logged out. Waiting for processes to exit. May 27 18:17:46.317890 systemd[1]: sshd@27-146.190.127.126:22-139.178.68.195:52838.service: Deactivated successfully. May 27 18:17:46.321517 systemd[1]: session-28.scope: Deactivated successfully. May 27 18:17:46.326472 systemd-logind[1520]: Removed session 28. 
May 27 18:17:51.112491 kubelet[2668]: E0527 18:17:51.111594 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:17:51.325951 systemd[1]: Started sshd@28-146.190.127.126:22-139.178.68.195:52848.service - OpenSSH per-connection server daemon (139.178.68.195:52848). May 27 18:17:51.385620 sshd[5517]: Accepted publickey for core from 139.178.68.195 port 52848 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:17:51.387153 sshd-session[5517]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:51.393698 systemd-logind[1520]: New session 29 of user core. May 27 18:17:51.400995 systemd[1]: Started session-29.scope - Session 29 of User core. May 27 18:17:51.606689 containerd[1543]: time="2025-05-27T18:17:51.606619183Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"ef9fd9c960fc4496a8c4888981c074b8f83a308c31a98dc4f292dea73f52135d\" pid:5539 exited_at:{seconds:1748369871 nanos:606204278}" May 27 18:17:51.727726 sshd[5519]: Connection closed by 139.178.68.195 port 52848 May 27 18:17:51.729960 sshd-session[5517]: pam_unix(sshd:session): session closed for user core May 27 18:17:51.737988 systemd-logind[1520]: Session 29 logged out. Waiting for processes to exit. May 27 18:17:51.739281 systemd[1]: sshd@28-146.190.127.126:22-139.178.68.195:52848.service: Deactivated successfully. May 27 18:17:51.743080 systemd[1]: session-29.scope: Deactivated successfully. May 27 18:17:51.746677 systemd-logind[1520]: Removed session 29. 
May 27 18:17:53.116028 kubelet[2668]: E0527 18:17:53.115950 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:17:56.112727 kubelet[2668]: E0527 18:17:56.111668 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:17:56.114674 kubelet[2668]: E0527 18:17:56.114640 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:17:56.749157 systemd[1]: Started sshd@29-146.190.127.126:22-139.178.68.195:54178.service - OpenSSH per-connection server daemon (139.178.68.195:54178). May 27 18:17:56.842746 sshd[5555]: Accepted publickey for core from 139.178.68.195 port 54178 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:17:56.844593 sshd-session[5555]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:17:56.849946 systemd-logind[1520]: New session 30 of user core. May 27 18:17:56.858675 systemd[1]: Started session-30.scope - Session 30 of User core. May 27 18:17:57.078155 sshd[5557]: Connection closed by 139.178.68.195 port 54178 May 27 18:17:57.078896 sshd-session[5555]: pam_unix(sshd:session): session closed for user core May 27 18:17:57.088481 systemd[1]: sshd@29-146.190.127.126:22-139.178.68.195:54178.service: Deactivated successfully. May 27 18:17:57.091074 systemd[1]: session-30.scope: Deactivated successfully. May 27 18:17:57.095845 systemd-logind[1520]: Session 30 logged out. Waiting for processes to exit. May 27 18:17:57.097311 systemd-logind[1520]: Removed session 30. May 27 18:17:58.111493 kubelet[2668]: E0527 18:17:58.111345 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:18:02.095062 systemd[1]: Started sshd@30-146.190.127.126:22-139.178.68.195:54186.service - OpenSSH per-connection server daemon (139.178.68.195:54186). 
May 27 18:18:02.176734 sshd[5569]: Accepted publickey for core from 139.178.68.195 port 54186 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:02.179674 sshd-session[5569]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:02.188020 systemd-logind[1520]: New session 31 of user core. May 27 18:18:02.195840 systemd[1]: Started session-31.scope - Session 31 of User core. May 27 18:18:02.365454 sshd[5571]: Connection closed by 139.178.68.195 port 54186 May 27 18:18:02.366377 sshd-session[5569]: pam_unix(sshd:session): session closed for user core May 27 18:18:02.371734 systemd-logind[1520]: Session 31 logged out. Waiting for processes to exit. May 27 18:18:02.372772 systemd[1]: sshd@30-146.190.127.126:22-139.178.68.195:54186.service: Deactivated successfully. May 27 18:18:02.376226 systemd[1]: session-31.scope: Deactivated successfully. May 27 18:18:02.381231 systemd-logind[1520]: Removed session 31. May 27 18:18:04.789105 containerd[1543]: time="2025-05-27T18:18:04.789040886Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"44e11e1955cb512d4717a24e381929d7b0fc042c14febcfad71fc91909dd9c27\" pid:5594 exited_at:{seconds:1748369884 nanos:788619790}" May 27 18:18:05.114492 kubelet[2668]: E0527 18:18:05.114328 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for 
\"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:18:07.391114 systemd[1]: Started sshd@31-146.190.127.126:22-139.178.68.195:50028.service - OpenSSH per-connection server daemon (139.178.68.195:50028). May 27 18:18:07.468130 sshd[5605]: Accepted publickey for core from 139.178.68.195 port 50028 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:07.471270 sshd-session[5605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:07.478756 systemd-logind[1520]: New session 32 of user core. May 27 18:18:07.487768 systemd[1]: Started session-32.scope - Session 32 of User core. May 27 18:18:07.632267 sshd[5607]: Connection closed by 139.178.68.195 port 50028 May 27 18:18:07.632987 sshd-session[5605]: pam_unix(sshd:session): session closed for user core May 27 18:18:07.637686 systemd[1]: sshd@31-146.190.127.126:22-139.178.68.195:50028.service: Deactivated successfully. May 27 18:18:07.640270 systemd[1]: session-32.scope: Deactivated successfully. May 27 18:18:07.641896 systemd-logind[1520]: Session 32 logged out. Waiting for processes to exit. May 27 18:18:07.644289 systemd-logind[1520]: Removed session 32. 
May 27 18:18:09.112590 kubelet[2668]: E0527 18:18:09.112404 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:18:12.658337 systemd[1]: Started sshd@32-146.190.127.126:22-139.178.68.195:50032.service - OpenSSH per-connection server daemon (139.178.68.195:50032). May 27 18:18:12.728143 sshd[5622]: Accepted publickey for core from 139.178.68.195 port 50032 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:12.730650 sshd-session[5622]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:12.738920 systemd-logind[1520]: New session 33 of user core. May 27 18:18:12.746152 systemd[1]: Started session-33.scope - Session 33 of User core. May 27 18:18:12.890196 sshd[5624]: Connection closed by 139.178.68.195 port 50032 May 27 18:18:12.891109 sshd-session[5622]: pam_unix(sshd:session): session closed for user core May 27 18:18:12.898125 systemd[1]: sshd@32-146.190.127.126:22-139.178.68.195:50032.service: Deactivated successfully. May 27 18:18:12.903084 systemd[1]: session-33.scope: Deactivated successfully. May 27 18:18:12.905897 systemd-logind[1520]: Session 33 logged out. Waiting for processes to exit. May 27 18:18:12.909137 systemd-logind[1520]: Removed session 33. 
May 27 18:18:14.112309 kubelet[2668]: E0527 18:18:14.111921 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:18:17.113594 kubelet[2668]: E0527 18:18:17.113280 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:18:17.905649 systemd[1]: Started sshd@33-146.190.127.126:22-139.178.68.195:53768.service - OpenSSH per-connection server daemon (139.178.68.195:53768). 
May 27 18:18:17.977841 sshd[5639]: Accepted publickey for core from 139.178.68.195 port 53768 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:17.979964 sshd-session[5639]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:17.985747 systemd-logind[1520]: New session 34 of user core. May 27 18:18:17.996809 systemd[1]: Started session-34.scope - Session 34 of User core. May 27 18:18:18.165939 sshd[5641]: Connection closed by 139.178.68.195 port 53768 May 27 18:18:18.166936 sshd-session[5639]: pam_unix(sshd:session): session closed for user core May 27 18:18:18.173790 systemd[1]: sshd@33-146.190.127.126:22-139.178.68.195:53768.service: Deactivated successfully. May 27 18:18:18.177497 systemd[1]: session-34.scope: Deactivated successfully. May 27 18:18:18.179059 systemd-logind[1520]: Session 34 logged out. Waiting for processes to exit. May 27 18:18:18.182383 systemd-logind[1520]: Removed session 34. May 27 18:18:20.113188 kubelet[2668]: E0527 18:18:20.113125 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:18:21.565470 containerd[1543]: time="2025-05-27T18:18:21.565391807Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"e9c2dac4c898b10bb2f9b049b4b22c9f09717400ac8b82b172821bcb5ae959f1\" pid:5663 
exited_at:{seconds:1748369901 nanos:565056340}" May 27 18:18:23.183306 systemd[1]: Started sshd@34-146.190.127.126:22-139.178.68.195:53784.service - OpenSSH per-connection server daemon (139.178.68.195:53784). May 27 18:18:23.261465 sshd[5675]: Accepted publickey for core from 139.178.68.195 port 53784 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:23.263701 sshd-session[5675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:23.270575 systemd-logind[1520]: New session 35 of user core. May 27 18:18:23.276945 systemd[1]: Started session-35.scope - Session 35 of User core. May 27 18:18:23.439915 sshd[5677]: Connection closed by 139.178.68.195 port 53784 May 27 18:18:23.440902 sshd-session[5675]: pam_unix(sshd:session): session closed for user core May 27 18:18:23.447290 systemd[1]: sshd@34-146.190.127.126:22-139.178.68.195:53784.service: Deactivated successfully. May 27 18:18:23.450669 systemd[1]: session-35.scope: Deactivated successfully. May 27 18:18:23.452875 systemd-logind[1520]: Session 35 logged out. Waiting for processes to exit. May 27 18:18:23.454981 systemd-logind[1520]: Removed session 35. May 27 18:18:28.456289 systemd[1]: Started sshd@35-146.190.127.126:22-139.178.68.195:52622.service - OpenSSH per-connection server daemon (139.178.68.195:52622). May 27 18:18:28.522249 sshd[5690]: Accepted publickey for core from 139.178.68.195 port 52622 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:28.524236 sshd-session[5690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:28.530088 systemd-logind[1520]: New session 36 of user core. May 27 18:18:28.539716 systemd[1]: Started session-36.scope - Session 36 of User core. 
May 27 18:18:28.682280 sshd[5692]: Connection closed by 139.178.68.195 port 52622 May 27 18:18:28.682078 sshd-session[5690]: pam_unix(sshd:session): session closed for user core May 27 18:18:28.688957 systemd[1]: sshd@35-146.190.127.126:22-139.178.68.195:52622.service: Deactivated successfully. May 27 18:18:28.691473 systemd[1]: session-36.scope: Deactivated successfully. May 27 18:18:28.693603 systemd-logind[1520]: Session 36 logged out. Waiting for processes to exit. May 27 18:18:28.696329 systemd-logind[1520]: Removed session 36. May 27 18:18:30.112177 kubelet[2668]: E0527 18:18:30.112127 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:18:31.114203 kubelet[2668]: E0527 18:18:31.114111 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" 
pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:18:32.008112 containerd[1543]: time="2025-05-27T18:18:32.008059049Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"b4d31c534369ff7caec0330fe1ea1c61bd2821a8daa1f38aa0913a710470d5ef\" pid:5716 exited_at:{seconds:1748369912 nanos:7716015}" May 27 18:18:33.113175 kubelet[2668]: E0527 18:18:33.113105 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:18:33.703259 systemd[1]: Started sshd@36-146.190.127.126:22-139.178.68.195:60520.service - OpenSSH per-connection server daemon (139.178.68.195:60520). May 27 18:18:33.771471 sshd[5732]: Accepted publickey for core from 139.178.68.195 port 60520 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:33.774593 sshd-session[5732]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:33.784015 systemd-logind[1520]: New session 37 of user core. May 27 18:18:33.789739 systemd[1]: Started session-37.scope - Session 37 of User core. 
May 27 18:18:33.983765 sshd[5734]: Connection closed by 139.178.68.195 port 60520 May 27 18:18:33.985144 sshd-session[5732]: pam_unix(sshd:session): session closed for user core May 27 18:18:33.991235 systemd[1]: sshd@36-146.190.127.126:22-139.178.68.195:60520.service: Deactivated successfully. May 27 18:18:33.995696 systemd[1]: session-37.scope: Deactivated successfully. May 27 18:18:33.998555 systemd-logind[1520]: Session 37 logged out. Waiting for processes to exit. May 27 18:18:34.002386 systemd-logind[1520]: Removed session 37. May 27 18:18:34.118279 kubelet[2668]: E0527 18:18:34.118211 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:18:34.794271 containerd[1543]: time="2025-05-27T18:18:34.794210618Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"162bfc9a01efb6054651e7047fcfeca0735ad2e82873cfe386882f4c2413f825\" pid:5759 exited_at:{seconds:1748369914 nanos:793339826}" May 27 18:18:39.006117 systemd[1]: Started sshd@37-146.190.127.126:22-139.178.68.195:60534.service - OpenSSH per-connection server daemon (139.178.68.195:60534). May 27 18:18:39.079602 sshd[5772]: Accepted publickey for core from 139.178.68.195 port 60534 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:39.082035 sshd-session[5772]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:39.092026 systemd-logind[1520]: New session 38 of user core. May 27 18:18:39.098801 systemd[1]: Started session-38.scope - Session 38 of User core. May 27 18:18:39.252144 sshd[5776]: Connection closed by 139.178.68.195 port 60534 May 27 18:18:39.253314 sshd-session[5772]: pam_unix(sshd:session): session closed for user core May 27 18:18:39.258085 systemd-logind[1520]: Session 38 logged out. 
Waiting for processes to exit. May 27 18:18:39.258333 systemd[1]: sshd@37-146.190.127.126:22-139.178.68.195:60534.service: Deactivated successfully. May 27 18:18:39.261101 systemd[1]: session-38.scope: Deactivated successfully. May 27 18:18:39.268176 systemd-logind[1520]: Removed session 38. May 27 18:18:44.113393 containerd[1543]: time="2025-05-27T18:18:44.113274680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:18:44.270634 systemd[1]: Started sshd@38-146.190.127.126:22-139.178.68.195:56226.service - OpenSSH per-connection server daemon (139.178.68.195:56226). May 27 18:18:44.336172 sshd[5788]: Accepted publickey for core from 139.178.68.195 port 56226 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:44.338460 sshd-session[5788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:44.354524 systemd-logind[1520]: New session 39 of user core. May 27 18:18:44.366753 systemd[1]: Started session-39.scope - Session 39 of User core. 
May 27 18:18:44.370556 containerd[1543]: time="2025-05-27T18:18:44.370493308Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:18:44.371779 containerd[1543]: time="2025-05-27T18:18:44.371269667Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:18:44.371779 containerd[1543]: time="2025-05-27T18:18:44.371363017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:18:44.371892 kubelet[2668]: E0527 18:18:44.371544 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:18:44.371892 kubelet[2668]: E0527 18:18:44.371601 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:18:44.377285 kubelet[2668]: E0527 18:18:44.377213 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b58c9c6d236b46b38836eca7a8a2bb57,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:18:44.379556 containerd[1543]: time="2025-05-27T18:18:44.379520151Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:18:44.526762 sshd[5791]: Connection closed by 139.178.68.195 port 56226 May 27 18:18:44.527198 sshd-session[5788]: pam_unix(sshd:session): session closed for user core May 27 18:18:44.533205 systemd[1]: sshd@38-146.190.127.126:22-139.178.68.195:56226.service: Deactivated successfully. May 27 18:18:44.537555 systemd[1]: session-39.scope: Deactivated successfully. May 27 18:18:44.539161 systemd-logind[1520]: Session 39 logged out. Waiting for processes to exit. May 27 18:18:44.542675 systemd-logind[1520]: Removed session 39. May 27 18:18:44.621271 containerd[1543]: time="2025-05-27T18:18:44.621039003Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:18:44.623012 containerd[1543]: time="2025-05-27T18:18:44.622872837Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:18:44.625523 kubelet[2668]: E0527 18:18:44.624820 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:18:44.625523 kubelet[2668]: E0527 18:18:44.624884 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:18:44.625523 kubelet[2668]: E0527 18:18:44.625031 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:18:44.626699 kubelet[2668]: E0527 18:18:44.626591 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:18:44.644928 containerd[1543]: time="2025-05-27T18:18:44.623166761Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:18:45.113395 kubelet[2668]: E0527 18:18:45.113326 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:18:49.548384 systemd[1]: Started sshd@39-146.190.127.126:22-139.178.68.195:56228.service - OpenSSH per-connection server daemon (139.178.68.195:56228). May 27 18:18:49.616598 sshd[5804]: Accepted publickey for core from 139.178.68.195 port 56228 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:18:49.619263 sshd-session[5804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:18:49.626969 systemd-logind[1520]: New session 40 of user core. May 27 18:18:49.633847 systemd[1]: Started session-40.scope - Session 40 of User core. May 27 18:18:49.787617 sshd[5806]: Connection closed by 139.178.68.195 port 56228 May 27 18:18:49.788508 sshd-session[5804]: pam_unix(sshd:session): session closed for user core May 27 18:18:49.795191 systemd[1]: sshd@39-146.190.127.126:22-139.178.68.195:56228.service: Deactivated successfully. May 27 18:18:49.795343 systemd-logind[1520]: Session 40 logged out. Waiting for processes to exit. May 27 18:18:49.799604 systemd[1]: session-40.scope: Deactivated successfully. May 27 18:18:49.803846 systemd-logind[1520]: Removed session 40. 
May 27 18:18:51.555525 containerd[1543]: time="2025-05-27T18:18:51.555409516Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"12d1363533fac65f2d92a06b73e9a6613ddee020aceb6c723b6a9733202b6a53\" pid:5829 exited_at:{seconds:1748369931 nanos:554647666}"
May 27 18:18:53.117108 kubelet[2668]: E0527 18:18:53.116819 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:18:54.803584 systemd[1]: Started sshd@40-146.190.127.126:22-139.178.68.195:49484.service - OpenSSH per-connection server daemon (139.178.68.195:49484).
May 27 18:18:54.880857 sshd[5842]: Accepted publickey for core from 139.178.68.195 port 49484 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:18:54.883515 sshd-session[5842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:18:54.890541 systemd-logind[1520]: New session 41 of user core.
May 27 18:18:54.896737 systemd[1]: Started session-41.scope - Session 41 of User core.
May 27 18:18:55.123376 sshd[5844]: Connection closed by 139.178.68.195 port 49484
May 27 18:18:55.124257 sshd-session[5842]: pam_unix(sshd:session): session closed for user core
May 27 18:18:55.129315 systemd[1]: sshd@40-146.190.127.126:22-139.178.68.195:49484.service: Deactivated successfully.
May 27 18:18:55.132242 systemd[1]: session-41.scope: Deactivated successfully.
May 27 18:18:55.133918 systemd-logind[1520]: Session 41 logged out. Waiting for processes to exit.
May 27 18:18:55.136265 systemd-logind[1520]: Removed session 41.
May 27 18:18:56.112178 kubelet[2668]: E0527 18:18:56.112053 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:18:57.111618 kubelet[2668]: E0527 18:18:57.111569 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:18:59.114571 containerd[1543]: time="2025-05-27T18:18:59.114503361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 18:18:59.396628 containerd[1543]: time="2025-05-27T18:18:59.396477641Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:18:59.397698 containerd[1543]: time="2025-05-27T18:18:59.397635424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 18:18:59.398001 containerd[1543]: time="2025-05-27T18:18:59.397637135Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:18:59.398500 kubelet[2668]: E0527 18:18:59.398189 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:18:59.398500 kubelet[2668]: E0527 18:18:59.398243 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 18:18:59.399507 kubelet[2668]: E0527 18:18:59.398404 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPr
opagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dctn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" 
logger="UnhandledError" May 27 18:18:59.400695 kubelet[2668]: E0527 18:18:59.400638 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:19:00.115376 kubelet[2668]: E0527 18:19:00.115303 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:19:00.147951 systemd[1]: Started 
sshd@41-146.190.127.126:22-139.178.68.195:49498.service - OpenSSH per-connection server daemon (139.178.68.195:49498).
May 27 18:19:00.216511 sshd[5877]: Accepted publickey for core from 139.178.68.195 port 49498 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:19:00.219146 sshd-session[5877]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:19:00.228619 systemd-logind[1520]: New session 42 of user core.
May 27 18:19:00.235825 systemd[1]: Started session-42.scope - Session 42 of User core.
May 27 18:19:00.408579 sshd[5879]: Connection closed by 139.178.68.195 port 49498
May 27 18:19:00.410119 sshd-session[5877]: pam_unix(sshd:session): session closed for user core
May 27 18:19:00.417609 systemd[1]: sshd@41-146.190.127.126:22-139.178.68.195:49498.service: Deactivated successfully.
May 27 18:19:00.418095 systemd-logind[1520]: Session 42 logged out. Waiting for processes to exit.
May 27 18:19:00.422174 systemd[1]: session-42.scope: Deactivated successfully.
May 27 18:19:00.427928 systemd-logind[1520]: Removed session 42.
May 27 18:19:04.790071 containerd[1543]: time="2025-05-27T18:19:04.789357796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"65020eabfaa5a40cf721a9db2e57e354a603a2a026fa38b479e0ee4d5fdd6d5b\" pid:5904 exited_at:{seconds:1748369944 nanos:788973021}"
May 27 18:19:05.425386 systemd[1]: Started sshd@42-146.190.127.126:22-139.178.68.195:42380.service - OpenSSH per-connection server daemon (139.178.68.195:42380).
May 27 18:19:05.510942 sshd[5914]: Accepted publickey for core from 139.178.68.195 port 42380 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:19:05.513030 sshd-session[5914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:19:05.521666 systemd-logind[1520]: New session 43 of user core.
May 27 18:19:05.526702 systemd[1]: Started session-43.scope - Session 43 of User core.
May 27 18:19:05.684614 sshd[5916]: Connection closed by 139.178.68.195 port 42380
May 27 18:19:05.685492 sshd-session[5914]: pam_unix(sshd:session): session closed for user core
May 27 18:19:05.691038 systemd-logind[1520]: Session 43 logged out. Waiting for processes to exit.
May 27 18:19:05.691286 systemd[1]: sshd@42-146.190.127.126:22-139.178.68.195:42380.service: Deactivated successfully.
May 27 18:19:05.693960 systemd[1]: session-43.scope: Deactivated successfully.
May 27 18:19:05.696498 systemd-logind[1520]: Removed session 43.
May 27 18:19:07.850042 update_engine[1522]: I20250527 18:19:07.849925  1522 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
May 27 18:19:07.850042 update_engine[1522]: I20250527 18:19:07.850011  1522 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
May 27 18:19:07.852178 update_engine[1522]: I20250527 18:19:07.852118  1522 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
May 27 18:19:07.853299 update_engine[1522]: I20250527 18:19:07.853253  1522 omaha_request_params.cc:62] Current group set to alpha
May 27 18:19:07.853740 update_engine[1522]: I20250527 18:19:07.853643  1522 update_attempter.cc:499] Already updated boot flags. Skipping.
May 27 18:19:07.853740 update_engine[1522]: I20250527 18:19:07.853667  1522 update_attempter.cc:643] Scheduling an action processor start.
May 27 18:19:07.853740 update_engine[1522]: I20250527 18:19:07.853691  1522 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 27 18:19:07.853911 update_engine[1522]: I20250527 18:19:07.853746  1522 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
May 27 18:19:07.853911 update_engine[1522]: I20250527 18:19:07.853837  1522 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 27 18:19:07.853911 update_engine[1522]: I20250527 18:19:07.853844  1522 omaha_request_action.cc:272] Request:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]:
May 27 18:19:07.853911 update_engine[1522]: I20250527 18:19:07.853852  1522 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 27 18:19:07.865839 update_engine[1522]: I20250527 18:19:07.865621  1522 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 27 18:19:07.866128 update_engine[1522]: I20250527 18:19:07.866031  1522 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 27 18:19:07.866456 update_engine[1522]: E20250527 18:19:07.866359  1522 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 27 18:19:07.866541 update_engine[1522]: I20250527 18:19:07.866485  1522 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
May 27 18:19:07.886294 locksmithd[1558]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
May 27 18:19:10.703118 systemd[1]: Started sshd@43-146.190.127.126:22-139.178.68.195:42392.service - OpenSSH per-connection server daemon (139.178.68.195:42392).
May 27 18:19:10.774584 sshd[5931]: Accepted publickey for core from 139.178.68.195 port 42392 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:19:10.776998 sshd-session[5931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:19:10.784410 systemd-logind[1520]: New session 44 of user core.
May 27 18:19:10.791773 systemd[1]: Started session-44.scope - Session 44 of User core.
May 27 18:19:11.250348 sshd[5933]: Connection closed by 139.178.68.195 port 42392
May 27 18:19:11.252248 sshd-session[5931]: pam_unix(sshd:session): session closed for user core
May 27 18:19:11.265390 systemd[1]: sshd@43-146.190.127.126:22-139.178.68.195:42392.service: Deactivated successfully.
May 27 18:19:11.268180 systemd[1]: session-44.scope: Deactivated successfully.
May 27 18:19:11.269419 systemd-logind[1520]: Session 44 logged out. Waiting for processes to exit.
May 27 18:19:11.272198 systemd-logind[1520]: Removed session 44.
May 27 18:19:12.111310 kubelet[2668]: E0527 18:19:12.111186 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:19:12.114975 kubelet[2668]: E0527 18:19:12.114226 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:19:15.114146 kubelet[2668]: E0527 18:19:15.113904 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:19:16.268645 systemd[1]: Started sshd@44-146.190.127.126:22-139.178.68.195:36414.service - OpenSSH per-connection server daemon (139.178.68.195:36414).
May 27 18:19:16.388178 sshd[5948]: Accepted publickey for core from 139.178.68.195 port 36414 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:19:16.389935 sshd-session[5948]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:19:16.398748 systemd-logind[1520]: New session 45 of user core.
May 27 18:19:16.408828 systemd[1]: Started session-45.scope - Session 45 of User core.
May 27 18:19:16.663996 sshd[5950]: Connection closed by 139.178.68.195 port 36414
May 27 18:19:16.665026 sshd-session[5948]: pam_unix(sshd:session): session closed for user core
May 27 18:19:16.672144 systemd-logind[1520]: Session 45 logged out. Waiting for processes to exit.
May 27 18:19:16.673323 systemd[1]: sshd@44-146.190.127.126:22-139.178.68.195:36414.service: Deactivated successfully.
May 27 18:19:16.677313 systemd[1]: session-45.scope: Deactivated successfully.
May 27 18:19:16.681876 systemd-logind[1520]: Removed session 45.
May 27 18:19:17.111237 kubelet[2668]: E0527 18:19:17.111188 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:19:17.780526 update_engine[1522]: I20250527 18:19:17.779959  1522 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 27 18:19:17.780526 update_engine[1522]: I20250527 18:19:17.780267  1522 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 27 18:19:17.786210 update_engine[1522]: I20250527 18:19:17.785846  1522 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 27 18:19:17.786210 update_engine[1522]: E20250527 18:19:17.786114  1522 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 27 18:19:17.786386 update_engine[1522]: I20250527 18:19:17.786273  1522 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
May 27 18:19:21.586514 containerd[1543]: time="2025-05-27T18:19:21.585369578Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"5221b3817e61834b18d8ed56eb31a0ef6ff8f5281d396e4cc782fef9d89b6198\" pid:5973 exited_at:{seconds:1748369961 nanos:584511165}"
May 27 18:19:21.691941 systemd[1]: Started sshd@45-146.190.127.126:22-139.178.68.195:36430.service - OpenSSH per-connection server daemon (139.178.68.195:36430).
May 27 18:19:21.783627 sshd[5987]: Accepted publickey for core from 139.178.68.195 port 36430 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:19:21.787091 sshd-session[5987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:19:21.793953 systemd-logind[1520]: New session 46 of user core.
May 27 18:19:21.799796 systemd[1]: Started session-46.scope - Session 46 of User core.
May 27 18:19:22.129508 sshd[5989]: Connection closed by 139.178.68.195 port 36430
May 27 18:19:22.130947 sshd-session[5987]: pam_unix(sshd:session): session closed for user core
May 27 18:19:22.135049 systemd[1]: sshd@45-146.190.127.126:22-139.178.68.195:36430.service: Deactivated successfully.
May 27 18:19:22.138020 systemd[1]: session-46.scope: Deactivated successfully.
May 27 18:19:22.142454 systemd-logind[1520]: Session 46 logged out. Waiting for processes to exit.
May 27 18:19:22.147114 systemd-logind[1520]: Removed session 46.
May 27 18:19:27.113413 kubelet[2668]: E0527 18:19:27.113069 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:19:27.151581 systemd[1]: Started sshd@46-146.190.127.126:22-139.178.68.195:44298.service - OpenSSH per-connection server daemon (139.178.68.195:44298). May 27 18:19:27.221603 sshd[6000]: Accepted publickey for core from 139.178.68.195 port 44298 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:27.225159 sshd-session[6000]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:27.234526 systemd-logind[1520]: New session 47 of user core. May 27 18:19:27.242716 systemd[1]: Started session-47.scope - Session 47 of User core. May 27 18:19:27.411511 sshd[6002]: Connection closed by 139.178.68.195 port 44298 May 27 18:19:27.412701 sshd-session[6000]: pam_unix(sshd:session): session closed for user core May 27 18:19:27.425747 systemd[1]: sshd@46-146.190.127.126:22-139.178.68.195:44298.service: Deactivated successfully. May 27 18:19:27.428788 systemd[1]: session-47.scope: Deactivated successfully. May 27 18:19:27.430530 systemd-logind[1520]: Session 47 logged out. Waiting for processes to exit. May 27 18:19:27.437410 systemd[1]: Started sshd@47-146.190.127.126:22-139.178.68.195:44306.service - OpenSSH per-connection server daemon (139.178.68.195:44306). 
May 27 18:19:27.439058 systemd-logind[1520]: Removed session 47. May 27 18:19:27.503493 sshd[6014]: Accepted publickey for core from 139.178.68.195 port 44306 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:27.505539 sshd-session[6014]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:27.511535 systemd-logind[1520]: New session 48 of user core. May 27 18:19:27.518701 systemd[1]: Started session-48.scope - Session 48 of User core. May 27 18:19:27.779721 update_engine[1522]: I20250527 18:19:27.779506 1522 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:19:27.780254 update_engine[1522]: I20250527 18:19:27.779991 1522 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:19:27.781178 update_engine[1522]: I20250527 18:19:27.780455 1522 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 18:19:27.781178 update_engine[1522]: E20250527 18:19:27.780974 1522 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:19:27.781178 update_engine[1522]: I20250527 18:19:27.781096 1522 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 27 18:19:27.899000 sshd[6016]: Connection closed by 139.178.68.195 port 44306 May 27 18:19:27.900477 sshd-session[6014]: pam_unix(sshd:session): session closed for user core May 27 18:19:27.915185 systemd[1]: Started sshd@48-146.190.127.126:22-139.178.68.195:44312.service - OpenSSH per-connection server daemon (139.178.68.195:44312). May 27 18:19:27.925763 systemd[1]: sshd@47-146.190.127.126:22-139.178.68.195:44306.service: Deactivated successfully. May 27 18:19:27.932358 systemd[1]: session-48.scope: Deactivated successfully. May 27 18:19:27.934525 systemd-logind[1520]: Session 48 logged out. Waiting for processes to exit. May 27 18:19:27.938682 systemd-logind[1520]: Removed session 48. 
May 27 18:19:28.054161 sshd[6023]: Accepted publickey for core from 139.178.68.195 port 44312 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:28.057187 sshd-session[6023]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:28.064190 systemd-logind[1520]: New session 49 of user core. May 27 18:19:28.075869 systemd[1]: Started session-49.scope - Session 49 of User core. May 27 18:19:29.185400 kubelet[2668]: E0527 18:19:29.184933 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:19:29.281545 sshd[6028]: Connection closed by 139.178.68.195 port 44312 May 27 18:19:29.283875 sshd-session[6023]: pam_unix(sshd:session): session closed for user core May 27 18:19:29.296367 systemd[1]: Started sshd@49-146.190.127.126:22-139.178.68.195:44318.service - 
OpenSSH per-connection server daemon (139.178.68.195:44318). May 27 18:19:29.299631 systemd[1]: sshd@48-146.190.127.126:22-139.178.68.195:44312.service: Deactivated successfully. May 27 18:19:29.308388 systemd[1]: session-49.scope: Deactivated successfully. May 27 18:19:29.318058 systemd-logind[1520]: Session 49 logged out. Waiting for processes to exit. May 27 18:19:29.326387 systemd-logind[1520]: Removed session 49. May 27 18:19:29.380837 sshd[6040]: Accepted publickey for core from 139.178.68.195 port 44318 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:29.385115 sshd-session[6040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:29.398963 systemd-logind[1520]: New session 50 of user core. May 27 18:19:29.403819 systemd[1]: Started session-50.scope - Session 50 of User core. May 27 18:19:30.384399 sshd[6047]: Connection closed by 139.178.68.195 port 44318 May 27 18:19:30.384845 sshd-session[6040]: pam_unix(sshd:session): session closed for user core May 27 18:19:30.400348 systemd[1]: sshd@49-146.190.127.126:22-139.178.68.195:44318.service: Deactivated successfully. May 27 18:19:30.404350 systemd[1]: session-50.scope: Deactivated successfully. May 27 18:19:30.407076 systemd-logind[1520]: Session 50 logged out. Waiting for processes to exit. May 27 18:19:30.414404 systemd[1]: Started sshd@50-146.190.127.126:22-139.178.68.195:44334.service - OpenSSH per-connection server daemon (139.178.68.195:44334). May 27 18:19:30.421093 systemd-logind[1520]: Removed session 50. May 27 18:19:30.490016 sshd[6057]: Accepted publickey for core from 139.178.68.195 port 44334 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:30.492544 sshd-session[6057]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:30.500815 systemd-logind[1520]: New session 51 of user core. May 27 18:19:30.510891 systemd[1]: Started session-51.scope - Session 51 of User core. 
May 27 18:19:30.693244 sshd[6059]: Connection closed by 139.178.68.195 port 44334 May 27 18:19:30.694048 sshd-session[6057]: pam_unix(sshd:session): session closed for user core May 27 18:19:30.700101 systemd[1]: sshd@50-146.190.127.126:22-139.178.68.195:44334.service: Deactivated successfully. May 27 18:19:30.704314 systemd[1]: session-51.scope: Deactivated successfully. May 27 18:19:30.705673 systemd-logind[1520]: Session 51 logged out. Waiting for processes to exit. May 27 18:19:30.708144 systemd-logind[1520]: Removed session 51. May 27 18:19:32.081088 containerd[1543]: time="2025-05-27T18:19:32.081029981Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"c1355020e58121a1cc2917862795c121a916cafdc5bd1383e9b0f984a1fb717f\" pid:6082 exited_at:{seconds:1748369972 nanos:80636057}" May 27 18:19:34.801951 containerd[1543]: time="2025-05-27T18:19:34.801870720Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"d9aa482c41843d5c3c54f66672ad8227606e2d0daf226ee26335cb0605d20352\" pid:6106 exited_at:{seconds:1748369974 nanos:801041662}" May 27 18:19:35.712628 systemd[1]: Started sshd@51-146.190.127.126:22-139.178.68.195:44084.service - OpenSSH per-connection server daemon (139.178.68.195:44084). May 27 18:19:35.780229 sshd[6116]: Accepted publickey for core from 139.178.68.195 port 44084 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:35.782361 sshd-session[6116]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:35.789965 systemd-logind[1520]: New session 52 of user core. May 27 18:19:35.798809 systemd[1]: Started session-52.scope - Session 52 of User core. 
May 27 18:19:35.960135 sshd[6118]: Connection closed by 139.178.68.195 port 44084 May 27 18:19:35.961057 sshd-session[6116]: pam_unix(sshd:session): session closed for user core May 27 18:19:35.966208 systemd[1]: sshd@51-146.190.127.126:22-139.178.68.195:44084.service: Deactivated successfully. May 27 18:19:35.969130 systemd[1]: session-52.scope: Deactivated successfully. May 27 18:19:35.970933 systemd-logind[1520]: Session 52 logged out. Waiting for processes to exit. May 27 18:19:35.973170 systemd-logind[1520]: Removed session 52. May 27 18:19:37.781804 update_engine[1522]: I20250527 18:19:37.781332 1522 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:19:37.782970 update_engine[1522]: I20250527 18:19:37.782569 1522 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:19:37.782970 update_engine[1522]: I20250527 18:19:37.782905 1522 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 18:19:37.783285 update_engine[1522]: E20250527 18:19:37.783258 1522 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:19:37.783462 update_engine[1522]: I20250527 18:19:37.783423 1522 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 18:19:37.783829 update_engine[1522]: I20250527 18:19:37.783628 1522 omaha_request_action.cc:617] Omaha request response: May 27 18:19:37.783829 update_engine[1522]: E20250527 18:19:37.783725 1522 omaha_request_action.cc:636] Omaha request network transfer failed. May 27 18:19:37.791861 update_engine[1522]: I20250527 18:19:37.787539 1522 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. 
May 27 18:19:37.791861 update_engine[1522]: I20250527 18:19:37.787595 1522 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 18:19:37.791861 update_engine[1522]: I20250527 18:19:37.787604 1522 update_attempter.cc:306] Processing Done. May 27 18:19:37.791861 update_engine[1522]: E20250527 18:19:37.787626 1522 update_attempter.cc:619] Update failed. May 27 18:19:37.791861 update_engine[1522]: I20250527 18:19:37.787633 1522 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 27 18:19:37.791861 update_engine[1522]: I20250527 18:19:37.787638 1522 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 27 18:19:37.791861 update_engine[1522]: I20250527 18:19:37.787648 1522 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. May 27 18:19:37.792210 update_engine[1522]: I20250527 18:19:37.792152 1522 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 18:19:37.792256 update_engine[1522]: I20250527 18:19:37.792230 1522 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 18:19:37.792256 update_engine[1522]: I20250527 18:19:37.792248 1522 omaha_request_action.cc:272] Request: May 27 18:19:37.792256 update_engine[1522]: May 27 18:19:37.792256 update_engine[1522]: May 27 18:19:37.792256 update_engine[1522]: May 27 18:19:37.792256 update_engine[1522]: May 27 18:19:37.792256 update_engine[1522]: May 27 18:19:37.792256 update_engine[1522]: May 27 18:19:37.792454 update_engine[1522]: I20250527 18:19:37.792258 1522 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 18:19:37.792708 update_engine[1522]: I20250527 18:19:37.792570 1522 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 18:19:37.792763 locksmithd[1558]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 27 
18:19:37.793197 update_engine[1522]: I20250527 18:19:37.792937 1522 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 18:19:37.793294 update_engine[1522]: E20250527 18:19:37.793246 1522 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 18:19:37.793337 update_engine[1522]: I20250527 18:19:37.793318 1522 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 18:19:37.793337 update_engine[1522]: I20250527 18:19:37.793327 1522 omaha_request_action.cc:617] Omaha request response: May 27 18:19:37.793412 update_engine[1522]: I20250527 18:19:37.793336 1522 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 18:19:37.793412 update_engine[1522]: I20250527 18:19:37.793345 1522 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 18:19:37.793412 update_engine[1522]: I20250527 18:19:37.793354 1522 update_attempter.cc:306] Processing Done. May 27 18:19:37.793412 update_engine[1522]: I20250527 18:19:37.793362 1522 update_attempter.cc:310] Error event sent. May 27 18:19:37.793412 update_engine[1522]: I20250527 18:19:37.793375 1522 update_check_scheduler.cc:74] Next update check in 49m58s May 27 18:19:37.793848 locksmithd[1558]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 27 18:19:40.982161 systemd[1]: Started sshd@52-146.190.127.126:22-139.178.68.195:44090.service - OpenSSH per-connection server daemon (139.178.68.195:44090). May 27 18:19:41.060496 sshd[6130]: Accepted publickey for core from 139.178.68.195 port 44090 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:41.062323 sshd-session[6130]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:41.070553 systemd-logind[1520]: New session 53 of user core. 
May 27 18:19:41.084852 systemd[1]: Started session-53.scope - Session 53 of User core. May 27 18:19:41.113055 kubelet[2668]: E0527 18:19:41.113008 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:19:41.240146 sshd[6132]: Connection closed by 139.178.68.195 port 44090 May 27 18:19:41.241141 sshd-session[6130]: pam_unix(sshd:session): session closed for user core May 27 18:19:41.248011 systemd[1]: sshd@52-146.190.127.126:22-139.178.68.195:44090.service: Deactivated successfully. May 27 18:19:41.253236 systemd[1]: session-53.scope: Deactivated successfully. May 27 18:19:41.256823 systemd-logind[1520]: Session 53 logged out. Waiting for processes to exit. May 27 18:19:41.259866 systemd-logind[1520]: Removed session 53. 
May 27 18:19:43.114319 kubelet[2668]: E0527 18:19:43.113951 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:19:44.114604 kubelet[2668]: E0527 18:19:44.114562 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:19:45.112105 kubelet[2668]: E0527 18:19:45.111949 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:19:46.258089 systemd[1]: Started sshd@53-146.190.127.126:22-139.178.68.195:53836.service - OpenSSH per-connection server daemon (139.178.68.195:53836). 
May 27 18:19:46.329015 sshd[6145]: Accepted publickey for core from 139.178.68.195 port 53836 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:46.331300 sshd-session[6145]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:46.338928 systemd-logind[1520]: New session 54 of user core. May 27 18:19:46.347784 systemd[1]: Started session-54.scope - Session 54 of User core. May 27 18:19:46.500303 sshd[6147]: Connection closed by 139.178.68.195 port 53836 May 27 18:19:46.501391 sshd-session[6145]: pam_unix(sshd:session): session closed for user core May 27 18:19:46.507185 systemd[1]: sshd@53-146.190.127.126:22-139.178.68.195:53836.service: Deactivated successfully. May 27 18:19:46.510213 systemd[1]: session-54.scope: Deactivated successfully. May 27 18:19:46.512712 systemd-logind[1520]: Session 54 logged out. Waiting for processes to exit. May 27 18:19:46.514658 systemd-logind[1520]: Removed session 54. May 27 18:19:51.520794 systemd[1]: Started sshd@54-146.190.127.126:22-139.178.68.195:53852.service - OpenSSH per-connection server daemon (139.178.68.195:53852). May 27 18:19:51.604948 sshd[6181]: Accepted publickey for core from 139.178.68.195 port 53852 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:51.608755 sshd-session[6181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:51.619071 systemd-logind[1520]: New session 55 of user core. May 27 18:19:51.624379 containerd[1543]: time="2025-05-27T18:19:51.624324940Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"d217fc4a726af71fd6c82dc19f79768e192109ccd4c375ab9a7697003fd45acc\" pid:6169 exited_at:{seconds:1748369991 nanos:623194101}" May 27 18:19:51.626915 systemd[1]: Started session-55.scope - Session 55 of User core. 
May 27 18:19:51.803781 sshd[6184]: Connection closed by 139.178.68.195 port 53852 May 27 18:19:51.805118 sshd-session[6181]: pam_unix(sshd:session): session closed for user core May 27 18:19:51.812989 systemd-logind[1520]: Session 55 logged out. Waiting for processes to exit. May 27 18:19:51.813778 systemd[1]: sshd@54-146.190.127.126:22-139.178.68.195:53852.service: Deactivated successfully. May 27 18:19:51.818106 systemd[1]: session-55.scope: Deactivated successfully. May 27 18:19:51.822956 systemd-logind[1520]: Removed session 55. May 27 18:19:52.114089 kubelet[2668]: E0527 18:19:52.113005 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:19:56.113365 kubelet[2668]: E0527 18:19:56.113277 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:19:56.825824 systemd[1]: Started sshd@55-146.190.127.126:22-139.178.68.195:41050.service - OpenSSH per-connection server daemon (139.178.68.195:41050). May 27 18:19:56.898127 sshd[6196]: Accepted publickey for core from 139.178.68.195 port 41050 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:19:56.900579 sshd-session[6196]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:19:56.909023 systemd-logind[1520]: New session 56 of user core. May 27 18:19:56.913756 systemd[1]: Started session-56.scope - Session 56 of User core. May 27 18:19:57.061867 sshd[6198]: Connection closed by 139.178.68.195 port 41050 May 27 18:19:57.062407 sshd-session[6196]: pam_unix(sshd:session): session closed for user core May 27 18:19:57.069748 systemd[1]: sshd@55-146.190.127.126:22-139.178.68.195:41050.service: Deactivated successfully. May 27 18:19:57.076487 systemd[1]: session-56.scope: Deactivated successfully. May 27 18:19:57.079859 systemd-logind[1520]: Session 56 logged out. Waiting for processes to exit. May 27 18:19:57.084087 systemd-logind[1520]: Removed session 56. May 27 18:20:02.091748 systemd[1]: Started sshd@56-146.190.127.126:22-139.178.68.195:41064.service - OpenSSH per-connection server daemon (139.178.68.195:41064). 
May 27 18:20:02.210530 sshd[6209]: Accepted publickey for core from 139.178.68.195 port 41064 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:02.216989 sshd-session[6209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:02.232785 systemd-logind[1520]: New session 57 of user core. May 27 18:20:02.240850 systemd[1]: Started session-57.scope - Session 57 of User core. May 27 18:20:02.499659 sshd[6211]: Connection closed by 139.178.68.195 port 41064 May 27 18:20:02.500245 sshd-session[6209]: pam_unix(sshd:session): session closed for user core May 27 18:20:02.509621 systemd[1]: sshd@56-146.190.127.126:22-139.178.68.195:41064.service: Deactivated successfully. May 27 18:20:02.520151 systemd[1]: session-57.scope: Deactivated successfully. May 27 18:20:02.529810 systemd-logind[1520]: Session 57 logged out. Waiting for processes to exit. May 27 18:20:02.533559 systemd-logind[1520]: Removed session 57. May 27 18:20:03.631614 containerd[1543]: time="2025-05-27T18:20:03.554664603Z" level=warning msg="container event discarded" container=b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a type=CONTAINER_CREATED_EVENT May 27 18:20:03.727350 containerd[1543]: time="2025-05-27T18:20:03.727229120Z" level=warning msg="container event discarded" container=b71ba0f0440fb2019f59702bf81336f19930f7f8078502591d78c676d53fb06a type=CONTAINER_STARTED_EVENT May 27 18:20:03.727350 containerd[1543]: time="2025-05-27T18:20:03.727299425Z" level=warning msg="container event discarded" container=cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a type=CONTAINER_CREATED_EVENT May 27 18:20:03.727350 containerd[1543]: time="2025-05-27T18:20:03.727311679Z" level=warning msg="container event discarded" container=cbc4dea54516389535ed9010272ee6a5ef584fe710e3122f61c24444c95de71a type=CONTAINER_STARTED_EVENT May 27 18:20:03.727350 containerd[1543]: time="2025-05-27T18:20:03.727328560Z" level=warning msg="container 
event discarded" container=24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f type=CONTAINER_CREATED_EVENT May 27 18:20:03.727350 containerd[1543]: time="2025-05-27T18:20:03.727341168Z" level=warning msg="container event discarded" container=20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f type=CONTAINER_CREATED_EVENT May 27 18:20:03.727350 containerd[1543]: time="2025-05-27T18:20:03.727356168Z" level=warning msg="container event discarded" container=20ae5a8e2bf286b09ba1a7e10fddf56802a3b35db73826f34c29228d5b1aca9f type=CONTAINER_STARTED_EVENT May 27 18:20:03.727350 containerd[1543]: time="2025-05-27T18:20:03.727364488Z" level=warning msg="container event discarded" container=b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710 type=CONTAINER_CREATED_EVENT May 27 18:20:03.727911 containerd[1543]: time="2025-05-27T18:20:03.727374029Z" level=warning msg="container event discarded" container=ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80 type=CONTAINER_CREATED_EVENT May 27 18:20:03.798356 containerd[1543]: time="2025-05-27T18:20:03.798253137Z" level=warning msg="container event discarded" container=24fa8905e40320b64978f0bf1723397043bcf3fb5e0b3df4f47006972379005f type=CONTAINER_STARTED_EVENT May 27 18:20:03.814734 containerd[1543]: time="2025-05-27T18:20:03.814613010Z" level=warning msg="container event discarded" container=b8141abc808c60bc02684a7a34b0df5a87993331e55fe3eb026277765620d710 type=CONTAINER_STARTED_EVENT May 27 18:20:03.894373 containerd[1543]: time="2025-05-27T18:20:03.893683074Z" level=warning msg="container event discarded" container=ac97c0914d6ed9fb061f2297c2ac82962f0b38d902dc5d840a9916bca0c1cb80 type=CONTAINER_STARTED_EVENT May 27 18:20:04.798697 containerd[1543]: time="2025-05-27T18:20:04.798640825Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" 
id:\"4337b7d11e4fbf237c755df1eb9393bc295f10ed2c25cb9a872b38dddb49c832\" pid:6236 exited_at:{seconds:1748370004 nanos:798056429}" May 27 18:20:05.113668 kubelet[2668]: E0527 18:20:05.113383 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:20:07.113807 kubelet[2668]: E0527 18:20:07.113698 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:20:07.516593 systemd[1]: Started sshd@57-146.190.127.126:22-139.178.68.195:47632.service - OpenSSH per-connection server daemon (139.178.68.195:47632). May 27 18:20:07.593603 sshd[6246]: Accepted publickey for core from 139.178.68.195 port 47632 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:07.596421 sshd-session[6246]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:07.606109 systemd-logind[1520]: New session 58 of user core. May 27 18:20:07.611791 systemd[1]: Started session-58.scope - Session 58 of User core. May 27 18:20:07.775377 sshd[6248]: Connection closed by 139.178.68.195 port 47632 May 27 18:20:07.776282 sshd-session[6246]: pam_unix(sshd:session): session closed for user core May 27 18:20:07.783130 systemd[1]: sshd@57-146.190.127.126:22-139.178.68.195:47632.service: Deactivated successfully. May 27 18:20:07.788366 systemd[1]: session-58.scope: Deactivated successfully. May 27 18:20:07.790517 systemd-logind[1520]: Session 58 logged out. Waiting for processes to exit. May 27 18:20:07.793486 systemd-logind[1520]: Removed session 58. May 27 18:20:12.113129 kubelet[2668]: E0527 18:20:12.113088 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:20:12.791180 systemd[1]: Started sshd@58-146.190.127.126:22-139.178.68.195:47646.service - OpenSSH per-connection server daemon (139.178.68.195:47646). 
May 27 18:20:12.860041 sshd[6262]: Accepted publickey for core from 139.178.68.195 port 47646 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:12.862267 sshd-session[6262]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:12.870766 systemd-logind[1520]: New session 59 of user core. May 27 18:20:12.882771 systemd[1]: Started session-59.scope - Session 59 of User core. May 27 18:20:13.024986 sshd[6264]: Connection closed by 139.178.68.195 port 47646 May 27 18:20:13.026078 sshd-session[6262]: pam_unix(sshd:session): session closed for user core May 27 18:20:13.035287 systemd-logind[1520]: Session 59 logged out. Waiting for processes to exit. May 27 18:20:13.036187 systemd[1]: sshd@58-146.190.127.126:22-139.178.68.195:47646.service: Deactivated successfully. May 27 18:20:13.040932 systemd[1]: session-59.scope: Deactivated successfully. May 27 18:20:13.044563 systemd-logind[1520]: Removed session 59. May 27 18:20:13.112125 kubelet[2668]: E0527 18:20:13.111509 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:20:14.113228 kubelet[2668]: E0527 18:20:14.113118 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:20:15.418235 containerd[1543]: time="2025-05-27T18:20:15.418096000Z" level=warning msg="container event discarded" container=5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0 type=CONTAINER_CREATED_EVENT May 27 18:20:15.418235 containerd[1543]: time="2025-05-27T18:20:15.418188646Z" level=warning msg="container event discarded" container=5af9526ee9a4fef232627600828ea67614d14c5b3a41c5011eb249c66897d1a0 type=CONTAINER_STARTED_EVENT May 27 18:20:15.446527 containerd[1543]: 
time="2025-05-27T18:20:15.446414078Z" level=warning msg="container event discarded" container=1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8 type=CONTAINER_CREATED_EVENT May 27 18:20:15.555929 containerd[1543]: time="2025-05-27T18:20:15.555794561Z" level=warning msg="container event discarded" container=1beb0084cf04d2626db20ff865f6f99f644acfc5d5088852f8461a6d9c31d6d8 type=CONTAINER_STARTED_EVENT May 27 18:20:15.627784 containerd[1543]: time="2025-05-27T18:20:15.627703621Z" level=warning msg="container event discarded" container=46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673 type=CONTAINER_CREATED_EVENT May 27 18:20:15.627784 containerd[1543]: time="2025-05-27T18:20:15.627765642Z" level=warning msg="container event discarded" container=46cad09c3894210fd9b3a97603bfcd693e6f6a1203195a10fe19133c9aa82673 type=CONTAINER_STARTED_EVENT May 27 18:20:18.060789 systemd[1]: Started sshd@59-146.190.127.126:22-139.178.68.195:39662.service - OpenSSH per-connection server daemon (139.178.68.195:39662). May 27 18:20:18.130312 sshd[6278]: Accepted publickey for core from 139.178.68.195 port 39662 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:18.132328 sshd-session[6278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:18.138681 systemd-logind[1520]: New session 60 of user core. May 27 18:20:18.145774 systemd[1]: Started session-60.scope - Session 60 of User core. May 27 18:20:18.217586 containerd[1543]: time="2025-05-27T18:20:18.217518079Z" level=warning msg="container event discarded" container=9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba type=CONTAINER_CREATED_EVENT May 27 18:20:18.289462 sshd[6280]: Connection closed by 139.178.68.195 port 39662 May 27 18:20:18.290596 sshd-session[6278]: pam_unix(sshd:session): session closed for user core May 27 18:20:18.297179 systemd-logind[1520]: Session 60 logged out. Waiting for processes to exit. 
May 27 18:20:18.297965 systemd[1]: sshd@59-146.190.127.126:22-139.178.68.195:39662.service: Deactivated successfully. May 27 18:20:18.300624 systemd[1]: session-60.scope: Deactivated successfully. May 27 18:20:18.305097 systemd-logind[1520]: Removed session 60. May 27 18:20:18.316187 containerd[1543]: time="2025-05-27T18:20:18.316023810Z" level=warning msg="container event discarded" container=9da6560a3f84068ec512461e4da61494562f699e533e34b0eb9a584c3bf80eba type=CONTAINER_STARTED_EVENT May 27 18:20:19.111952 kubelet[2668]: E0527 18:20:19.111834 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:20:20.124412 kubelet[2668]: E0527 18:20:20.123872 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:20:21.545688 containerd[1543]: time="2025-05-27T18:20:21.545616775Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"9ae81613d8b808c6c77f8a9cea92b870d5d1fe3333ac7d90cc8d992b6ac6e695\" pid:6301 exited_at:{seconds:1748370021 nanos:545154716}" May 27 18:20:22.114123 kubelet[2668]: E0527 18:20:22.114034 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:20:23.111853 kubelet[2668]: E0527 18:20:23.111688 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:20:23.310888 systemd[1]: Started sshd@60-146.190.127.126:22-139.178.68.195:39678.service - OpenSSH per-connection server daemon (139.178.68.195:39678). May 27 18:20:23.384547 sshd[6314]: Accepted publickey for core from 139.178.68.195 port 39678 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:23.389998 sshd-session[6314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:23.402699 systemd-logind[1520]: New session 61 of user core. May 27 18:20:23.410750 systemd[1]: Started session-61.scope - Session 61 of User core. 
May 27 18:20:23.779582 sshd[6316]: Connection closed by 139.178.68.195 port 39678 May 27 18:20:23.793414 sshd-session[6314]: pam_unix(sshd:session): session closed for user core May 27 18:20:23.812813 systemd-logind[1520]: Session 61 logged out. Waiting for processes to exit. May 27 18:20:23.813113 systemd[1]: sshd@60-146.190.127.126:22-139.178.68.195:39678.service: Deactivated successfully. May 27 18:20:23.816156 systemd[1]: session-61.scope: Deactivated successfully. May 27 18:20:23.820494 systemd-logind[1520]: Removed session 61. May 27 18:20:28.795427 systemd[1]: Started sshd@61-146.190.127.126:22-139.178.68.195:45950.service - OpenSSH per-connection server daemon (139.178.68.195:45950). May 27 18:20:28.900297 sshd[6342]: Accepted publickey for core from 139.178.68.195 port 45950 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:28.902886 sshd-session[6342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:28.910373 systemd-logind[1520]: New session 62 of user core. May 27 18:20:28.918794 systemd[1]: Started session-62.scope - Session 62 of User core. May 27 18:20:29.153120 sshd[6344]: Connection closed by 139.178.68.195 port 45950 May 27 18:20:29.153607 sshd-session[6342]: pam_unix(sshd:session): session closed for user core May 27 18:20:29.167094 systemd[1]: sshd@61-146.190.127.126:22-139.178.68.195:45950.service: Deactivated successfully. May 27 18:20:29.167548 systemd-logind[1520]: Session 62 logged out. Waiting for processes to exit. May 27 18:20:29.172060 systemd[1]: session-62.scope: Deactivated successfully. May 27 18:20:29.176279 systemd-logind[1520]: Removed session 62. 
May 27 18:20:30.113864 containerd[1543]: time="2025-05-27T18:20:30.113686323Z" level=warning msg="container event discarded" container=889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e type=CONTAINER_CREATED_EVENT May 27 18:20:30.113864 containerd[1543]: time="2025-05-27T18:20:30.113793432Z" level=warning msg="container event discarded" container=889bba814c9e6e9f261acdf40ff1a7814556a8cdf542770a569c16f826a5e74e type=CONTAINER_STARTED_EVENT May 27 18:20:30.351262 containerd[1543]: time="2025-05-27T18:20:30.351182365Z" level=warning msg="container event discarded" container=e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102 type=CONTAINER_CREATED_EVENT May 27 18:20:30.351262 containerd[1543]: time="2025-05-27T18:20:30.351248437Z" level=warning msg="container event discarded" container=e1edfeb6bc174344d1893d9d9a496ed2407a3d49caba857eb73d775fe58fd102 type=CONTAINER_STARTED_EVENT May 27 18:20:32.019532 containerd[1543]: time="2025-05-27T18:20:32.019474347Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"a06e68697d89e89e9da9f0c5f752a32fcae255e50f469ad08e43de647f7c4f21\" pid:6369 exited_at:{seconds:1748370032 nanos:17418380}" May 27 18:20:32.993984 containerd[1543]: time="2025-05-27T18:20:32.993831849Z" level=warning msg="container event discarded" container=f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406 type=CONTAINER_CREATED_EVENT May 27 18:20:33.149516 containerd[1543]: time="2025-05-27T18:20:33.149419348Z" level=warning msg="container event discarded" container=f35266d878fb26d9036046931a867d725d6e150a9e4c78e645fb6dad03893406 type=CONTAINER_STARTED_EVENT May 27 18:20:34.115235 kubelet[2668]: E0527 18:20:34.114705 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed 
to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:20:34.171051 systemd[1]: Started sshd@62-146.190.127.126:22-139.178.68.195:34178.service - OpenSSH per-connection server daemon (139.178.68.195:34178). May 27 18:20:34.265625 sshd[6379]: Accepted publickey for core from 139.178.68.195 port 34178 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:34.267817 sshd-session[6379]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:34.275915 systemd-logind[1520]: New session 63 of user core. May 27 18:20:34.286858 systemd[1]: Started session-63.scope - Session 63 of User core. May 27 18:20:34.514038 sshd[6381]: Connection closed by 139.178.68.195 port 34178 May 27 18:20:34.514775 sshd-session[6379]: pam_unix(sshd:session): session closed for user core May 27 18:20:34.523304 systemd[1]: sshd@62-146.190.127.126:22-139.178.68.195:34178.service: Deactivated successfully. May 27 18:20:34.527503 systemd[1]: session-63.scope: Deactivated successfully. 
May 27 18:20:34.529842 systemd-logind[1520]: Session 63 logged out. Waiting for processes to exit. May 27 18:20:34.532914 systemd-logind[1520]: Removed session 63. May 27 18:20:34.792885 containerd[1543]: time="2025-05-27T18:20:34.792670662Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"719aa44321e35df1f8f61196e542def30b39cc291ae1b482ae6fbef343e8fd04\" pid:6406 exited_at:{seconds:1748370034 nanos:792018834}" May 27 18:20:34.923557 containerd[1543]: time="2025-05-27T18:20:34.923428089Z" level=warning msg="container event discarded" container=2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18 type=CONTAINER_CREATED_EVENT May 27 18:20:35.011135 containerd[1543]: time="2025-05-27T18:20:35.011024576Z" level=warning msg="container event discarded" container=2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18 type=CONTAINER_STARTED_EVENT May 27 18:20:35.122180 kubelet[2668]: E0527 18:20:35.121856 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:20:35.137733 containerd[1543]: time="2025-05-27T18:20:35.137620983Z" level=warning msg="container event discarded" container=2d5175992773f3f9c20c4c62c53b246bb2d06c65019f8a14e196609275ddfc18 type=CONTAINER_STOPPED_EVENT May 27 18:20:39.264060 containerd[1543]: 
time="2025-05-27T18:20:39.263938638Z" level=warning msg="container event discarded" container=11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3 type=CONTAINER_CREATED_EVENT May 27 18:20:39.365659 containerd[1543]: time="2025-05-27T18:20:39.365558298Z" level=warning msg="container event discarded" container=11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3 type=CONTAINER_STARTED_EVENT May 27 18:20:39.533340 systemd[1]: Started sshd@63-146.190.127.126:22-139.178.68.195:34180.service - OpenSSH per-connection server daemon (139.178.68.195:34180). May 27 18:20:39.622387 sshd[6423]: Accepted publickey for core from 139.178.68.195 port 34180 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:39.625755 sshd-session[6423]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:39.633817 systemd-logind[1520]: New session 64 of user core. May 27 18:20:39.637696 systemd[1]: Started session-64.scope - Session 64 of User core. May 27 18:20:39.896474 sshd[6425]: Connection closed by 139.178.68.195 port 34180 May 27 18:20:39.897797 sshd-session[6423]: pam_unix(sshd:session): session closed for user core May 27 18:20:39.903620 systemd[1]: sshd@63-146.190.127.126:22-139.178.68.195:34180.service: Deactivated successfully. May 27 18:20:39.907372 systemd[1]: session-64.scope: Deactivated successfully. May 27 18:20:39.909886 systemd-logind[1520]: Session 64 logged out. Waiting for processes to exit. May 27 18:20:39.914057 systemd-logind[1520]: Removed session 64. May 27 18:20:40.111497 containerd[1543]: time="2025-05-27T18:20:40.111312677Z" level=warning msg="container event discarded" container=11170eb582c7344789b0a9631283e4f3b467b2d5e3dd782191011ed7eb64f9c3 type=CONTAINER_STOPPED_EVENT May 27 18:20:44.923995 systemd[1]: Started sshd@64-146.190.127.126:22-139.178.68.195:57482.service - OpenSSH per-connection server daemon (139.178.68.195:57482). 
May 27 18:20:44.996720 sshd[6436]: Accepted publickey for core from 139.178.68.195 port 57482 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:44.999309 sshd-session[6436]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:45.007067 systemd-logind[1520]: New session 65 of user core. May 27 18:20:45.014871 systemd[1]: Started session-65.scope - Session 65 of User core. May 27 18:20:45.193807 sshd[6438]: Connection closed by 139.178.68.195 port 57482 May 27 18:20:45.195931 sshd-session[6436]: pam_unix(sshd:session): session closed for user core May 27 18:20:45.204097 systemd[1]: sshd@64-146.190.127.126:22-139.178.68.195:57482.service: Deactivated successfully. May 27 18:20:45.205360 systemd-logind[1520]: Session 65 logged out. Waiting for processes to exit. May 27 18:20:45.210213 systemd[1]: session-65.scope: Deactivated successfully. May 27 18:20:45.217050 systemd-logind[1520]: Removed session 65. May 27 18:20:47.114397 kubelet[2668]: E0527 18:20:47.114336 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:20:47.116930 kubelet[2668]: E0527 18:20:47.114693 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull 
and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:20:48.156810 containerd[1543]: time="2025-05-27T18:20:48.156692966Z" level=warning msg="container event discarded" container=cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06 type=CONTAINER_CREATED_EVENT May 27 18:20:48.470701 containerd[1543]: time="2025-05-27T18:20:48.470601959Z" level=warning msg="container event discarded" container=cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06 type=CONTAINER_STARTED_EVENT May 27 18:20:50.216651 systemd[1]: Started sshd@65-146.190.127.126:22-139.178.68.195:57498.service - OpenSSH per-connection server daemon (139.178.68.195:57498). May 27 18:20:50.291533 sshd[6452]: Accepted publickey for core from 139.178.68.195 port 57498 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:50.294888 sshd-session[6452]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:50.307309 systemd-logind[1520]: New session 66 of user core. 
May 27 18:20:50.313503 systemd[1]: Started session-66.scope - Session 66 of User core. May 27 18:20:50.476187 sshd[6454]: Connection closed by 139.178.68.195 port 57498 May 27 18:20:50.475287 sshd-session[6452]: pam_unix(sshd:session): session closed for user core May 27 18:20:50.483878 systemd[1]: sshd@65-146.190.127.126:22-139.178.68.195:57498.service: Deactivated successfully. May 27 18:20:50.488771 systemd[1]: session-66.scope: Deactivated successfully. May 27 18:20:50.490401 systemd-logind[1520]: Session 66 logged out. Waiting for processes to exit. May 27 18:20:50.493949 systemd-logind[1520]: Removed session 66. May 27 18:20:50.599161 containerd[1543]: time="2025-05-27T18:20:50.599023014Z" level=warning msg="container event discarded" container=a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc type=CONTAINER_CREATED_EVENT May 27 18:20:50.599161 containerd[1543]: time="2025-05-27T18:20:50.599116678Z" level=warning msg="container event discarded" container=a306e80fca4c042a3555a7d7a5817b89b9ed415d51437cda5d631d5fc19b6bdc type=CONTAINER_STARTED_EVENT May 27 18:20:51.559254 containerd[1543]: time="2025-05-27T18:20:51.559198698Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"ec037b03608771516c159e0ec5b0ed3f43b1ee4ad8a3d5d03e10c0b16aa3a689\" pid:6477 exited_at:{seconds:1748370051 nanos:558105333}" May 27 18:20:52.539553 containerd[1543]: time="2025-05-27T18:20:52.539357396Z" level=warning msg="container event discarded" container=43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c type=CONTAINER_CREATED_EVENT May 27 18:20:52.540353 containerd[1543]: time="2025-05-27T18:20:52.539879375Z" level=warning msg="container event discarded" container=43094d8a66a156cc4160890853ec29d99a1192256813a069cae64064d957968c type=CONTAINER_STARTED_EVENT May 27 18:20:53.648264 containerd[1543]: time="2025-05-27T18:20:53.648152396Z" level=warning msg="container event 
discarded" container=8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522 type=CONTAINER_CREATED_EVENT May 27 18:20:53.648264 containerd[1543]: time="2025-05-27T18:20:53.648228560Z" level=warning msg="container event discarded" container=8680c485c287a90ed6bb801f77b5435527da6a8d1979c3ac3a3ffa006b899522 type=CONTAINER_STARTED_EVENT May 27 18:20:53.692628 containerd[1543]: time="2025-05-27T18:20:53.692502216Z" level=warning msg="container event discarded" container=9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d type=CONTAINER_CREATED_EVENT May 27 18:20:53.692628 containerd[1543]: time="2025-05-27T18:20:53.692577055Z" level=warning msg="container event discarded" container=9a4f96b0211296ead37575ccced075a1d7868b4b0ad3a0e180207b49ea718c1d type=CONTAINER_STARTED_EVENT May 27 18:20:55.494289 systemd[1]: Started sshd@66-146.190.127.126:22-139.178.68.195:33782.service - OpenSSH per-connection server daemon (139.178.68.195:33782). May 27 18:20:55.664505 sshd[6490]: Accepted publickey for core from 139.178.68.195 port 33782 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:20:55.667752 sshd-session[6490]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:20:55.679100 systemd-logind[1520]: New session 67 of user core. May 27 18:20:55.689977 systemd[1]: Started session-67.scope - Session 67 of User core. 
May 27 18:20:56.057210 sshd[6492]: Connection closed by 139.178.68.195 port 33782 May 27 18:20:56.058762 sshd-session[6490]: pam_unix(sshd:session): session closed for user core May 27 18:20:56.059350 containerd[1543]: time="2025-05-27T18:20:56.059262551Z" level=warning msg="container event discarded" container=f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a type=CONTAINER_CREATED_EVENT May 27 18:20:56.059350 containerd[1543]: time="2025-05-27T18:20:56.059322962Z" level=warning msg="container event discarded" container=f683cd486b83ae7dae1d12979afa64e7fd2334ffc3307de27161e1db6ddd0e8a type=CONTAINER_STARTED_EVENT May 27 18:20:56.064445 systemd[1]: sshd@66-146.190.127.126:22-139.178.68.195:33782.service: Deactivated successfully. May 27 18:20:56.068379 systemd[1]: session-67.scope: Deactivated successfully. May 27 18:20:56.073940 systemd-logind[1520]: Session 67 logged out. Waiting for processes to exit. May 27 18:20:56.077074 systemd-logind[1520]: Removed session 67. May 27 18:20:56.181630 containerd[1543]: time="2025-05-27T18:20:56.181551665Z" level=warning msg="container event discarded" container=43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27 type=CONTAINER_CREATED_EVENT May 27 18:20:56.181937 containerd[1543]: time="2025-05-27T18:20:56.181877563Z" level=warning msg="container event discarded" container=43ad44df00f918c05327c6553e1402832cdc8a01bc0e56fdce0ff4e45abb2e27 type=CONTAINER_STARTED_EVENT May 27 18:20:56.505273 containerd[1543]: time="2025-05-27T18:20:56.505097189Z" level=warning msg="container event discarded" container=9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea type=CONTAINER_CREATED_EVENT May 27 18:20:56.521533 containerd[1543]: time="2025-05-27T18:20:56.521409269Z" level=warning msg="container event discarded" container=8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0 type=CONTAINER_CREATED_EVENT May 27 18:20:56.888895 containerd[1543]: time="2025-05-27T18:20:56.888654984Z" 
level=warning msg="container event discarded" container=8ab5edf4287172000ec17054ffb98a210ab0c768d835bf47dc7c5e13624685b0 type=CONTAINER_STARTED_EVENT May 27 18:20:56.909029 containerd[1543]: time="2025-05-27T18:20:56.908914850Z" level=warning msg="container event discarded" container=9b9a2f2a858037461ac95fac323f50112f544df1b464f28b7d4eaa0d7ceb6eea type=CONTAINER_STARTED_EVENT May 27 18:20:57.618141 containerd[1543]: time="2025-05-27T18:20:57.618038068Z" level=warning msg="container event discarded" container=1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c type=CONTAINER_CREATED_EVENT May 27 18:20:57.618141 containerd[1543]: time="2025-05-27T18:20:57.618100602Z" level=warning msg="container event discarded" container=1ad2b5906a62aa7bdfa0ac3230bf3496afebb4a9b9fb3edcfda189a4f3edd83c type=CONTAINER_STARTED_EVENT May 27 18:20:57.645783 containerd[1543]: time="2025-05-27T18:20:57.645656362Z" level=warning msg="container event discarded" container=43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2 type=CONTAINER_CREATED_EVENT May 27 18:20:57.645783 containerd[1543]: time="2025-05-27T18:20:57.645740596Z" level=warning msg="container event discarded" container=43f9a14017dfdaeadb1a5459031f676c6da12d22a3828519a7e67611107d2ef2 type=CONTAINER_STARTED_EVENT May 27 18:20:57.921338 containerd[1543]: time="2025-05-27T18:20:57.921108382Z" level=warning msg="container event discarded" container=cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f type=CONTAINER_CREATED_EVENT May 27 18:20:58.100897 containerd[1543]: time="2025-05-27T18:20:58.100817703Z" level=warning msg="container event discarded" container=cdfb3bccce12a0499fb6fa3a3191dbd978029abb637d93d8fae386dc1634af8f type=CONTAINER_STARTED_EVENT May 27 18:20:58.377194 containerd[1543]: time="2025-05-27T18:20:58.377058720Z" level=warning msg="container event discarded" container=82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73 type=CONTAINER_CREATED_EVENT May 27 
18:20:58.531674 containerd[1543]: time="2025-05-27T18:20:58.531587183Z" level=warning msg="container event discarded" container=82e586de711c2e869bacbb33310c373224a0c52c76be103134dfc10299091f73 type=CONTAINER_STARTED_EVENT May 27 18:20:59.113241 kubelet[2668]: E0527 18:20:59.112674 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:21:00.118562 kubelet[2668]: E0527 18:21:00.118478 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected 
status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:21:00.612833 containerd[1543]: time="2025-05-27T18:21:00.612742204Z" level=warning msg="container event discarded" container=5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef type=CONTAINER_CREATED_EVENT May 27 18:21:00.766735 containerd[1543]: time="2025-05-27T18:21:00.766632178Z" level=warning msg="container event discarded" container=5b4c85a622d6a433a09285e3277d549dc725c44b3f3df8210a5e0b43601d36ef type=CONTAINER_STARTED_EVENT May 27 18:21:01.077991 systemd[1]: Started sshd@67-146.190.127.126:22-139.178.68.195:33784.service - OpenSSH per-connection server daemon (139.178.68.195:33784). May 27 18:21:01.160050 sshd[6504]: Accepted publickey for core from 139.178.68.195 port 33784 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:01.163106 sshd-session[6504]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:01.172570 systemd-logind[1520]: New session 68 of user core. May 27 18:21:01.177889 systemd[1]: Started session-68.scope - Session 68 of User core. May 27 18:21:01.370061 sshd[6506]: Connection closed by 139.178.68.195 port 33784 May 27 18:21:01.371219 sshd-session[6504]: pam_unix(sshd:session): session closed for user core May 27 18:21:01.381366 systemd[1]: sshd@67-146.190.127.126:22-139.178.68.195:33784.service: Deactivated successfully. May 27 18:21:01.388581 systemd[1]: session-68.scope: Deactivated successfully. May 27 18:21:01.395558 systemd-logind[1520]: Session 68 logged out. Waiting for processes to exit. May 27 18:21:01.399748 systemd-logind[1520]: Removed session 68. 
May 27 18:21:04.472024 containerd[1543]: time="2025-05-27T18:21:04.471909633Z" level=warning msg="container event discarded" container=42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a type=CONTAINER_CREATED_EVENT May 27 18:21:04.630697 containerd[1543]: time="2025-05-27T18:21:04.630585932Z" level=warning msg="container event discarded" container=42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a type=CONTAINER_STARTED_EVENT May 27 18:21:04.799647 containerd[1543]: time="2025-05-27T18:21:04.799493556Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"cd8d60ae3cd4710ea51f77391b5af94a0f85fd829ca45d371c8ad8b591559627\" pid:6529 exited_at:{seconds:1748370064 nanos:799125540}" May 27 18:21:06.342594 containerd[1543]: time="2025-05-27T18:21:06.342493945Z" level=warning msg="container event discarded" container=0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f type=CONTAINER_CREATED_EVENT May 27 18:21:06.392801 systemd[1]: Started sshd@68-146.190.127.126:22-139.178.68.195:60408.service - OpenSSH per-connection server daemon (139.178.68.195:60408). May 27 18:21:06.461580 sshd[6539]: Accepted publickey for core from 139.178.68.195 port 60408 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:06.465111 sshd-session[6539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:06.472810 systemd-logind[1520]: New session 69 of user core. May 27 18:21:06.478793 systemd[1]: Started session-69.scope - Session 69 of User core. 
May 27 18:21:06.482615 containerd[1543]: time="2025-05-27T18:21:06.482556379Z" level=warning msg="container event discarded" container=0583a1392296f68bd00d482da14ae8a2fa32026acdfbf34f97fe4edccdab395f type=CONTAINER_STARTED_EVENT May 27 18:21:06.669565 sshd[6541]: Connection closed by 139.178.68.195 port 60408 May 27 18:21:06.672966 sshd-session[6539]: pam_unix(sshd:session): session closed for user core May 27 18:21:06.679777 systemd[1]: sshd@68-146.190.127.126:22-139.178.68.195:60408.service: Deactivated successfully. May 27 18:21:06.684594 systemd[1]: session-69.scope: Deactivated successfully. May 27 18:21:06.689142 systemd-logind[1520]: Session 69 logged out. Waiting for processes to exit. May 27 18:21:06.691680 systemd-logind[1520]: Removed session 69. May 27 18:21:11.114397 kubelet[2668]: E0527 18:21:11.113645 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:21:11.694013 systemd[1]: Started sshd@69-146.190.127.126:22-139.178.68.195:60418.service - OpenSSH per-connection server daemon (139.178.68.195:60418). 
May 27 18:21:11.770572 sshd[6555]: Accepted publickey for core from 139.178.68.195 port 60418 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:11.773253 sshd-session[6555]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:11.781212 systemd-logind[1520]: New session 70 of user core. May 27 18:21:11.791917 systemd[1]: Started session-70.scope - Session 70 of User core. May 27 18:21:12.127619 sshd[6558]: Connection closed by 139.178.68.195 port 60418 May 27 18:21:12.127886 sshd-session[6555]: pam_unix(sshd:session): session closed for user core May 27 18:21:12.137228 kubelet[2668]: E0527 18:21:12.135141 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:21:12.138214 systemd[1]: sshd@69-146.190.127.126:22-139.178.68.195:60418.service: Deactivated 
successfully. May 27 18:21:12.143005 systemd[1]: session-70.scope: Deactivated successfully. May 27 18:21:12.149064 systemd-logind[1520]: Session 70 logged out. Waiting for processes to exit. May 27 18:21:12.152536 systemd-logind[1520]: Removed session 70. May 27 18:21:13.111906 kubelet[2668]: E0527 18:21:13.111610 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:21:13.111906 kubelet[2668]: E0527 18:21:13.111829 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:21:17.151662 systemd[1]: Started sshd@70-146.190.127.126:22-139.178.68.195:53156.service - OpenSSH per-connection server daemon (139.178.68.195:53156). May 27 18:21:17.291276 sshd[6580]: Accepted publickey for core from 139.178.68.195 port 53156 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:17.293772 sshd-session[6580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:17.301969 systemd-logind[1520]: New session 71 of user core. May 27 18:21:17.309779 systemd[1]: Started session-71.scope - Session 71 of User core. May 27 18:21:17.557661 sshd[6582]: Connection closed by 139.178.68.195 port 53156 May 27 18:21:17.558888 sshd-session[6580]: pam_unix(sshd:session): session closed for user core May 27 18:21:17.565605 systemd[1]: sshd@70-146.190.127.126:22-139.178.68.195:53156.service: Deactivated successfully. May 27 18:21:17.569800 systemd[1]: session-71.scope: Deactivated successfully. May 27 18:21:17.573133 systemd-logind[1520]: Session 71 logged out. Waiting for processes to exit. May 27 18:21:17.575429 systemd-logind[1520]: Removed session 71. 
May 27 18:21:21.112181 kubelet[2668]: E0527 18:21:21.112031 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:21:21.570813 containerd[1543]: time="2025-05-27T18:21:21.570734613Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"fbd716759a4641fbb6d339112647ddbf671420a5d73fbce31bfd42a76ad45c5f\" pid:6607 exited_at:{seconds:1748370081 nanos:569794380}" May 27 18:21:22.580163 systemd[1]: Started sshd@71-146.190.127.126:22-139.178.68.195:53166.service - OpenSSH per-connection server daemon (139.178.68.195:53166). May 27 18:21:22.685488 sshd[6620]: Accepted publickey for core from 139.178.68.195 port 53166 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:22.688106 sshd-session[6620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:22.695544 systemd-logind[1520]: New session 72 of user core. May 27 18:21:22.704112 systemd[1]: Started session-72.scope - Session 72 of User core. May 27 18:21:23.131893 sshd[6622]: Connection closed by 139.178.68.195 port 53166 May 27 18:21:23.132699 sshd-session[6620]: pam_unix(sshd:session): session closed for user core May 27 18:21:23.141689 systemd[1]: sshd@71-146.190.127.126:22-139.178.68.195:53166.service: Deactivated successfully. May 27 18:21:23.145872 systemd[1]: session-72.scope: Deactivated successfully. May 27 18:21:23.147477 systemd-logind[1520]: Session 72 logged out. Waiting for processes to exit. May 27 18:21:23.150678 systemd-logind[1520]: Removed session 72. 
May 27 18:21:25.116966 containerd[1543]: time="2025-05-27T18:21:25.116396128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 18:21:25.339321 containerd[1543]: time="2025-05-27T18:21:25.339216123Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:21:25.340420 containerd[1543]: time="2025-05-27T18:21:25.340334153Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 18:21:25.340420 containerd[1543]: time="2025-05-27T18:21:25.340308309Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:21:25.345805 kubelet[2668]: E0527 18:21:25.345600 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:21:25.347533 kubelet[2668]: E0527 18:21:25.346601 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 18:21:25.349192 kubelet[2668]: E0527 18:21:25.349086 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:b58c9c6d236b46b38836eca7a8a2bb57,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull 
and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:21:25.352024 containerd[1543]: time="2025-05-27T18:21:25.351968073Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 18:21:25.638740 containerd[1543]: time="2025-05-27T18:21:25.638649700Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 18:21:25.639587 containerd[1543]: time="2025-05-27T18:21:25.639509894Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 18:21:25.639895 containerd[1543]: time="2025-05-27T18:21:25.639543252Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 18:21:25.640412 kubelet[2668]: E0527 18:21:25.639856 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:21:25.640412 kubelet[2668]: E0527 18:21:25.639919 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 18:21:25.640412 kubelet[2668]: E0527 18:21:25.640086 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4mn6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7989d9b698-55wz2_calico-system(461386c4-2f9e-4c80-a2ab-7b0260259077): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 18:21:25.641383 kubelet[2668]: E0527 18:21:25.641316 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:21:26.111826 kubelet[2668]: E0527 18:21:26.111648 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:21:26.115990 kubelet[2668]: E0527 18:21:26.115917 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:21:28.112249 kubelet[2668]: E0527 18:21:28.112108 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:21:28.150785 systemd[1]: Started sshd@72-146.190.127.126:22-139.178.68.195:39276.service - OpenSSH per-connection server daemon (139.178.68.195:39276). 
May 27 18:21:28.226917 sshd[6633]: Accepted publickey for core from 139.178.68.195 port 39276 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:28.229540 sshd-session[6633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:28.237877 systemd-logind[1520]: New session 73 of user core. May 27 18:21:28.251903 systemd[1]: Started session-73.scope - Session 73 of User core. May 27 18:21:28.435201 sshd[6635]: Connection closed by 139.178.68.195 port 39276 May 27 18:21:28.436210 sshd-session[6633]: pam_unix(sshd:session): session closed for user core May 27 18:21:28.444664 systemd[1]: sshd@72-146.190.127.126:22-139.178.68.195:39276.service: Deactivated successfully. May 27 18:21:28.450009 systemd[1]: session-73.scope: Deactivated successfully. May 27 18:21:28.452591 systemd-logind[1520]: Session 73 logged out. Waiting for processes to exit. May 27 18:21:28.455747 systemd-logind[1520]: Removed session 73. May 27 18:21:32.020812 containerd[1543]: time="2025-05-27T18:21:32.020723058Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"23fa06d1cd06815d720ce79d3e596e7c28792a94d00091110815c3c81b952497\" pid:6660 exited_at:{seconds:1748370092 nanos:20256167}" May 27 18:21:33.457865 systemd[1]: Started sshd@73-146.190.127.126:22-139.178.68.195:39284.service - OpenSSH per-connection server daemon (139.178.68.195:39284). May 27 18:21:33.555108 sshd[6671]: Accepted publickey for core from 139.178.68.195 port 39284 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:33.558141 sshd-session[6671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:33.568412 systemd-logind[1520]: New session 74 of user core. May 27 18:21:33.576752 systemd[1]: Started session-74.scope - Session 74 of User core. 
May 27 18:21:33.770868 sshd[6673]: Connection closed by 139.178.68.195 port 39284 May 27 18:21:33.770138 sshd-session[6671]: pam_unix(sshd:session): session closed for user core May 27 18:21:33.779427 systemd[1]: sshd@73-146.190.127.126:22-139.178.68.195:39284.service: Deactivated successfully. May 27 18:21:33.783128 systemd[1]: session-74.scope: Deactivated successfully. May 27 18:21:33.786399 systemd-logind[1520]: Session 74 logged out. Waiting for processes to exit. May 27 18:21:33.789154 systemd-logind[1520]: Removed session 74. May 27 18:21:34.111659 kubelet[2668]: E0527 18:21:34.111238 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:21:34.804961 containerd[1543]: time="2025-05-27T18:21:34.804905312Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"77584755164868c16d9d5eed3c5d0f06b631556121ffdccc178baacea22cf4fc\" pid:6696 exited_at:{seconds:1748370094 nanos:804322588}" May 27 18:21:38.115802 kubelet[2668]: E0527 18:21:38.115737 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed" May 27 18:21:38.790828 systemd[1]: Started sshd@74-146.190.127.126:22-139.178.68.195:57782.service - OpenSSH per-connection server daemon 
(139.178.68.195:57782). May 27 18:21:38.872558 sshd[6707]: Accepted publickey for core from 139.178.68.195 port 57782 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:38.874936 sshd-session[6707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:38.883566 systemd-logind[1520]: New session 75 of user core. May 27 18:21:38.896802 systemd[1]: Started session-75.scope - Session 75 of User core. May 27 18:21:39.084308 sshd[6709]: Connection closed by 139.178.68.195 port 57782 May 27 18:21:39.085509 sshd-session[6707]: pam_unix(sshd:session): session closed for user core May 27 18:21:39.094302 systemd[1]: sshd@74-146.190.127.126:22-139.178.68.195:57782.service: Deactivated successfully. May 27 18:21:39.098508 systemd[1]: session-75.scope: Deactivated successfully. May 27 18:21:39.101784 systemd-logind[1520]: Session 75 logged out. Waiting for processes to exit. May 27 18:21:39.105089 systemd-logind[1520]: Removed session 75. May 27 18:21:39.111222 kubelet[2668]: E0527 18:21:39.111139 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2" May 27 18:21:40.120153 kubelet[2668]: E0527 18:21:40.119852 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077" May 27 18:21:44.100735 systemd[1]: Started sshd@75-146.190.127.126:22-139.178.68.195:55386.service - OpenSSH per-connection server daemon (139.178.68.195:55386). May 27 18:21:44.178847 sshd[6720]: Accepted publickey for core from 139.178.68.195 port 55386 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY May 27 18:21:44.181418 sshd-session[6720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 18:21:44.190754 systemd-logind[1520]: New session 76 of user core. May 27 18:21:44.199818 systemd[1]: Started session-76.scope - Session 76 of User core. May 27 18:21:44.360278 sshd[6722]: Connection closed by 139.178.68.195 port 55386 May 27 18:21:44.361104 sshd-session[6720]: pam_unix(sshd:session): session closed for user core May 27 18:21:44.367915 systemd[1]: sshd@75-146.190.127.126:22-139.178.68.195:55386.service: Deactivated successfully. May 27 18:21:44.372079 systemd[1]: session-76.scope: Deactivated successfully. May 27 18:21:44.373859 systemd-logind[1520]: Session 76 logged out. Waiting for processes to exit. May 27 18:21:44.377883 systemd-logind[1520]: Removed session 76. May 27 18:21:49.389958 systemd[1]: Started sshd@76-146.190.127.126:22-139.178.68.195:55392.service - OpenSSH per-connection server daemon (139.178.68.195:55392). 
May 27 18:21:49.458317 sshd[6735]: Accepted publickey for core from 139.178.68.195 port 55392 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:21:49.460260 sshd-session[6735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:21:49.468374 systemd-logind[1520]: New session 77 of user core.
May 27 18:21:49.472722 systemd[1]: Started session-77.scope - Session 77 of User core.
May 27 18:21:49.638082 sshd[6737]: Connection closed by 139.178.68.195 port 55392
May 27 18:21:49.639331 sshd-session[6735]: pam_unix(sshd:session): session closed for user core
May 27 18:21:49.646307 systemd-logind[1520]: Session 77 logged out. Waiting for processes to exit.
May 27 18:21:49.647958 systemd[1]: sshd@76-146.190.127.126:22-139.178.68.195:55392.service: Deactivated successfully.
May 27 18:21:49.652192 systemd[1]: session-77.scope: Deactivated successfully.
May 27 18:21:49.658686 systemd-logind[1520]: Removed session 77.
May 27 18:21:51.576901 containerd[1543]: time="2025-05-27T18:21:51.576839777Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"412f714d3f3b06e28f160f60f77c665700cf7a6de2e9f8fb5fba4fd7bb4a7a93\" pid:6760 exited_at:{seconds:1748370111 nanos:575968340}"
May 27 18:21:53.115961 containerd[1543]: time="2025-05-27T18:21:53.115397471Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 18:21:53.118370 kubelet[2668]: E0527 18:21:53.118171 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:21:53.452229 containerd[1543]: time="2025-05-27T18:21:53.452062559Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 18:21:53.453110 containerd[1543]: time="2025-05-27T18:21:53.453054083Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 18:21:53.453647 containerd[1543]: time="2025-05-27T18:21:53.453062187Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 18:21:53.454160 kubelet[2668]: E0527 18:21:53.453371 2668 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:21:53.454160 kubelet[2668]: E0527 18:21:53.453605 2668 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 18:21:53.454160 kubelet[2668]: E0527 18:21:53.453864 2668 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dctn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-hqs5q_calico-system(312bbe06-8b56-48c8-bd2e-4f07049cb4ed): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 18:21:53.455622 kubelet[2668]: E0527 18:21:53.455538 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:21:54.663199 systemd[1]: Started sshd@77-146.190.127.126:22-139.178.68.195:42226.service - OpenSSH per-connection server daemon (139.178.68.195:42226).
May 27 18:21:54.743208 sshd[6773]: Accepted publickey for core from 139.178.68.195 port 42226 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:21:54.745635 sshd-session[6773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:21:54.753237 systemd-logind[1520]: New session 78 of user core.
May 27 18:21:54.763838 systemd[1]: Started session-78.scope - Session 78 of User core.
May 27 18:21:54.926838 sshd[6775]: Connection closed by 139.178.68.195 port 42226
May 27 18:21:54.927916 sshd-session[6773]: pam_unix(sshd:session): session closed for user core
May 27 18:21:54.936159 systemd[1]: sshd@77-146.190.127.126:22-139.178.68.195:42226.service: Deactivated successfully.
May 27 18:21:54.940102 systemd[1]: session-78.scope: Deactivated successfully.
May 27 18:21:54.943020 systemd-logind[1520]: Session 78 logged out. Waiting for processes to exit.
May 27 18:21:54.945812 systemd-logind[1520]: Removed session 78.
May 27 18:21:59.952082 systemd[1]: Started sshd@78-146.190.127.126:22-139.178.68.195:42230.service - OpenSSH per-connection server daemon (139.178.68.195:42230).
May 27 18:22:00.050173 sshd[6787]: Accepted publickey for core from 139.178.68.195 port 42230 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:00.054152 sshd-session[6787]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:00.079443 systemd-logind[1520]: New session 79 of user core.
May 27 18:22:00.084961 systemd[1]: Started session-79.scope - Session 79 of User core.
May 27 18:22:00.290245 sshd[6789]: Connection closed by 139.178.68.195 port 42230
May 27 18:22:00.291919 sshd-session[6787]: pam_unix(sshd:session): session closed for user core
May 27 18:22:00.302389 systemd[1]: sshd@78-146.190.127.126:22-139.178.68.195:42230.service: Deactivated successfully.
May 27 18:22:00.308077 systemd[1]: session-79.scope: Deactivated successfully.
May 27 18:22:00.311347 systemd-logind[1520]: Session 79 logged out. Waiting for processes to exit.
May 27 18:22:00.316213 systemd-logind[1520]: Removed session 79.
May 27 18:22:04.795261 containerd[1543]: time="2025-05-27T18:22:04.795139155Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"7506ff1fe3d4611323b3f3ec357ee385bdddf20b4c717f72917d5f06e7dc7259\" pid:6825 exited_at:{seconds:1748370124 nanos:794600854}"
May 27 18:22:05.308032 systemd[1]: Started sshd@79-146.190.127.126:22-139.178.68.195:40860.service - OpenSSH per-connection server daemon (139.178.68.195:40860).
May 27 18:22:05.376507 sshd[6835]: Accepted publickey for core from 139.178.68.195 port 40860 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:05.379701 sshd-session[6835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:05.387578 systemd-logind[1520]: New session 80 of user core.
May 27 18:22:05.393514 systemd[1]: Started session-80.scope - Session 80 of User core.
May 27 18:22:05.540472 sshd[6837]: Connection closed by 139.178.68.195 port 40860
May 27 18:22:05.541202 sshd-session[6835]: pam_unix(sshd:session): session closed for user core
May 27 18:22:05.548540 systemd[1]: sshd@79-146.190.127.126:22-139.178.68.195:40860.service: Deactivated successfully.
May 27 18:22:05.552312 systemd[1]: session-80.scope: Deactivated successfully.
May 27 18:22:05.554062 systemd-logind[1520]: Session 80 logged out. Waiting for processes to exit.
May 27 18:22:05.556088 systemd-logind[1520]: Removed session 80.
May 27 18:22:07.113684 kubelet[2668]: E0527 18:22:07.113613 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:22:08.116182 kubelet[2668]: E0527 18:22:08.116097 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:22:10.560144 systemd[1]: Started sshd@80-146.190.127.126:22-139.178.68.195:40864.service - OpenSSH per-connection server daemon (139.178.68.195:40864).
May 27 18:22:10.638714 sshd[6851]: Accepted publickey for core from 139.178.68.195 port 40864 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:10.641043 sshd-session[6851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:10.649986 systemd-logind[1520]: New session 81 of user core.
May 27 18:22:10.659755 systemd[1]: Started session-81.scope - Session 81 of User core.
May 27 18:22:10.820390 sshd[6853]: Connection closed by 139.178.68.195 port 40864
May 27 18:22:10.821971 sshd-session[6851]: pam_unix(sshd:session): session closed for user core
May 27 18:22:10.828717 systemd[1]: sshd@80-146.190.127.126:22-139.178.68.195:40864.service: Deactivated successfully.
May 27 18:22:10.835046 systemd[1]: session-81.scope: Deactivated successfully.
May 27 18:22:10.837245 systemd-logind[1520]: Session 81 logged out. Waiting for processes to exit.
May 27 18:22:10.841491 systemd-logind[1520]: Removed session 81.
May 27 18:22:15.840643 systemd[1]: Started sshd@81-146.190.127.126:22-139.178.68.195:42516.service - OpenSSH per-connection server daemon (139.178.68.195:42516).
May 27 18:22:15.911053 sshd[6874]: Accepted publickey for core from 139.178.68.195 port 42516 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:15.913239 sshd-session[6874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:15.919796 systemd-logind[1520]: New session 82 of user core.
May 27 18:22:15.926852 systemd[1]: Started session-82.scope - Session 82 of User core.
May 27 18:22:16.080407 sshd[6876]: Connection closed by 139.178.68.195 port 42516
May 27 18:22:16.081794 sshd-session[6874]: pam_unix(sshd:session): session closed for user core
May 27 18:22:16.087392 systemd[1]: sshd@81-146.190.127.126:22-139.178.68.195:42516.service: Deactivated successfully.
May 27 18:22:16.091309 systemd[1]: session-82.scope: Deactivated successfully.
May 27 18:22:16.093050 systemd-logind[1520]: Session 82 logged out. Waiting for processes to exit.
May 27 18:22:16.095501 systemd-logind[1520]: Removed session 82.
May 27 18:22:19.113663 kubelet[2668]: E0527 18:22:19.113415 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:22:21.100691 systemd[1]: Started sshd@82-146.190.127.126:22-139.178.68.195:42532.service - OpenSSH per-connection server daemon (139.178.68.195:42532).
May 27 18:22:21.175519 sshd[6888]: Accepted publickey for core from 139.178.68.195 port 42532 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:21.178945 sshd-session[6888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:21.186871 systemd-logind[1520]: New session 83 of user core.
May 27 18:22:21.194808 systemd[1]: Started session-83.scope - Session 83 of User core.
May 27 18:22:21.380920 sshd[6890]: Connection closed by 139.178.68.195 port 42532
May 27 18:22:21.381876 sshd-session[6888]: pam_unix(sshd:session): session closed for user core
May 27 18:22:21.393542 systemd[1]: sshd@82-146.190.127.126:22-139.178.68.195:42532.service: Deactivated successfully.
May 27 18:22:21.398240 systemd[1]: session-83.scope: Deactivated successfully.
May 27 18:22:21.400662 systemd-logind[1520]: Session 83 logged out. Waiting for processes to exit.
May 27 18:22:21.404956 systemd-logind[1520]: Removed session 83.
May 27 18:22:21.562402 containerd[1543]: time="2025-05-27T18:22:21.560417148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cfef13c479386db1499369db02a1c6f1fb36ea37285d4077a26db8d97d74af06\" id:\"b0a40ae374ca60515fa95a65ed882942af5ec352631870a6f5224bf46d3fba71\" pid:6914 exited_at:{seconds:1748370141 nanos:559687004}"
May 27 18:22:22.115578 kubelet[2668]: E0527 18:22:22.115210 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:22:26.400202 systemd[1]: Started sshd@83-146.190.127.126:22-139.178.68.195:59662.service - OpenSSH per-connection server daemon (139.178.68.195:59662).
May 27 18:22:26.482744 sshd[6926]: Accepted publickey for core from 139.178.68.195 port 59662 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:26.484668 sshd-session[6926]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:26.494347 systemd-logind[1520]: New session 84 of user core.
May 27 18:22:26.502763 systemd[1]: Started session-84.scope - Session 84 of User core.
May 27 18:22:26.649722 sshd[6928]: Connection closed by 139.178.68.195 port 59662
May 27 18:22:26.650996 sshd-session[6926]: pam_unix(sshd:session): session closed for user core
May 27 18:22:26.659986 systemd[1]: sshd@83-146.190.127.126:22-139.178.68.195:59662.service: Deactivated successfully.
May 27 18:22:26.663696 systemd[1]: session-84.scope: Deactivated successfully.
May 27 18:22:26.666328 systemd-logind[1520]: Session 84 logged out. Waiting for processes to exit.
May 27 18:22:26.668961 systemd-logind[1520]: Removed session 84.
May 27 18:22:31.111504 kubelet[2668]: E0527 18:22:31.111376 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:22:31.669706 systemd[1]: Started sshd@84-146.190.127.126:22-139.178.68.195:59668.service - OpenSSH per-connection server daemon (139.178.68.195:59668).
May 27 18:22:31.735478 sshd[6940]: Accepted publickey for core from 139.178.68.195 port 59668 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:31.737601 sshd-session[6940]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:31.744160 systemd-logind[1520]: New session 85 of user core.
May 27 18:22:31.752759 systemd[1]: Started session-85.scope - Session 85 of User core.
May 27 18:22:31.907547 sshd[6942]: Connection closed by 139.178.68.195 port 59668
May 27 18:22:31.910671 sshd-session[6940]: pam_unix(sshd:session): session closed for user core
May 27 18:22:31.916535 systemd[1]: sshd@84-146.190.127.126:22-139.178.68.195:59668.service: Deactivated successfully.
May 27 18:22:31.920078 systemd[1]: session-85.scope: Deactivated successfully.
May 27 18:22:31.923179 systemd-logind[1520]: Session 85 logged out. Waiting for processes to exit.
May 27 18:22:31.925802 systemd-logind[1520]: Removed session 85.
May 27 18:22:32.023126 containerd[1543]: time="2025-05-27T18:22:32.022869950Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"1fa40c4ac3c2f9d9483a14bb95a6eebb5446f6eb044eb04d2a2f9361d3a30b09\" pid:6964 exited_at:{seconds:1748370152 nanos:21910553}"
May 27 18:22:33.113517 kubelet[2668]: E0527 18:22:33.113308 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:22:33.115060 kubelet[2668]: E0527 18:22:33.114979 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7989d9b698-55wz2" podUID="461386c4-2f9e-4c80-a2ab-7b0260259077"
May 27 18:22:34.113369 kubelet[2668]: E0527 18:22:34.113068 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-hqs5q" podUID="312bbe06-8b56-48c8-bd2e-4f07049cb4ed"
May 27 18:22:34.805210 containerd[1543]: time="2025-05-27T18:22:34.805151465Z" level=info msg="TaskExit event in podsandbox handler container_id:\"42963c6a800eeb5920b985546148e121e7547be360de37269e45c118688a457a\" id:\"6558f96a3a00bf1ce377362fbfc8e0473a728b64739879f6fc86ad05698a184f\" pid:6986 exited_at:{seconds:1748370154 nanos:804300814}"
May 27 18:22:36.934015 systemd[1]: Started sshd@85-146.190.127.126:22-139.178.68.195:54850.service - OpenSSH per-connection server daemon (139.178.68.195:54850).
May 27 18:22:37.007091 sshd[6995]: Accepted publickey for core from 139.178.68.195 port 54850 ssh2: RSA SHA256:4XUDqK0eZl9/JoHWa9cgZT5JQIr/TJd1ha4IPbi4WlY
May 27 18:22:37.009957 sshd-session[6995]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 18:22:37.016528 systemd-logind[1520]: New session 86 of user core.
May 27 18:22:37.022785 systemd[1]: Started session-86.scope - Session 86 of User core.
May 27 18:22:37.111342 kubelet[2668]: E0527 18:22:37.111278 2668 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 67.207.67.2 67.207.67.3 67.207.67.2"
May 27 18:22:37.193593 sshd[6997]: Connection closed by 139.178.68.195 port 54850
May 27 18:22:37.194297 sshd-session[6995]: pam_unix(sshd:session): session closed for user core
May 27 18:22:37.202672 systemd[1]: sshd@85-146.190.127.126:22-139.178.68.195:54850.service: Deactivated successfully.
May 27 18:22:37.207139 systemd[1]: session-86.scope: Deactivated successfully.
May 27 18:22:37.208881 systemd-logind[1520]: Session 86 logged out. Waiting for processes to exit.
May 27 18:22:37.211028 systemd-logind[1520]: Removed session 86.