May 27 17:46:52.774509 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 27 15:32:02 -00 2025 May 27 17:46:52.774596 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101 May 27 17:46:52.774605 kernel: BIOS-provided physical RAM map: May 27 17:46:52.774610 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 27 17:46:52.774615 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 27 17:46:52.774619 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 27 17:46:52.774626 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable May 27 17:46:52.774631 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved May 27 17:46:52.774636 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved May 27 17:46:52.774641 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved May 27 17:46:52.774657 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 27 17:46:52.774662 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 27 17:46:52.774667 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 27 17:46:52.774672 kernel: NX (Execute Disable) protection: active May 27 17:46:52.774679 kernel: APIC: Static calls initialized May 27 17:46:52.774684 kernel: SMBIOS 3.0.0 present. 
May 27 17:46:52.774690 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017 May 27 17:46:52.774695 kernel: DMI: Memory slots populated: 1/1 May 27 17:46:52.774700 kernel: Hypervisor detected: KVM May 27 17:46:52.774705 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 27 17:46:52.774710 kernel: kvm-clock: using sched offset of 4051691184 cycles May 27 17:46:52.774715 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 27 17:46:52.774722 kernel: tsc: Detected 2445.404 MHz processor May 27 17:46:52.774727 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 27 17:46:52.774733 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 27 17:46:52.774738 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000 May 27 17:46:52.774744 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 27 17:46:52.774749 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 27 17:46:52.774754 kernel: Using GB pages for direct mapping May 27 17:46:52.774759 kernel: ACPI: Early table checksum verification disabled May 27 17:46:52.774765 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS ) May 27 17:46:52.774771 kernel: ACPI: RSDT 0x000000007CFE265D 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:46:52.774780 kernel: ACPI: FACP 0x000000007CFE244D 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:46:52.774790 kernel: ACPI: DSDT 0x000000007CFE0040 00240D (v01 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:46:52.774799 kernel: ACPI: FACS 0x000000007CFE0000 000040 May 27 17:46:52.774809 kernel: ACPI: APIC 0x000000007CFE2541 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:46:52.774819 kernel: ACPI: HPET 0x000000007CFE25C1 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:46:52.774828 kernel: ACPI: MCFG 0x000000007CFE25F9 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:46:52.774833 kernel: ACPI: WAET 0x000000007CFE2635 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 27 17:46:52.774839 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe244d-0x7cfe2540] May 27 17:46:52.774848 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe244c] May 27 17:46:52.774853 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f] May 27 17:46:52.774860 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2541-0x7cfe25c0] May 27 17:46:52.774870 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25c1-0x7cfe25f8] May 27 17:46:52.774881 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe25f9-0x7cfe2634] May 27 17:46:52.774891 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe2635-0x7cfe265c] May 27 17:46:52.774897 kernel: No NUMA configuration found May 27 17:46:52.774902 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff] May 27 17:46:52.774908 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff] May 27 17:46:52.774913 kernel: Zone ranges: May 27 17:46:52.774919 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 27 17:46:52.774924 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff] May 27 17:46:52.774930 kernel: Normal empty May 27 17:46:52.774936 kernel: Device empty May 27 17:46:52.774941 kernel: Movable zone start for each node May 27 17:46:52.774947 kernel: Early memory node ranges May 27 17:46:52.774953 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 27 17:46:52.774958 kernel: node 0: [mem 
0x0000000000100000-0x000000007cfdbfff] May 27 17:46:52.774964 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff] May 27 17:46:52.774969 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 27 17:46:52.774974 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 27 17:46:52.774980 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges May 27 17:46:52.774985 kernel: ACPI: PM-Timer IO Port: 0x608 May 27 17:46:52.774991 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 27 17:46:52.774997 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 27 17:46:52.775002 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 27 17:46:52.775008 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 27 17:46:52.775013 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 27 17:46:52.775019 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 27 17:46:52.775024 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 27 17:46:52.775030 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 27 17:46:52.775035 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 May 27 17:46:52.775041 kernel: CPU topo: Max. logical packages: 1 May 27 17:46:52.775047 kernel: CPU topo: Max. logical dies: 1 May 27 17:46:52.775052 kernel: CPU topo: Max. dies per package: 1 May 27 17:46:52.775058 kernel: CPU topo: Max. threads per core: 1 May 27 17:46:52.775063 kernel: CPU topo: Num. cores per package: 2 May 27 17:46:52.775069 kernel: CPU topo: Num. threads per package: 2 May 27 17:46:52.775074 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs May 27 17:46:52.775080 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 27 17:46:52.775085 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices May 27 17:46:52.775091 kernel: Booting paravirtualized kernel on KVM May 27 17:46:52.775097 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 27 17:46:52.775103 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 27 17:46:52.775109 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 May 27 17:46:52.775114 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 May 27 17:46:52.775119 kernel: pcpu-alloc: [0] 0 1 May 27 17:46:52.775125 kernel: kvm-guest: PV spinlocks disabled, no host support May 27 17:46:52.775131 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101 May 27 17:46:52.775137 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 27 17:46:52.775143 kernel: random: crng init done May 27 17:46:52.775149 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 27 17:46:52.775154 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) May 27 17:46:52.775160 kernel: Fallback order for Node 0: 0 May 27 17:46:52.775165 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 511866 May 27 17:46:52.775171 kernel: Policy zone: DMA32 May 27 17:46:52.775176 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 27 17:46:52.775182 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 27 17:46:52.775187 kernel: ftrace: allocating 40081 entries in 157 pages May 27 17:46:52.775192 kernel: ftrace: allocated 157 pages with 5 groups May 27 17:46:52.775199 kernel: Dynamic Preempt: voluntary May 27 17:46:52.775205 kernel: rcu: Preemptible hierarchical RCU implementation. May 27 17:46:52.775211 kernel: rcu: RCU event tracing is enabled. May 27 17:46:52.775217 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 27 17:46:52.775223 kernel: Trampoline variant of Tasks RCU enabled. May 27 17:46:52.775228 kernel: Rude variant of Tasks RCU enabled. May 27 17:46:52.775234 kernel: Tracing variant of Tasks RCU enabled. May 27 17:46:52.775239 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 27 17:46:52.775245 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 27 17:46:52.775250 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 27 17:46:52.775257 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 27 17:46:52.775262 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 27 17:46:52.775268 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 27 17:46:52.775273 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 27 17:46:52.775279 kernel: Console: colour VGA+ 80x25 May 27 17:46:52.775285 kernel: printk: legacy console [tty0] enabled May 27 17:46:52.775290 kernel: printk: legacy console [ttyS0] enabled May 27 17:46:52.775296 kernel: ACPI: Core revision 20240827 May 27 17:46:52.775306 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns May 27 17:46:52.775311 kernel: APIC: Switch to symmetric I/O mode setup May 27 17:46:52.775317 kernel: x2apic enabled May 27 17:46:52.775324 kernel: APIC: Switched APIC routing to: physical x2apic May 27 17:46:52.775330 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 27 17:46:52.775336 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns May 27 17:46:52.775342 kernel: Calibrating delay loop (skipped) preset value.. 
4890.80 BogoMIPS (lpj=2445404) May 27 17:46:52.775347 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated May 27 17:46:52.775353 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 May 27 17:46:52.775360 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 May 27 17:46:52.775366 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 27 17:46:52.775372 kernel: Spectre V2 : Mitigation: Retpolines May 27 17:46:52.775377 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT May 27 17:46:52.775383 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls May 27 17:46:52.775389 kernel: RETBleed: Mitigation: untrained return thunk May 27 17:46:52.775395 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier May 27 17:46:52.775400 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl May 27 17:46:52.775407 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 27 17:46:52.775413 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 27 17:46:52.775419 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 27 17:46:52.775425 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 27 17:46:52.775431 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. May 27 17:46:52.775436 kernel: Freeing SMP alternatives memory: 32K May 27 17:46:52.775442 kernel: pid_max: default: 32768 minimum: 301 May 27 17:46:52.775448 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 27 17:46:52.775453 kernel: landlock: Up and running. May 27 17:46:52.775460 kernel: SELinux: Initializing. May 27 17:46:52.775466 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 27 17:46:52.775472 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 27 17:46:52.775478 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0) May 27 17:46:52.775483 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. May 27 17:46:52.775489 kernel: ... version: 0 May 27 17:46:52.775495 kernel: ... bit width: 48 May 27 17:46:52.775500 kernel: ... generic registers: 6 May 27 17:46:52.775506 kernel: ... value mask: 0000ffffffffffff May 27 17:46:52.775513 kernel: ... max period: 00007fffffffffff May 27 17:46:52.775534 kernel: ... fixed-purpose events: 0 May 27 17:46:52.775541 kernel: ... event mask: 000000000000003f May 27 17:46:52.775547 kernel: signal: max sigframe size: 1776 May 27 17:46:52.775553 kernel: rcu: Hierarchical SRCU implementation. May 27 17:46:52.775559 kernel: rcu: Max phase no-delay instances is 400. May 27 17:46:52.775564 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 27 17:46:52.775570 kernel: smp: Bringing up secondary CPUs ... May 27 17:46:52.775576 kernel: smpboot: x86: Booting SMP configuration: May 27 17:46:52.775583 kernel: .... 
node #0, CPUs: #1 May 27 17:46:52.775589 kernel: smp: Brought up 1 node, 2 CPUs May 27 17:46:52.775595 kernel: smpboot: Total of 2 processors activated (9781.61 BogoMIPS) May 27 17:46:52.775601 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2430K rwdata, 9952K rodata, 54416K init, 2552K bss, 125140K reserved, 0K cma-reserved) May 27 17:46:52.775607 kernel: devtmpfs: initialized May 27 17:46:52.775612 kernel: x86/mm: Memory block size: 128MB May 27 17:46:52.775618 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 27 17:46:52.775624 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 27 17:46:52.775630 kernel: pinctrl core: initialized pinctrl subsystem May 27 17:46:52.775637 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 27 17:46:52.775642 kernel: audit: initializing netlink subsys (disabled) May 27 17:46:52.775655 kernel: audit: type=2000 audit(1748368010.395:1): state=initialized audit_enabled=0 res=1 May 27 17:46:52.775661 kernel: thermal_sys: Registered thermal governor 'step_wise' May 27 17:46:52.775667 kernel: thermal_sys: Registered thermal governor 'user_space' May 27 17:46:52.775673 kernel: cpuidle: using governor menu May 27 17:46:52.775678 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 27 17:46:52.775684 kernel: dca service started, version 1.12.1 May 27 17:46:52.775690 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] May 27 17:46:52.775697 kernel: PCI: Using configuration type 1 for base access May 27 17:46:52.775703 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 27 17:46:52.775709 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 27 17:46:52.775715 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 27 17:46:52.775720 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 27 17:46:52.775726 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 27 17:46:52.775732 kernel: ACPI: Added _OSI(Module Device) May 27 17:46:52.775737 kernel: ACPI: Added _OSI(Processor Device) May 27 17:46:52.775743 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 27 17:46:52.775750 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 27 17:46:52.775756 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 27 17:46:52.775761 kernel: ACPI: Interpreter enabled May 27 17:46:52.775767 kernel: ACPI: PM: (supports S0 S5) May 27 17:46:52.775773 kernel: ACPI: Using IOAPIC for interrupt routing May 27 17:46:52.775778 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 27 17:46:52.775784 kernel: PCI: Using E820 reservations for host bridge windows May 27 17:46:52.775790 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F May 27 17:46:52.775796 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 27 17:46:52.775903 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 27 17:46:52.775967 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] May 27 17:46:52.776025 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] May 27 17:46:52.776033 kernel: PCI host bridge to bus 0000:00 May 27 17:46:52.776096 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 27 17:46:52.776149 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff 
window] May 27 17:46:52.776202 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 27 17:46:52.776251 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window] May 27 17:46:52.776300 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 27 17:46:52.776349 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] May 27 17:46:52.776399 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 27 17:46:52.776469 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint May 27 17:46:52.776562 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint May 27 17:46:52.776632 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref] May 27 17:46:52.776702 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref] May 27 17:46:52.776760 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff] May 27 17:46:52.776816 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref] May 27 17:46:52.776874 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 27 17:46:52.776937 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.776999 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff] May 27 17:46:52.777056 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 27 17:46:52.777111 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] May 27 17:46:52.777167 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] May 27 17:46:52.777229 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.777287 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff] May 27 17:46:52.777344 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 27 17:46:52.777402 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] May 27 17:46:52.777458 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] May 27 17:46:52.777553 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.777621 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff] May 27 17:46:52.777690 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 27 17:46:52.777747 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] May 27 17:46:52.777804 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] May 27 17:46:52.777872 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.777930 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff] May 27 17:46:52.777988 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 27 17:46:52.778045 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] May 27 17:46:52.778101 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] May 27 17:46:52.778186 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.778280 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff] May 27 17:46:52.778349 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] May 27 17:46:52.778409 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff] May 27 17:46:52.778468 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] May 27 17:46:52.779572 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.779666 kernel: pci 
0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff] May 27 17:46:52.779731 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 27 17:46:52.779790 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] May 27 17:46:52.779848 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] May 27 17:46:52.779917 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.779975 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff] May 27 17:46:52.780032 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 27 17:46:52.780088 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] May 27 17:46:52.780144 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] May 27 17:46:52.780209 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.780270 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff] May 27 17:46:52.780328 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 27 17:46:52.780383 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] May 27 17:46:52.780439 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] May 27 17:46:52.780502 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 27 17:46:52.781612 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff] May 27 17:46:52.781694 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 27 17:46:52.781760 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] May 27 17:46:52.781820 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] May 27 17:46:52.781884 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint May 27 17:46:52.781944 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO May 27 17:46:52.782007 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint May 27 17:46:52.782065 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f] May 27 17:46:52.782123 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff] May 27 17:46:52.782191 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint May 27 17:46:52.782249 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] May 27 17:46:52.782318 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint May 27 17:46:52.782379 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff] May 27 17:46:52.782439 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] May 27 17:46:52.782498 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref] May 27 17:46:52.783194 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 27 17:46:52.783272 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint May 27 17:46:52.783337 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit] May 27 17:46:52.783399 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 27 17:46:52.783467 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint May 27 17:46:52.783550 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff] May 27 17:46:52.783617 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref] May 27 17:46:52.783694 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 27 17:46:52.783851 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint May 27 17:46:52.783924 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] May 
27 17:46:52.784032 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 27 17:46:52.784104 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint May 27 17:46:52.784164 kernel: pci 0000:05:00.0: BAR 1 [mem 0xfe000000-0xfe000fff] May 27 17:46:52.784435 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref] May 27 17:46:52.784502 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] May 27 17:46:52.784590 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint May 27 17:46:52.784663 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff] May 27 17:46:52.784724 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref] May 27 17:46:52.784782 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 27 17:46:52.784790 kernel: acpiphp: Slot [0] registered May 27 17:46:52.784860 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint May 27 17:46:52.784921 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff] May 27 17:46:52.784980 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref] May 27 17:46:52.785038 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref] May 27 17:46:52.785095 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 27 17:46:52.785104 kernel: acpiphp: Slot [0-2] registered May 27 17:46:52.785159 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 27 17:46:52.785167 kernel: acpiphp: Slot [0-3] registered May 27 17:46:52.785225 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 27 17:46:52.785233 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 27 17:46:52.785239 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 27 17:46:52.785245 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 27 17:46:52.785251 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 27 17:46:52.785257 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 May 27 17:46:52.785263 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 May 27 17:46:52.785268 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 May 27 17:46:52.785276 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 May 27 17:46:52.785282 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 May 27 17:46:52.785287 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 May 27 17:46:52.785293 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 May 27 17:46:52.785299 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 May 27 17:46:52.785304 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 May 27 17:46:52.785310 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 May 27 17:46:52.785316 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 May 27 17:46:52.785321 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 May 27 17:46:52.785328 kernel: iommu: Default domain type: Translated May 27 17:46:52.785334 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 27 17:46:52.785340 kernel: PCI: Using ACPI for IRQ routing May 27 17:46:52.785346 kernel: PCI: pci_cache_line_size set to 64 bytes May 27 17:46:52.785351 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 27 17:46:52.785357 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff] May 27 17:46:52.785413 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device May 27 17:46:52.785469 kernel: pci 0000:00:01.0: vgaarb: bridge control possible May 27 17:46:52.787433 kernel: pci 0000:00:01.0: vgaarb: VGA 
device added: decodes=io+mem,owns=io+mem,locks=none May 27 17:46:52.787450 kernel: vgaarb: loaded May 27 17:46:52.787457 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 May 27 17:46:52.787464 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter May 27 17:46:52.787470 kernel: clocksource: Switched to clocksource kvm-clock May 27 17:46:52.787476 kernel: VFS: Disk quotas dquot_6.6.0 May 27 17:46:52.787482 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 27 17:46:52.787487 kernel: pnp: PnP ACPI init May 27 17:46:52.787584 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved May 27 17:46:52.787596 kernel: pnp: PnP ACPI: found 5 devices May 27 17:46:52.787604 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 27 17:46:52.787610 kernel: NET: Registered PF_INET protocol family May 27 17:46:52.787616 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) May 27 17:46:52.787622 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) May 27 17:46:52.787628 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 27 17:46:52.787634 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) May 27 17:46:52.787640 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) May 27 17:46:52.787654 kernel: TCP: Hash tables configured (established 16384 bind 16384) May 27 17:46:52.787662 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) May 27 17:46:52.787668 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) May 27 17:46:52.787674 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 27 17:46:52.787680 kernel: NET: Registered PF_XDP protocol family May 27 17:46:52.787745 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 May 27 17:46:52.787806 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 May 27 17:46:52.787864 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 May 27 17:46:52.787922 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned May 27 17:46:52.787981 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned May 27 17:46:52.788049 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned May 27 17:46:52.788110 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 27 17:46:52.788169 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] May 27 17:46:52.788234 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] May 27 17:46:52.788292 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 27 17:46:52.788351 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] May 27 17:46:52.788411 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] May 27 17:46:52.788473 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 27 17:46:52.789579 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] May 27 17:46:52.789690 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] May 27 17:46:52.789756 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 27 17:46:52.789817 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] May 27 17:46:52.789876 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] May 27 17:46:52.789938 kernel: pci 0000:00:02.4: PCI bridge 
to [bus 05] May 27 17:46:52.789996 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff] May 27 17:46:52.790057 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] May 27 17:46:52.790115 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 27 17:46:52.790173 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] May 27 17:46:52.790231 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] May 27 17:46:52.790287 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 27 17:46:52.790345 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] May 27 17:46:52.790405 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] May 27 17:46:52.790499 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] May 27 17:46:52.791609 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 27 17:46:52.791690 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] May 27 17:46:52.791750 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] May 27 17:46:52.791809 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] May 27 17:46:52.791867 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 27 17:46:52.791923 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] May 27 17:46:52.791981 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] May 27 17:46:52.792043 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] May 27 17:46:52.792098 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 27 17:46:52.792149 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 27 17:46:52.792199 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 27 17:46:52.792249 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window] May 27 17:46:52.792298 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] May 27 17:46:52.792347 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] May 27 17:46:52.792407 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff] May 27 17:46:52.792465 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref] May 27 17:46:52.792540 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff] May 27 17:46:52.792599 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] May 27 17:46:52.792680 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff] May 27 17:46:52.792783 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] May 27 17:46:52.792846 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff] May 27 17:46:52.792905 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] May 27 17:46:52.792963 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff] May 27 17:46:52.793020 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] May 27 17:46:52.793079 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff] May 27 17:46:52.793133 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] May 27 17:46:52.793193 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] May 27 17:46:52.793250 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff] May 27 17:46:52.793336 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] May 27 17:46:52.793400 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] May 27 17:46:52.796602 kernel: pci_bus 
0000:08: resource 1 [mem 0xfda00000-0xfdbfffff] May 27 17:46:52.796719 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] May 27 17:46:52.796800 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] May 27 17:46:52.796858 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff] May 27 17:46:52.796916 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] May 27 17:46:52.796926 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 May 27 17:46:52.796933 kernel: PCI: CLS 0 bytes, default 64 May 27 17:46:52.796939 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc319723, max_idle_ns: 440795258057 ns May 27 17:46:52.796946 kernel: Initialise system trusted keyrings May 27 17:46:52.796952 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 May 27 17:46:52.796958 kernel: Key type asymmetric registered May 27 17:46:52.796965 kernel: Asymmetric key parser 'x509' registered May 27 17:46:52.796973 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 27 17:46:52.796979 kernel: io scheduler mq-deadline registered May 27 17:46:52.796986 kernel: io scheduler kyber registered May 27 17:46:52.796992 kernel: io scheduler bfq registered May 27 17:46:52.797054 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 May 27 17:46:52.797118 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 May 27 17:46:52.797178 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 May 27 17:46:52.797237 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 May 27 17:46:52.797294 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 May 27 17:46:52.797355 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 May 27 17:46:52.797412 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 May 27 17:46:52.797470 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 May 27 17:46:52.797585 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 May 27 17:46:52.797687 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 May 27 17:46:52.797764 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 May 27 17:46:52.797823 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 May 27 17:46:52.797881 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 May 27 17:46:52.797942 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 May 27 17:46:52.797999 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 May 27 17:46:52.798055 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 May 27 17:46:52.798065 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 May 27 17:46:52.798119 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 May 27 17:46:52.798176 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 May 27 17:46:52.798187 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 27 17:46:52.798194 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 May 27 17:46:52.798200 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 27 17:46:52.798207 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 27 17:46:52.798213 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 27 17:46:52.798219 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 27 17:46:52.798225 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 27 17:46:52.798231 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 27 17:46:52.798364 kernel: rtc_cmos 00:03: RTC can wake from S4 May 27 17:46:52.798440 kernel: 
rtc_cmos 00:03: registered as rtc0 May 27 17:46:52.798495 kernel: rtc_cmos 00:03: setting system clock to 2025-05-27T17:46:52 UTC (1748368012) May 27 17:46:52.799599 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs May 27 17:46:52.799612 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 27 17:46:52.799619 kernel: NET: Registered PF_INET6 protocol family May 27 17:46:52.799625 kernel: Segment Routing with IPv6 May 27 17:46:52.799631 kernel: In-situ OAM (IOAM) with IPv6 May 27 17:46:52.799642 kernel: NET: Registered PF_PACKET protocol family May 27 17:46:52.799657 kernel: Key type dns_resolver registered May 27 17:46:52.799663 kernel: IPI shorthand broadcast: enabled May 27 17:46:52.799670 kernel: sched_clock: Marking stable (2882011746, 142841278)->(3032865728, -8012704) May 27 17:46:52.799676 kernel: registered taskstats version 1 May 27 17:46:52.799682 kernel: Loading compiled-in X.509 certificates May 27 17:46:52.799688 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: 9507e5c390e18536b38d58c90da64baf0ac9837c' May 27 17:46:52.799694 kernel: Demotion targets for Node 0: null May 27 17:46:52.799701 kernel: Key type .fscrypt registered May 27 17:46:52.799708 kernel: Key type fscrypt-provisioning registered May 27 17:46:52.799714 kernel: ima: No TPM chip found, activating TPM-bypass! May 27 17:46:52.799720 kernel: ima: Allocated hash algorithm: sha1 May 27 17:46:52.799726 kernel: ima: No architecture policies found May 27 17:46:52.799732 kernel: clk: Disabling unused clocks May 27 17:46:52.799739 kernel: Warning: unable to open an initial console. May 27 17:46:52.799745 kernel: Freeing unused kernel image (initmem) memory: 54416K May 27 17:46:52.799751 kernel: Write protecting the kernel read-only data: 24576k May 27 17:46:52.799757 kernel: Freeing unused kernel image (rodata/data gap) memory: 288K May 27 17:46:52.799764 kernel: Run /init as init process May 27 17:46:52.799770 kernel: with arguments: May 27 17:46:52.799776 kernel: /init May 27 17:46:52.799782 kernel: with environment: May 27 17:46:52.799788 kernel: HOME=/ May 27 17:46:52.799794 kernel: TERM=linux May 27 17:46:52.799800 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 27 17:46:52.799808 systemd[1]: Successfully made /usr/ read-only. May 27 17:46:52.799817 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 17:46:52.799826 systemd[1]: Detected virtualization kvm. May 27 17:46:52.799833 systemd[1]: Detected architecture x86-64. May 27 17:46:52.799839 systemd[1]: Running in initrd. May 27 17:46:52.799845 systemd[1]: No hostname configured, using default hostname. May 27 17:46:52.799852 systemd[1]: Hostname set to . May 27 17:46:52.799859 systemd[1]: Initializing machine ID from VM UUID. May 27 17:46:52.799865 systemd[1]: Queued start job for default target initrd.target. May 27 17:46:52.799873 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:46:52.799879 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
May 27 17:46:52.799887 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 27 17:46:52.799894 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 17:46:52.799901 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 27 17:46:52.799908 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 27 17:46:52.799915 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 27 17:46:52.799923 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 27 17:46:52.799930 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:46:52.799937 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 17:46:52.799943 systemd[1]: Reached target paths.target - Path Units. May 27 17:46:52.799950 systemd[1]: Reached target slices.target - Slice Units. May 27 17:46:52.799956 systemd[1]: Reached target swap.target - Swaps. May 27 17:46:52.799963 systemd[1]: Reached target timers.target - Timer Units. May 27 17:46:52.799969 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 27 17:46:52.799977 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 17:46:52.799984 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 27 17:46:52.799990 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 27 17:46:52.799997 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 17:46:52.800004 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 17:46:52.800010 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:46:52.800017 systemd[1]: Reached target sockets.target - Socket Units. May 27 17:46:52.800023 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 27 17:46:52.800030 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 17:46:52.800038 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 27 17:46:52.800045 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 27 17:46:52.800052 systemd[1]: Starting systemd-fsck-usr.service... May 27 17:46:52.800058 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 17:46:52.800065 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 17:46:52.800071 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:46:52.800094 systemd-journald[216]: Collecting audit messages is disabled. May 27 17:46:52.800114 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 27 17:46:52.800122 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:46:52.800130 systemd[1]: Finished systemd-fsck-usr.service. May 27 17:46:52.800137 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... 
May 27 17:46:52.800144 systemd-journald[216]: Journal started May 27 17:46:52.800161 systemd-journald[216]: Runtime Journal (/run/log/journal/0a4b3c57cc0d42afade77b9863612342) is 4.8M, max 38.6M, 33.7M free. May 27 17:46:52.796745 systemd-modules-load[217]: Inserted module 'overlay' May 27 17:46:52.838906 systemd[1]: Started systemd-journald.service - Journal Service. May 27 17:46:52.838924 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 27 17:46:52.838935 kernel: Bridge firewalling registered May 27 17:46:52.815618 systemd-modules-load[217]: Inserted module 'br_netfilter' May 27 17:46:52.844609 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 17:46:52.846154 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:46:52.849115 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 27 17:46:52.850722 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 17:46:52.852618 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 17:46:52.858214 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:46:52.860616 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 17:46:52.864739 systemd-tmpfiles[235]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 27 17:46:52.866724 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 17:46:52.870717 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:46:52.875261 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 17:46:52.876766 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:46:52.879013 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 17:46:52.884036 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 17:46:52.895594 dracut-cmdline[256]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=daa3e2d55cc4a7ff0ec15aa9bb0c07df9999cb4e3041f3adad1b1101efdea101 May 27 17:46:52.910565 systemd-resolved[251]: Positive Trust Anchors: May 27 17:46:52.910576 systemd-resolved[251]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 17:46:52.910601 systemd-resolved[251]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 17:46:52.915956 systemd-resolved[251]: Defaulting to hostname 'linux'. May 27 17:46:52.916847 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 17:46:52.917563 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 17:46:52.955562 kernel: SCSI subsystem initialized May 27 17:46:52.963554 kernel: Loading iSCSI transport class v2.0-870. May 27 17:46:52.971545 kernel: iscsi: registered transport (tcp) May 27 17:46:52.987552 kernel: iscsi: registered transport (qla4xxx) May 27 17:46:52.987588 kernel: QLogic iSCSI HBA Driver May 27 17:46:53.003182 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 17:46:53.014384 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:46:53.017015 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 17:46:53.050333 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 17:46:53.052341 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 27 17:46:53.097559 kernel: raid6: avx2x4 gen() 34507 MB/s May 27 17:46:53.113547 kernel: raid6: avx2x2 gen() 32715 MB/s May 27 17:46:53.130661 kernel: raid6: avx2x1 gen() 23342 MB/s May 27 17:46:53.130708 kernel: raid6: using algorithm avx2x4 gen() 34507 MB/s May 27 17:46:53.148745 kernel: raid6: .... xor() 4672 MB/s, rmw enabled May 27 17:46:53.148806 kernel: raid6: using avx2x2 recovery algorithm May 27 17:46:53.165565 kernel: xor: automatically using best checksumming function avx May 27 17:46:53.281570 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 17:46:53.286943 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 17:46:53.288596 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:46:53.308944 systemd-udevd[464]: Using default interface naming scheme 'v255'. May 27 17:46:53.312489 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:46:53.314963 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 17:46:53.336790 dracut-pre-trigger[476]: rd.md=0: removing MD RAID activation May 27 17:46:53.354560 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 17:46:53.356008 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 17:46:53.394104 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:46:53.396679 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
May 27 17:46:53.447573 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues May 27 17:46:53.452943 kernel: scsi host0: Virtio SCSI HBA May 27 17:46:53.453088 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 May 27 17:46:53.465554 kernel: libata version 3.00 loaded. May 27 17:46:53.489169 kernel: ahci 0000:00:1f.2: version 3.0 May 27 17:46:53.489386 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 May 27 17:46:53.495428 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode May 27 17:46:53.495667 kernel: cryptd: max_cpu_qlen set to 1000 May 27 17:46:53.495680 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) May 27 17:46:53.495769 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only May 27 17:46:53.500547 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2 May 27 17:46:53.503810 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:46:53.504553 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:46:53.506598 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:46:53.509441 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:46:53.520551 kernel: ACPI: bus type USB registered May 27 17:46:53.522551 kernel: usbcore: registered new interface driver usbfs May 27 17:46:53.525595 kernel: usbcore: registered new interface driver hub May 27 17:46:53.525660 kernel: usbcore: registered new device driver usb May 27 17:46:53.525671 kernel: scsi host1: ahci May 27 17:46:53.535558 kernel: AES CTR mode by8 optimization enabled May 27 17:46:53.570228 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 27 17:46:53.570474 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 May 27 17:46:53.570591 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 May 27 17:46:53.570686 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 27 17:46:53.570762 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 May 27 17:46:53.570840 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed May 27 17:46:53.570915 kernel: hub 1-0:1.0: USB hub found May 27 17:46:53.571011 kernel: hub 1-0:1.0: 4 ports detected May 27 17:46:53.572771 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. May 27 17:46:53.574579 kernel: hub 2-0:1.0: USB hub found May 27 17:46:53.574690 kernel: hub 2-0:1.0: 4 ports detected May 27 17:46:53.581546 kernel: sd 0:0:0:0: Power-on or device reset occurred May 27 17:46:53.581711 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) May 27 17:46:53.581795 kernel: sd 0:0:0:0: [sda] Write Protect is off May 27 17:46:53.581869 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 May 27 17:46:53.581945 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA May 27 17:46:53.586581 kernel: scsi host2: ahci May 27 17:46:53.587570 kernel: scsi host3: ahci May 27 17:46:53.587610 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 27 17:46:53.587621 kernel: GPT:17805311 != 80003071 May 27 17:46:53.587629 kernel: GPT:Alternate GPT header not at the end of the disk. May 27 17:46:53.587637 kernel: GPT:17805311 != 80003071 May 27 17:46:53.587644 kernel: GPT: Use GNU Parted to correct GPT errors. 
May 27 17:46:53.587666 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 27 17:46:53.587674 kernel: sd 0:0:0:0: [sda] Attached SCSI disk May 27 17:46:53.592542 kernel: scsi host4: ahci May 27 17:46:53.592711 kernel: scsi host5: ahci May 27 17:46:53.595721 kernel: scsi host6: ahci May 27 17:46:53.595844 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 48 lpm-pol 0 May 27 17:46:53.595854 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 48 lpm-pol 0 May 27 17:46:53.595867 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 48 lpm-pol 0 May 27 17:46:53.595875 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 48 lpm-pol 0 May 27 17:46:53.595882 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 48 lpm-pol 0 May 27 17:46:53.595889 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 48 lpm-pol 0 May 27 17:46:53.649654 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. May 27 17:46:53.682019 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:46:53.690260 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. May 27 17:46:53.697726 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 27 17:46:53.703823 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. May 27 17:46:53.704346 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. May 27 17:46:53.707027 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 17:46:53.719923 disk-uuid[618]: Primary Header is updated. May 27 17:46:53.719923 disk-uuid[618]: Secondary Entries is updated. May 27 17:46:53.719923 disk-uuid[618]: Secondary Header is updated. 
May 27 17:46:53.731570 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 27 17:46:53.747549 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 27 17:46:53.812633 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd May 27 17:46:53.909273 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) May 27 17:46:53.909347 kernel: ata3: SATA link down (SStatus 0 SControl 300) May 27 17:46:53.909359 kernel: ata2: SATA link down (SStatus 0 SControl 300) May 27 17:46:53.909369 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 May 27 17:46:53.909388 kernel: ata1.00: applying bridge limits May 27 17:46:53.909397 kernel: ata5: SATA link down (SStatus 0 SControl 300) May 27 17:46:53.911803 kernel: ata6: SATA link down (SStatus 0 SControl 300) May 27 17:46:53.912546 kernel: ata4: SATA link down (SStatus 0 SControl 300) May 27 17:46:53.913551 kernel: ata1.00: configured for UDMA/100 May 27 17:46:53.915185 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 May 27 17:46:53.949574 kernel: hid: raw HID events driver (C) Jiri Kosina May 27 17:46:53.953591 kernel: usbcore: registered new interface driver usbhid May 27 17:46:53.953642 kernel: usbhid: USB HID core driver May 27 17:46:53.959866 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3 May 27 17:46:53.959899 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 May 27 17:46:53.963877 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray May 27 17:46:53.964054 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 27 17:46:53.987544 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 May 27 17:46:54.290165 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 17:46:54.291148 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 27 17:46:54.292169 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:46:54.293485 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 17:46:54.295491 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 17:46:54.317469 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 27 17:46:54.760595 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 27 17:46:54.760672 disk-uuid[619]: The operation has completed successfully. May 27 17:46:54.804339 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 17:46:54.804422 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 17:46:54.828618 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 17:46:54.844257 sh[663]: Success May 27 17:46:54.860800 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 27 17:46:54.860849 kernel: device-mapper: uevent: version 1.0.3 May 27 17:46:54.861963 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 17:46:54.871624 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 27 17:46:54.920617 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 17:46:54.924278 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 17:46:54.932743 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
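The verity-setup step that finishes above builds /dev/mapper/usr from parameters on the kernel command line; Flatcar passes the device as mount.usr and the dm-verity root hash as verity.usrhash. A minimal sketch of reading those values from /proc/cmdline (illustrative only, not the verity-setup.service implementation):

    # Parse /proc/cmdline into key=value pairs and pull out the dm-verity
    # parameters an initrd-side helper would need for /usr.
    def parse_cmdline(path="/proc/cmdline"):
        args = {}
        with open(path) as f:
            for token in f.read().split():
                key, _, value = token.partition("=")
                args[key] = value
        return args

    args = parse_cmdline()
    print("usr device :", args.get("mount.usr"))
    print("root hash  :", args.get("verity.usrhash"))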
May 27 17:46:54.943937 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 17:46:54.943974 kernel: BTRFS: device fsid 7caef027-0915-4c01-a3d5-28eff70f7ebd devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (675) May 27 17:46:54.948612 kernel: BTRFS info (device dm-0): first mount of filesystem 7caef027-0915-4c01-a3d5-28eff70f7ebd May 27 17:46:54.948671 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 27 17:46:54.951162 kernel: BTRFS info (device dm-0): using free-space-tree May 27 17:46:54.959023 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 17:46:54.959945 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 17:46:54.960798 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 27 17:46:54.961459 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 17:46:54.964644 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 27 17:46:54.991573 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (710) May 27 17:46:54.993915 kernel: BTRFS info (device sda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:46:54.993947 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 27 17:46:54.995997 kernel: BTRFS info (device sda6): using free-space-tree May 27 17:46:55.002685 kernel: BTRFS info (device sda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:46:55.003183 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 17:46:55.004623 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 27 17:46:55.069927 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 17:46:55.077237 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 17:46:55.089845 ignition[776]: Ignition 2.21.0 May 27 17:46:55.089860 ignition[776]: Stage: fetch-offline May 27 17:46:55.089885 ignition[776]: no configs at "/usr/lib/ignition/base.d" May 27 17:46:55.089892 ignition[776]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:46:55.091685 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:46:55.089954 ignition[776]: parsed url from cmdline: "" May 27 17:46:55.089956 ignition[776]: no config URL provided May 27 17:46:55.089960 ignition[776]: reading system config file "/usr/lib/ignition/user.ign" May 27 17:46:55.089964 ignition[776]: no config at "/usr/lib/ignition/user.ign" May 27 17:46:55.089968 ignition[776]: failed to fetch config: resource requires networking May 27 17:46:55.090154 ignition[776]: Ignition finished successfully May 27 17:46:55.103894 systemd-networkd[848]: lo: Link UP May 27 17:46:55.103904 systemd-networkd[848]: lo: Gained carrier May 27 17:46:55.105274 systemd-networkd[848]: Enumeration completed May 27 17:46:55.105424 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 17:46:55.105878 systemd-networkd[848]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:46:55.105882 systemd-networkd[848]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 27 17:46:55.106411 systemd[1]: Reached target network.target - Network. May 27 17:46:55.106794 systemd-networkd[848]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:46:55.106797 systemd-networkd[848]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:46:55.107457 systemd-networkd[848]: eth0: Link UP May 27 17:46:55.107460 systemd-networkd[848]: eth0: Gained carrier May 27 17:46:55.107467 systemd-networkd[848]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:46:55.109619 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 27 17:46:55.110698 systemd-networkd[848]: eth1: Link UP May 27 17:46:55.110701 systemd-networkd[848]: eth1: Gained carrier May 27 17:46:55.110708 systemd-networkd[848]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:46:55.128991 ignition[852]: Ignition 2.21.0 May 27 17:46:55.129004 ignition[852]: Stage: fetch May 27 17:46:55.129110 ignition[852]: no configs at "/usr/lib/ignition/base.d" May 27 17:46:55.129118 ignition[852]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:46:55.129176 ignition[852]: parsed url from cmdline: "" May 27 17:46:55.130591 systemd-networkd[848]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 27 17:46:55.129179 ignition[852]: no config URL provided May 27 17:46:55.129182 ignition[852]: reading system config file "/usr/lib/ignition/user.ign" May 27 17:46:55.129187 ignition[852]: no config at "/usr/lib/ignition/user.ign" May 27 17:46:55.129213 ignition[852]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 May 27 17:46:55.129509 ignition[852]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable May 27 17:46:55.174581 systemd-networkd[848]: eth0: DHCPv4 address 157.180.123.17/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 27 17:46:55.330766 ignition[852]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 May 27 17:46:55.337468 ignition[852]: GET result: OK May 27 17:46:55.337611 ignition[852]: parsing config with SHA512: 14edca29d2570f8dc6685f92f240eff6a8a92d1c359d835d0a63cf3e5402e315cf36c4515c2c210015e182f4799ced99e06125777a21574c299aada67c47df6a May 27 17:46:55.343586 unknown[852]: fetched base config from "system" May 27 17:46:55.343603 unknown[852]: fetched base config from "system" May 27 17:46:55.344072 ignition[852]: fetch: fetch complete May 27 17:46:55.343610 unknown[852]: fetched user config from "hetzner" May 27 17:46:55.344079 ignition[852]: fetch: fetch passed May 27 17:46:55.344138 ignition[852]: Ignition finished successfully May 27 17:46:55.347269 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 27 17:46:55.349644 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
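The fetch stage above shows the expected pattern on this platform: the first GET to the Hetzner metadata service fails because DHCP has not completed, and a later attempt succeeds, after which the config is identified by its SHA512. A minimal retry-and-hash sketch of that pattern, using only the endpoint visible in the log (illustrative only, not Ignition's own fetch code):

    import hashlib, time, urllib.error, urllib.request

    URL = "http://169.254.169.254/hetzner/v1/userdata"

    def fetch_userdata(url=URL, attempts=5, delay=2.0) -> bytes:
        # Retry until the link-local metadata service becomes reachable.
        for attempt in range(1, attempts + 1):
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return resp.read()
            except (urllib.error.URLError, OSError) as err:
                print(f"attempt #{attempt} failed: {err}")
                time.sleep(delay)
        raise RuntimeError("metadata service unreachable")

    body = fetch_userdata()
    print("config SHA512:", hashlib.sha512(body).hexdigest())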
May 27 17:46:55.376726 ignition[860]: Ignition 2.21.0 May 27 17:46:55.376741 ignition[860]: Stage: kargs May 27 17:46:55.376873 ignition[860]: no configs at "/usr/lib/ignition/base.d" May 27 17:46:55.376882 ignition[860]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:46:55.378237 ignition[860]: kargs: kargs passed May 27 17:46:55.378292 ignition[860]: Ignition finished successfully May 27 17:46:55.380814 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 17:46:55.383008 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 17:46:55.404292 ignition[866]: Ignition 2.21.0 May 27 17:46:55.404308 ignition[866]: Stage: disks May 27 17:46:55.404447 ignition[866]: no configs at "/usr/lib/ignition/base.d" May 27 17:46:55.404457 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:46:55.407363 ignition[866]: disks: disks passed May 27 17:46:55.407488 ignition[866]: Ignition finished successfully May 27 17:46:55.408389 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 17:46:55.409406 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 17:46:55.410103 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 17:46:55.411283 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 17:46:55.412446 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:46:55.413449 systemd[1]: Reached target basic.target - Basic System. May 27 17:46:55.415358 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 27 17:46:55.442320 systemd-fsck[875]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks May 27 17:46:55.446261 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 17:46:55.449059 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 17:46:55.543548 kernel: EXT4-fs (sda9): mounted filesystem bf93e767-f532-4480-b210-a196f7ac181e r/w with ordered data mode. Quota mode: none. May 27 17:46:55.544445 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 17:46:55.545371 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 17:46:55.547917 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:46:55.550591 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 17:46:55.552806 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 27 17:46:55.553981 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 17:46:55.554715 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:46:55.559504 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 17:46:55.561670 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 27 17:46:55.573557 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (883) May 27 17:46:55.578203 kernel: BTRFS info (device sda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:46:55.578271 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 27 17:46:55.578283 kernel: BTRFS info (device sda6): using free-space-tree May 27 17:46:55.596998 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 17:46:55.605470 coreos-metadata[885]: May 27 17:46:55.605 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 May 27 17:46:55.606800 coreos-metadata[885]: May 27 17:46:55.606 INFO Fetch successful May 27 17:46:55.607467 coreos-metadata[885]: May 27 17:46:55.606 INFO wrote hostname ci-4344-0-0-a-c8f0a3e630 to /sysroot/etc/hostname May 27 17:46:55.608431 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 17:46:55.614967 initrd-setup-root[911]: cut: /sysroot/etc/passwd: No such file or directory May 27 17:46:55.618433 initrd-setup-root[918]: cut: /sysroot/etc/group: No such file or directory May 27 17:46:55.622641 initrd-setup-root[925]: cut: /sysroot/etc/shadow: No such file or directory May 27 17:46:55.625781 initrd-setup-root[932]: cut: /sysroot/etc/gshadow: No such file or directory May 27 17:46:55.698347 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 17:46:55.699957 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 17:46:55.701466 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 17:46:55.719540 kernel: BTRFS info (device sda6): last unmount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:46:55.734721 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 17:46:55.740454 ignition[1000]: INFO : Ignition 2.21.0 May 27 17:46:55.740454 ignition[1000]: INFO : Stage: mount May 27 17:46:55.742302 ignition[1000]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:46:55.742302 ignition[1000]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:46:55.742302 ignition[1000]: INFO : mount: mount passed May 27 17:46:55.742302 ignition[1000]: INFO : Ignition finished successfully May 27 17:46:55.742590 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 27 17:46:55.744133 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 17:46:55.944569 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 17:46:55.946255 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 17:46:55.970625 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 (8:6) scanned by mount (1011) May 27 17:46:55.975287 kernel: BTRFS info (device sda6): first mount of filesystem be856aed-e34b-4b7b-be8a-0716b27db212 May 27 17:46:55.975337 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 27 17:46:55.978002 kernel: BTRFS info (device sda6): using free-space-tree May 27 17:46:55.990139 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
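The hostname lines above correspond to a simple fetch-and-write: query the Hetzner metadata endpoint for the hostname and place it in the target root so it takes effect after switch-root. A minimal sketch under those assumptions (paths and URL as in the log; not the coreos-metadata/Afterburn implementation):

    import urllib.request

    URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"

    # Fetch the hostname assigned to this server by the platform.
    with urllib.request.urlopen(URL, timeout=10) as resp:
        hostname = resp.read().decode().strip()

    # Write it into the mounted target root, as the log reports.
    with open("/sysroot/etc/hostname", "w") as f:
        f.write(hostname + "\n")

    print("wrote hostname", hostname)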
May 27 17:46:56.035481 ignition[1027]: INFO : Ignition 2.21.0 May 27 17:46:56.035481 ignition[1027]: INFO : Stage: files May 27 17:46:56.036975 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:46:56.036975 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:46:56.039035 ignition[1027]: DEBUG : files: compiled without relabeling support, skipping May 27 17:46:56.041306 ignition[1027]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 17:46:56.041306 ignition[1027]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 17:46:56.044378 ignition[1027]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 17:46:56.045469 ignition[1027]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 17:46:56.045469 ignition[1027]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 17:46:56.044770 unknown[1027]: wrote ssh authorized keys file for user: core May 27 17:46:56.048394 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" May 27 17:46:56.048394 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 May 27 17:46:56.340505 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 17:46:56.537829 systemd-networkd[848]: eth1: Gained IPv6LL May 27 17:46:56.985685 systemd-networkd[848]: eth0: Gained IPv6LL May 27 17:46:58.441910 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" May 27 17:46:58.441910 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 17:46:58.444375 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 27 17:46:58.444375 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 17:46:58.444375 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 17:46:58.444375 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:46:58.444375 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 17:46:58.444375 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:46:58.444375 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 17:46:58.450455 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:46:58.450455 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 17:46:58.450455 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 17:46:58.450455 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 17:46:58.450455 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 17:46:58.450455 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 May 27 17:46:59.262026 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 17:46:59.793760 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 27 17:46:59.793760 ignition[1027]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 17:46:59.796039 ignition[1027]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:46:59.797090 ignition[1027]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 17:46:59.797090 ignition[1027]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 17:46:59.797090 ignition[1027]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 27 17:46:59.797090 ignition[1027]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 27 17:46:59.797090 ignition[1027]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 27 17:46:59.797090 ignition[1027]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 27 17:46:59.797090 ignition[1027]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" May 27 17:46:59.807541 ignition[1027]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" May 27 17:46:59.807541 ignition[1027]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 17:46:59.807541 ignition[1027]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 17:46:59.807541 ignition[1027]: INFO : files: files passed May 27 17:46:59.807541 ignition[1027]: INFO : Ignition finished successfully May 27 17:46:59.798358 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 17:46:59.801609 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 27 17:46:59.804607 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 27 17:46:59.814922 systemd[1]: ignition-quench.service: Deactivated successfully. May 27 17:46:59.815018 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
May 27 17:46:59.820130 initrd-setup-root-after-ignition[1058]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:46:59.820130 initrd-setup-root-after-ignition[1058]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 27 17:46:59.822797 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 27 17:46:59.822685 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:46:59.824374 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 27 17:46:59.825680 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 27 17:46:59.856875 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 27 17:46:59.856951 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 27 17:46:59.858114 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 27 17:46:59.858911 systemd[1]: Reached target initrd.target - Initrd Default Target. May 27 17:46:59.859904 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 27 17:46:59.860486 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 27 17:46:59.894963 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:46:59.897449 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 27 17:46:59.912606 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 27 17:46:59.913715 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:46:59.914417 systemd[1]: Stopped target timers.target - Timer Units. May 27 17:46:59.915481 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 27 17:46:59.915695 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 27 17:46:59.916812 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 27 17:46:59.917511 systemd[1]: Stopped target basic.target - Basic System. May 27 17:46:59.918681 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 27 17:46:59.919712 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 27 17:46:59.920736 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 27 17:46:59.922027 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 27 17:46:59.923144 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 27 17:46:59.924277 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 27 17:46:59.925490 systemd[1]: Stopped target sysinit.target - System Initialization. May 27 17:46:59.926917 systemd[1]: Stopped target local-fs.target - Local File Systems. May 27 17:46:59.928420 systemd[1]: Stopped target swap.target - Swaps. May 27 17:46:59.929675 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 27 17:46:59.929781 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 27 17:46:59.931483 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 27 17:46:59.932836 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:46:59.934007 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
May 27 17:46:59.934385 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 27 17:46:59.935240 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 27 17:46:59.935351 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 27 17:46:59.936940 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 27 17:46:59.937069 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 27 17:46:59.937752 systemd[1]: ignition-files.service: Deactivated successfully. May 27 17:46:59.937865 systemd[1]: Stopped ignition-files.service - Ignition (files). May 27 17:46:59.939268 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 27 17:46:59.939443 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 17:46:59.942606 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 27 17:46:59.953768 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 27 17:46:59.954968 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 27 17:46:59.955132 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:46:59.957145 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 27 17:46:59.957278 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 27 17:46:59.963702 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 27 17:46:59.964419 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 27 17:46:59.972991 ignition[1082]: INFO : Ignition 2.21.0 May 27 17:46:59.972991 ignition[1082]: INFO : Stage: umount May 27 17:46:59.972991 ignition[1082]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 17:46:59.972991 ignition[1082]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 27 17:46:59.972991 ignition[1082]: INFO : umount: umount passed May 27 17:46:59.972991 ignition[1082]: INFO : Ignition finished successfully May 27 17:46:59.974952 systemd[1]: ignition-mount.service: Deactivated successfully. May 27 17:46:59.975060 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 27 17:46:59.977500 systemd[1]: ignition-disks.service: Deactivated successfully. May 27 17:46:59.978048 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 27 17:46:59.978978 systemd[1]: ignition-kargs.service: Deactivated successfully. May 27 17:46:59.979023 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 27 17:46:59.980364 systemd[1]: ignition-fetch.service: Deactivated successfully. May 27 17:46:59.980413 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 27 17:46:59.981874 systemd[1]: Stopped target network.target - Network. May 27 17:46:59.983190 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 27 17:46:59.983245 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 27 17:46:59.986292 systemd[1]: Stopped target paths.target - Path Units. May 27 17:46:59.989056 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 27 17:46:59.989824 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:46:59.991031 systemd[1]: Stopped target slices.target - Slice Units. May 27 17:46:59.991517 systemd[1]: Stopped target sockets.target - Socket Units. 
May 27 17:46:59.992307 systemd[1]: iscsid.socket: Deactivated successfully. May 27 17:46:59.992351 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 27 17:46:59.993323 systemd[1]: iscsiuio.socket: Deactivated successfully. May 27 17:46:59.993365 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 27 17:46:59.995147 systemd[1]: ignition-setup.service: Deactivated successfully. May 27 17:46:59.995215 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 27 17:46:59.996153 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 27 17:46:59.996210 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 27 17:46:59.997176 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 27 17:46:59.998157 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 27 17:46:59.999973 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 27 17:47:00.000457 systemd[1]: sysroot-boot.service: Deactivated successfully. May 27 17:47:00.000535 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 27 17:47:00.001817 systemd[1]: systemd-resolved.service: Deactivated successfully. May 27 17:47:00.001944 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 27 17:47:00.005194 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 27 17:47:00.005698 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 27 17:47:00.005753 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 27 17:47:00.006690 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 27 17:47:00.006724 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:47:00.010099 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 27 17:47:00.010363 systemd[1]: systemd-networkd.service: Deactivated successfully. May 27 17:47:00.010459 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 27 17:47:00.012294 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 27 17:47:00.012738 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 27 17:47:00.013492 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 27 17:47:00.013562 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 27 17:47:00.015265 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 27 17:47:00.016207 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 27 17:47:00.016244 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 17:47:00.018237 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 27 17:47:00.018272 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 27 17:47:00.018863 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 27 17:47:00.018895 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 27 17:47:00.020284 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:47:00.022020 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 27 17:47:00.029827 systemd[1]: systemd-udevd.service: Deactivated successfully. 
May 27 17:47:00.029938 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:47:00.030872 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 27 17:47:00.030904 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 27 17:47:00.031453 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 27 17:47:00.031483 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:47:00.032450 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 27 17:47:00.032483 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 27 17:47:00.034674 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 27 17:47:00.034735 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 27 17:47:00.035821 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 27 17:47:00.035873 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 17:47:00.037886 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 27 17:47:00.039860 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 27 17:47:00.039919 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:47:00.041373 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 27 17:47:00.041411 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 17:47:00.042827 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 27 17:47:00.042871 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:47:00.044513 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 27 17:47:00.044563 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:47:00.045464 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:47:00.045501 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:47:00.047451 systemd[1]: network-cleanup.service: Deactivated successfully. May 27 17:47:00.049596 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 27 17:47:00.052391 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 27 17:47:00.052455 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 27 17:47:00.053784 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 27 17:47:00.055626 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 27 17:47:00.076617 systemd[1]: Switching root. May 27 17:47:00.103888 systemd-journald[216]: Journal stopped May 27 17:47:00.956108 systemd-journald[216]: Received SIGTERM from PID 1 (systemd). 
May 27 17:47:00.956167 kernel: SELinux: policy capability network_peer_controls=1 May 27 17:47:00.956185 kernel: SELinux: policy capability open_perms=1 May 27 17:47:00.956202 kernel: SELinux: policy capability extended_socket_class=1 May 27 17:47:00.956212 kernel: SELinux: policy capability always_check_network=0 May 27 17:47:00.956221 kernel: SELinux: policy capability cgroup_seclabel=1 May 27 17:47:00.956234 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 27 17:47:00.956248 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 27 17:47:00.956261 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 27 17:47:00.956270 kernel: SELinux: policy capability userspace_initial_context=0 May 27 17:47:00.956278 kernel: audit: type=1403 audit(1748368020.235:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 27 17:47:00.956289 systemd[1]: Successfully loaded SELinux policy in 46.305ms. May 27 17:47:00.956308 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.112ms. May 27 17:47:00.956317 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 27 17:47:00.956326 systemd[1]: Detected virtualization kvm. May 27 17:47:00.956334 systemd[1]: Detected architecture x86-64. May 27 17:47:00.956342 systemd[1]: Detected first boot. May 27 17:47:00.956354 systemd[1]: Hostname set to . May 27 17:47:00.956362 systemd[1]: Initializing machine ID from VM UUID. May 27 17:47:00.956370 zram_generator::config[1125]: No configuration found. May 27 17:47:00.956381 kernel: Guest personality initialized and is inactive May 27 17:47:00.956388 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 27 17:47:00.956396 kernel: Initialized host personality May 27 17:47:00.956404 kernel: NET: Registered PF_VSOCK protocol family May 27 17:47:00.956411 systemd[1]: Populated /etc with preset unit settings. May 27 17:47:00.956421 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 27 17:47:00.956429 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 27 17:47:00.956437 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 27 17:47:00.956447 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 27 17:47:00.956456 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 27 17:47:00.956464 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 27 17:47:00.956472 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 27 17:47:00.956480 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 27 17:47:00.956491 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 27 17:47:00.956501 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 27 17:47:00.956509 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 27 17:47:00.956517 systemd[1]: Created slice user.slice - User and Session Slice. May 27 17:47:00.959242 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
May 27 17:47:00.959266 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 27 17:47:00.959284 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 27 17:47:00.959294 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 27 17:47:00.959304 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 27 17:47:00.959313 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 27 17:47:00.959321 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 27 17:47:00.959329 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 27 17:47:00.959338 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 27 17:47:00.959346 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 27 17:47:00.959354 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 27 17:47:00.959362 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 27 17:47:00.959371 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 27 17:47:00.959380 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 17:47:00.959388 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 17:47:00.959398 systemd[1]: Reached target slices.target - Slice Units. May 27 17:47:00.959406 systemd[1]: Reached target swap.target - Swaps. May 27 17:47:00.959414 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 27 17:47:00.959422 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 27 17:47:00.959430 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 27 17:47:00.959439 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 27 17:47:00.959448 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 27 17:47:00.959457 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 27 17:47:00.959465 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 27 17:47:00.959473 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 27 17:47:00.959481 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 27 17:47:00.959489 systemd[1]: Mounting media.mount - External Media Directory... May 27 17:47:00.959497 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:00.959505 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 27 17:47:00.959514 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 27 17:47:00.959544 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 27 17:47:00.959555 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 27 17:47:00.959563 systemd[1]: Reached target machines.target - Containers. May 27 17:47:00.959571 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
May 27 17:47:00.959580 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:47:00.959588 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 27 17:47:00.959597 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 27 17:47:00.959605 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 17:47:00.959615 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 17:47:00.959623 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 17:47:00.959631 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 27 17:47:00.959639 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 17:47:00.959648 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 27 17:47:00.959667 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 27 17:47:00.959676 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 27 17:47:00.959684 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 27 17:47:00.959693 systemd[1]: Stopped systemd-fsck-usr.service. May 27 17:47:00.959702 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:47:00.959711 systemd[1]: Starting systemd-journald.service - Journal Service... May 27 17:47:00.959719 kernel: loop: module loaded May 27 17:47:00.959728 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 27 17:47:00.959736 kernel: fuse: init (API version 7.41) May 27 17:47:00.959744 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 17:47:00.959753 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 27 17:47:00.959761 kernel: ACPI: bus type drm_connector registered May 27 17:47:00.959770 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 27 17:47:00.959779 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 17:47:00.959789 systemd[1]: verity-setup.service: Deactivated successfully. May 27 17:47:00.959798 systemd[1]: Stopped verity-setup.service. May 27 17:47:00.959807 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:00.959815 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 27 17:47:00.959824 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 27 17:47:00.959850 systemd-journald[1216]: Collecting audit messages is disabled. May 27 17:47:00.959872 systemd[1]: Mounted media.mount - External Media Directory. May 27 17:47:00.959881 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 27 17:47:00.959889 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 27 17:47:00.959897 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. 
May 27 17:47:00.959906 systemd-journald[1216]: Journal started May 27 17:47:00.959923 systemd-journald[1216]: Runtime Journal (/run/log/journal/0a4b3c57cc0d42afade77b9863612342) is 4.8M, max 38.6M, 33.7M free. May 27 17:47:00.693868 systemd[1]: Queued start job for default target multi-user.target. May 27 17:47:00.702674 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. May 27 17:47:00.703207 systemd[1]: systemd-journald.service: Deactivated successfully. May 27 17:47:00.962580 systemd[1]: Started systemd-journald.service - Journal Service. May 27 17:47:00.963718 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 27 17:47:00.964341 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 27 17:47:00.965133 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 27 17:47:00.965252 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 27 17:47:00.965970 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:47:00.966130 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 17:47:00.966792 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 17:47:00.966959 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 17:47:00.967729 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 17:47:00.967899 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 17:47:00.968647 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 27 17:47:00.968822 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 27 17:47:00.969446 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 17:47:00.969642 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 17:47:00.970324 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 17:47:00.971074 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 17:47:00.971797 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 27 17:47:00.972705 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 27 17:47:00.978977 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 17:47:00.980603 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 27 17:47:00.982628 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 27 17:47:00.983577 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 27 17:47:00.984577 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 17:47:00.985974 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 27 17:47:00.990614 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 27 17:47:00.991280 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:47:00.994413 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 27 17:47:00.996741 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
May 27 17:47:00.997431 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 17:47:01.000298 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 27 17:47:01.000812 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 17:47:01.002884 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 17:47:01.008150 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 27 17:47:01.010770 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 27 17:47:01.022288 systemd-journald[1216]: Time spent on flushing to /var/log/journal/0a4b3c57cc0d42afade77b9863612342 is 23.312ms for 1159 entries. May 27 17:47:01.022288 systemd-journald[1216]: System Journal (/var/log/journal/0a4b3c57cc0d42afade77b9863612342) is 8M, max 584.8M, 576.8M free. May 27 17:47:01.055577 systemd-journald[1216]: Received client request to flush runtime journal. May 27 17:47:01.055614 kernel: loop0: detected capacity change from 0 to 113872 May 27 17:47:01.013807 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 27 17:47:01.015634 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 27 17:47:01.036577 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 27 17:47:01.037488 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 27 17:47:01.041052 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 27 17:47:01.056742 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 27 17:47:01.063579 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 17:47:01.064341 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 17:47:01.077269 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 27 17:47:01.084676 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 27 17:47:01.083805 systemd-tmpfiles[1251]: ACLs are not supported, ignoring. May 27 17:47:01.083814 systemd-tmpfiles[1251]: ACLs are not supported, ignoring. May 27 17:47:01.088271 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 27 17:47:01.090623 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 27 17:47:01.102545 kernel: loop1: detected capacity change from 0 to 146240 May 27 17:47:01.132040 kernel: loop2: detected capacity change from 0 to 8 May 27 17:47:01.130071 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 27 17:47:01.131587 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 17:47:01.145680 kernel: loop3: detected capacity change from 0 to 229808 May 27 17:47:01.153444 systemd-tmpfiles[1274]: ACLs are not supported, ignoring. May 27 17:47:01.153702 systemd-tmpfiles[1274]: ACLs are not supported, ignoring. May 27 17:47:01.156945 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
May 27 17:47:01.185551 kernel: loop4: detected capacity change from 0 to 113872 May 27 17:47:01.200587 kernel: loop5: detected capacity change from 0 to 146240 May 27 17:47:01.222612 kernel: loop6: detected capacity change from 0 to 8 May 27 17:47:01.225711 kernel: loop7: detected capacity change from 0 to 229808 May 27 17:47:01.240779 (sd-merge)[1278]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. May 27 17:47:01.241092 (sd-merge)[1278]: Merged extensions into '/usr'. May 27 17:47:01.244458 systemd[1]: Reload requested from client PID 1250 ('systemd-sysext') (unit systemd-sysext.service)... May 27 17:47:01.244560 systemd[1]: Reloading... May 27 17:47:01.299556 zram_generator::config[1303]: No configuration found. May 27 17:47:01.389245 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:47:01.467730 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 27 17:47:01.468039 systemd[1]: Reloading finished in 223 ms. May 27 17:47:01.476154 ldconfig[1245]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 27 17:47:01.486208 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 27 17:47:01.487599 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 27 17:47:01.500618 systemd[1]: Starting ensure-sysext.service... May 27 17:47:01.503405 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 27 17:47:01.526398 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 27 17:47:01.526650 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 27 17:47:01.526882 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 27 17:47:01.527038 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 27 17:47:01.527249 systemd[1]: Reload requested from client PID 1347 ('systemctl') (unit ensure-sysext.service)... May 27 17:47:01.527260 systemd[1]: Reloading... May 27 17:47:01.527548 systemd-tmpfiles[1348]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 27 17:47:01.527742 systemd-tmpfiles[1348]: ACLs are not supported, ignoring. May 27 17:47:01.527778 systemd-tmpfiles[1348]: ACLs are not supported, ignoring. May 27 17:47:01.534002 systemd-tmpfiles[1348]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:47:01.534110 systemd-tmpfiles[1348]: Skipping /boot May 27 17:47:01.544952 systemd-tmpfiles[1348]: Detected autofs mount point /boot during canonicalization of boot. May 27 17:47:01.545083 systemd-tmpfiles[1348]: Skipping /boot May 27 17:47:01.582561 zram_generator::config[1375]: No configuration found. May 27 17:47:01.652815 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:47:01.716644 systemd[1]: Reloading finished in 189 ms. May 27 17:47:01.730664 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
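The (sd-merge) lines above show systemd-sysext picking up the extension images staged earlier and overlaying them onto /usr. A minimal discovery sketch that lists candidate images in some of the directories systemd-sysext scans; the directory list here is an assumption for illustration, and systemd-sysext(8) is the authoritative reference:

    import os

    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    # Print the raw images and extension directories a merge would consider.
    for d in SEARCH_DIRS:
        if not os.path.isdir(d):
            continue
        for entry in sorted(os.listdir(d)):
            path = os.path.join(d, entry)
            kind = "dir" if os.path.isdir(path) else "image"
            print(f"{d}: {entry} ({kind})")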
May 27 17:47:01.734636 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 17:47:01.739590 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 17:47:01.742646 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 27 17:47:01.748648 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 27 17:47:01.751613 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 17:47:01.754201 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 17:47:01.759682 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 27 17:47:01.768259 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:01.768383 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:47:01.770162 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 27 17:47:01.772366 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 27 17:47:01.775345 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 27 17:47:01.776690 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:47:01.776780 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:47:01.780553 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 27 17:47:01.780993 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:01.785725 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:01.786752 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:47:01.786907 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 27 17:47:01.787007 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:47:01.787107 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:01.790700 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:01.791702 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 27 17:47:01.796193 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 27 17:47:01.797668 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
May 27 17:47:01.797753 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 27 17:47:01.797859 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 27 17:47:01.798719 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 27 17:47:01.801699 systemd[1]: Finished ensure-sysext.service. May 27 17:47:01.805706 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 27 17:47:01.806365 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 27 17:47:01.806482 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 27 17:47:01.809978 systemd-udevd[1425]: Using default interface naming scheme 'v255'. May 27 17:47:01.812382 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 27 17:47:01.817855 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 27 17:47:01.819821 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 27 17:47:01.819959 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 27 17:47:01.820721 systemd[1]: modprobe@loop.service: Deactivated successfully. May 27 17:47:01.820850 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 27 17:47:01.823259 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 27 17:47:01.823332 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 27 17:47:01.832701 systemd[1]: modprobe@drm.service: Deactivated successfully. May 27 17:47:01.832828 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 27 17:47:01.842101 augenrules[1459]: No rules May 27 17:47:01.843053 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:47:01.843243 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:47:01.844895 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 27 17:47:01.854025 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 27 17:47:01.857928 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 17:47:01.860983 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 17:47:01.864025 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 27 17:47:01.864883 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 27 17:47:02.029558 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 May 27 17:47:02.037548 kernel: ACPI: button: Power Button [PWRF] May 27 17:47:02.051052 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 27 17:47:02.051743 systemd[1]: Reached target time-set.target - System Time Set. May 27 17:47:02.055430 systemd-resolved[1424]: Positive Trust Anchors: May 27 17:47:02.055448 systemd-resolved[1424]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 17:47:02.055473 systemd-resolved[1424]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 17:47:02.066774 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 27 17:47:02.068046 systemd-resolved[1424]: Using system hostname 'ci-4344-0-0-a-c8f0a3e630'. May 27 17:47:02.070195 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 17:47:02.070744 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 17:47:02.071373 systemd-networkd[1475]: lo: Link UP May 27 17:47:02.071603 systemd[1]: Reached target sysinit.target - System Initialization. May 27 17:47:02.072086 systemd-networkd[1475]: lo: Gained carrier May 27 17:47:02.072668 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 27 17:47:02.073590 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 27 17:47:02.074573 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 27 17:47:02.080091 systemd-networkd[1475]: Enumeration completed May 27 17:47:02.080814 systemd-networkd[1475]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:47:02.080871 systemd-networkd[1475]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 17:47:02.081912 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 27 17:47:02.082635 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 27 17:47:02.083229 systemd-networkd[1475]: eth1: Link UP May 27 17:47:02.083596 systemd-networkd[1475]: eth1: Gained carrier May 27 17:47:02.083634 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 27 17:47:02.084169 systemd-networkd[1475]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:47:02.084587 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 27 17:47:02.084687 systemd[1]: Reached target paths.target - Path Units. May 27 17:47:02.085564 systemd[1]: Reached target timers.target - Timer Units. May 27 17:47:02.087407 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 27 17:47:02.089032 systemd[1]: Starting docker.socket - Docker Socket for the API... May 27 17:47:02.090848 systemd-networkd[1475]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:47:02.090905 systemd-networkd[1475]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
May 27 17:47:02.091284 systemd-networkd[1475]: eth0: Link UP May 27 17:47:02.091591 systemd-networkd[1475]: eth0: Gained carrier May 27 17:47:02.091653 systemd-networkd[1475]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 17:47:02.092537 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 27 17:47:02.093266 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 27 17:47:02.093854 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 27 17:47:02.096489 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 27 17:47:02.097493 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 27 17:47:02.099135 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 17:47:02.100014 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 27 17:47:02.105572 kernel: mousedev: PS/2 mouse device common for all mice May 27 17:47:02.106565 systemd-networkd[1475]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 27 17:47:02.106937 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. May 27 17:47:02.107018 systemd[1]: Reached target network.target - Network. May 27 17:47:02.107399 systemd-timesyncd[1450]: Network configuration changed, trying to establish connection. May 27 17:47:02.107958 systemd[1]: Reached target sockets.target - Socket Units. May 27 17:47:02.108579 systemd[1]: Reached target basic.target - Basic System. May 27 17:47:02.109488 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 27 17:47:02.109518 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 27 17:47:02.110805 systemd[1]: Starting containerd.service - containerd container runtime... May 27 17:47:02.112602 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 27 17:47:02.115999 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 27 17:47:02.124956 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 27 17:47:02.128776 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 27 17:47:02.131219 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 27 17:47:02.131729 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 27 17:47:02.136929 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 27 17:47:02.141218 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 27 17:47:02.143476 jq[1525]: false May 27 17:47:02.144102 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 27 17:47:02.146912 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. May 27 17:47:02.148367 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
May 27 17:47:02.151570 systemd-networkd[1475]: eth0: DHCPv4 address 157.180.123.17/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 27 17:47:02.154701 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 27 17:47:02.155318 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Refreshing passwd entry cache May 27 17:47:02.155486 oslogin_cache_refresh[1528]: Refreshing passwd entry cache May 27 17:47:02.158118 systemd-timesyncd[1450]: Network configuration changed, trying to establish connection. May 27 17:47:02.158515 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Failure getting users, quitting May 27 17:47:02.161415 oslogin_cache_refresh[1528]: Failure getting users, quitting May 27 17:47:02.161627 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 27 17:47:02.161627 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Refreshing group entry cache May 27 17:47:02.161434 oslogin_cache_refresh[1528]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 27 17:47:02.161464 oslogin_cache_refresh[1528]: Refreshing group entry cache May 27 17:47:02.162018 systemd[1]: Starting systemd-logind.service - User Login Management... May 27 17:47:02.162548 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Failure getting groups, quitting May 27 17:47:02.162548 google_oslogin_nss_cache[1528]: oslogin_cache_refresh[1528]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 27 17:47:02.162499 oslogin_cache_refresh[1528]: Failure getting groups, quitting May 27 17:47:02.162505 oslogin_cache_refresh[1528]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 27 17:47:02.165504 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 27 17:47:02.167608 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 27 17:47:02.169379 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 27 17:47:02.172600 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 27 17:47:02.178671 systemd[1]: Starting update-engine.service - Update Engine... May 27 17:47:02.184900 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 27 17:47:02.199748 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 27 17:47:02.201827 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 27 17:47:02.202029 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 27 17:47:02.202234 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 27 17:47:02.202357 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 27 17:47:02.203166 systemd[1]: motdgen.service: Deactivated successfully. May 27 17:47:02.203298 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
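[The eth0/eth1 addressing above comes from Flatcar's catch-all /usr/lib/systemd/network/zz-default.network, matched "based on potentially unpredictable interface name" as the log notes. A DHCP network unit of that general shape looks roughly like the sketch below; this is illustrative only and not the shipped file:

    # illustrative systemd-networkd unit, not the actual zz-default.network
    [Match]
    Name=e*          # assumed match pattern; the real file may match differently

    [Network]
    DHCP=yes]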
May 27 17:47:02.205492 extend-filesystems[1527]: Found loop4 May 27 17:47:02.206504 extend-filesystems[1527]: Found loop5 May 27 17:47:02.206504 extend-filesystems[1527]: Found loop6 May 27 17:47:02.206504 extend-filesystems[1527]: Found loop7 May 27 17:47:02.206504 extend-filesystems[1527]: Found sda May 27 17:47:02.206504 extend-filesystems[1527]: Found sda1 May 27 17:47:02.206504 extend-filesystems[1527]: Found sda2 May 27 17:47:02.206504 extend-filesystems[1527]: Found sda3 May 27 17:47:02.206504 extend-filesystems[1527]: Found usr May 27 17:47:02.206504 extend-filesystems[1527]: Found sda4 May 27 17:47:02.206504 extend-filesystems[1527]: Found sda6 May 27 17:47:02.206504 extend-filesystems[1527]: Found sda7 May 27 17:47:02.206504 extend-filesystems[1527]: Found sda9 May 27 17:47:02.206504 extend-filesystems[1527]: Checking size of /dev/sda9 May 27 17:47:02.225303 coreos-metadata[1521]: May 27 17:47:02.205 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 May 27 17:47:02.225303 coreos-metadata[1521]: May 27 17:47:02.212 INFO Fetch successful May 27 17:47:02.225303 coreos-metadata[1521]: May 27 17:47:02.214 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 May 27 17:47:02.225303 coreos-metadata[1521]: May 27 17:47:02.214 INFO Fetch successful May 27 17:47:02.207453 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 27 17:47:02.225545 jq[1539]: true May 27 17:47:02.210837 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 27 17:47:02.234547 update_engine[1538]: I20250527 17:47:02.233806 1538 main.cc:92] Flatcar Update Engine starting May 27 17:47:02.247822 jq[1555]: true May 27 17:47:02.254731 extend-filesystems[1527]: Resized partition /dev/sda9 May 27 17:47:02.269043 extend-filesystems[1575]: resize2fs 1.47.2 (1-Jan-2025) May 27 17:47:02.273580 tar[1550]: linux-amd64/LICENSE May 27 17:47:02.273580 tar[1550]: linux-amd64/helm May 27 17:47:02.273968 (ntainerd)[1565]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 27 17:47:02.281813 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks May 27 17:47:02.288146 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 27 17:47:02.300105 dbus-daemon[1522]: [system] SELinux support is enabled May 27 17:47:02.300751 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 27 17:47:02.303069 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt May 27 17:47:02.303242 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 27 17:47:02.304965 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 27 17:47:02.304992 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 27 17:47:02.305627 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 27 17:47:02.305642 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 27 17:47:02.324336 systemd[1]: Started update-engine.service - Update Engine. 
May 27 17:47:02.326844 update_engine[1538]: I20250527 17:47:02.324908 1538 update_check_scheduler.cc:74] Next update check in 6m51s May 27 17:47:02.333333 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 27 17:47:02.340309 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 27 17:47:02.346716 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 27 17:47:02.362035 systemd-logind[1534]: New seat seat0. May 27 17:47:02.363666 systemd[1]: Started systemd-logind.service - User Login Management. May 27 17:47:02.369837 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 27 17:47:02.370878 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 27 17:47:02.399883 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 27 17:47:02.401558 bash[1602]: Updated "/home/core/.ssh/authorized_keys" May 27 17:47:02.402067 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 27 17:47:02.407745 systemd[1]: Starting sshkeys.service... May 27 17:47:02.429700 kernel: EXT4-fs (sda9): resized filesystem to 9393147 May 27 17:47:02.439128 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 27 17:47:02.441590 extend-filesystems[1575]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required May 27 17:47:02.441590 extend-filesystems[1575]: old_desc_blocks = 1, new_desc_blocks = 5 May 27 17:47:02.441590 extend-filesystems[1575]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. May 27 17:47:02.456430 extend-filesystems[1527]: Resized filesystem in /dev/sda9 May 27 17:47:02.456430 extend-filesystems[1527]: Found sr0 May 27 17:47:02.446345 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 27 17:47:02.448954 systemd[1]: extend-filesystems.service: Deactivated successfully. May 27 17:47:02.449094 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 27 17:47:02.497887 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 May 27 17:47:02.513426 coreos-metadata[1619]: May 27 17:47:02.513 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 May 27 17:47:02.513843 coreos-metadata[1619]: May 27 17:47:02.513 INFO Fetch successful May 27 17:47:02.520370 unknown[1619]: wrote ssh authorized keys file for user: core May 27 17:47:02.534099 kernel: EDAC MC: Ver: 3.0.0 May 27 17:47:02.548548 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console May 27 17:47:02.567503 update-ssh-keys[1625]: Updated "/home/core/.ssh/authorized_keys" May 27 17:47:02.568008 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). 
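[The extend-filesystems/resize2fs messages above are an ordinary ext4 online grow of the root filesystem (1617920 → 9393147 4k blocks). Done by hand it would amount to roughly the following, assuming the underlying partition has already been enlarged; the device name simply mirrors this log:

    # ext4 supports growing while mounted, so this works against the live /
    resize2fs /dev/sda9]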
May 27 17:47:02.603696 containerd[1565]: time="2025-05-27T17:47:02Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 27 17:47:02.604032 containerd[1565]: time="2025-05-27T17:47:02.603992122Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 27 17:47:02.627833 containerd[1565]: time="2025-05-27T17:47:02.627801407Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.958µs" May 27 17:47:02.627833 containerd[1565]: time="2025-05-27T17:47:02.627829710Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 27 17:47:02.627888 containerd[1565]: time="2025-05-27T17:47:02.627846682Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 27 17:47:02.627972 containerd[1565]: time="2025-05-27T17:47:02.627950666Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 27 17:47:02.627997 containerd[1565]: time="2025-05-27T17:47:02.627972387Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 27 17:47:02.627997 containerd[1565]: time="2025-05-27T17:47:02.627991343Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:47:02.628060 containerd[1565]: time="2025-05-27T17:47:02.628039724Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 27 17:47:02.628060 containerd[1565]: time="2025-05-27T17:47:02.628056806Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:47:02.628236 containerd[1565]: time="2025-05-27T17:47:02.628212998Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 27 17:47:02.628236 containerd[1565]: time="2025-05-27T17:47:02.628233287Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:47:02.628269 containerd[1565]: time="2025-05-27T17:47:02.628243115Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 27 17:47:02.628269 containerd[1565]: time="2025-05-27T17:47:02.628250289Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 27 17:47:02.628324 containerd[1565]: time="2025-05-27T17:47:02.628304931Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 27 17:47:02.628476 containerd[1565]: time="2025-05-27T17:47:02.628455012Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 27 17:47:02.628496 containerd[1565]: time="2025-05-27T17:47:02.628485580Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 May 27 17:47:02.628512 containerd[1565]: time="2025-05-27T17:47:02.628499807Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 27 17:47:02.629669 containerd[1565]: time="2025-05-27T17:47:02.629637711Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 27 17:47:02.630782 containerd[1565]: time="2025-05-27T17:47:02.630741913Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 27 17:47:02.630844 containerd[1565]: time="2025-05-27T17:47:02.630819909Z" level=info msg="metadata content store policy set" policy=shared May 27 17:47:02.634369 containerd[1565]: time="2025-05-27T17:47:02.634344802Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 27 17:47:02.634410 containerd[1565]: time="2025-05-27T17:47:02.634391539Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 27 17:47:02.634428 containerd[1565]: time="2025-05-27T17:47:02.634412038Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 27 17:47:02.634428 containerd[1565]: time="2025-05-27T17:47:02.634423238Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 27 17:47:02.634477 containerd[1565]: time="2025-05-27T17:47:02.634433728Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 27 17:47:02.634495 containerd[1565]: time="2025-05-27T17:47:02.634477901Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 27 17:47:02.634510 containerd[1565]: time="2025-05-27T17:47:02.634498440Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 27 17:47:02.634551 containerd[1565]: time="2025-05-27T17:47:02.634511785Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 27 17:47:02.634551 containerd[1565]: time="2025-05-27T17:47:02.634539677Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 27 17:47:02.634551 containerd[1565]: time="2025-05-27T17:47:02.634549555Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 27 17:47:02.634595 containerd[1565]: time="2025-05-27T17:47:02.634557320Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 27 17:47:02.634595 containerd[1565]: time="2025-05-27T17:47:02.634571988Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 27 17:47:02.634670 containerd[1565]: time="2025-05-27T17:47:02.634643702Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 27 17:47:02.634721 containerd[1565]: time="2025-05-27T17:47:02.634679729Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 27 17:47:02.634721 containerd[1565]: time="2025-05-27T17:47:02.634695840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 27 17:47:02.634721 containerd[1565]: time="2025-05-27T17:47:02.634704667Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff 
type=io.containerd.grpc.v1 May 27 17:47:02.634721 containerd[1565]: time="2025-05-27T17:47:02.634713312Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 27 17:47:02.634721 containerd[1565]: time="2025-05-27T17:47:02.634721237Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 27 17:47:02.634794 containerd[1565]: time="2025-05-27T17:47:02.634730275Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 27 17:47:02.634794 containerd[1565]: time="2025-05-27T17:47:02.634738290Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 27 17:47:02.634794 containerd[1565]: time="2025-05-27T17:47:02.634746395Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 27 17:47:02.634794 containerd[1565]: time="2025-05-27T17:47:02.634754139Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 27 17:47:02.634794 containerd[1565]: time="2025-05-27T17:47:02.634762225Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 27 17:47:02.634864 containerd[1565]: time="2025-05-27T17:47:02.634806578Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 27 17:47:02.634864 containerd[1565]: time="2025-05-27T17:47:02.634817118Z" level=info msg="Start snapshots syncer" May 27 17:47:02.637544 containerd[1565]: time="2025-05-27T17:47:02.636151440Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 27 17:47:02.637544 containerd[1565]: time="2025-05-27T17:47:02.636410807Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 17:47:02.637651 containerd[1565]: time="2025-05-27T17:47:02.636450542Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 17:47:02.637749 sshd_keygen[1551]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 27 17:47:02.638307 containerd[1565]: time="2025-05-27T17:47:02.638277289Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 17:47:02.638553 kernel: Console: switching to colour dummy device 80x25 May 27 17:47:02.639875 containerd[1565]: time="2025-05-27T17:47:02.639845641Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 17:47:02.639909 containerd[1565]: time="2025-05-27T17:47:02.639875527Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 17:47:02.639909 containerd[1565]: time="2025-05-27T17:47:02.639886136Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 17:47:02.639909 containerd[1565]: time="2025-05-27T17:47:02.639896355Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 17:47:02.639909 containerd[1565]: time="2025-05-27T17:47:02.639905944Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 17:47:02.639976 containerd[1565]: time="2025-05-27T17:47:02.639915802Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 17:47:02.639976 containerd[1565]: time="2025-05-27T17:47:02.639925180Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 17:47:02.639976 containerd[1565]: 
time="2025-05-27T17:47:02.639942973Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 17:47:02.639976 containerd[1565]: time="2025-05-27T17:47:02.639951069Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 17:47:02.639976 containerd[1565]: time="2025-05-27T17:47:02.639959374Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 17:47:02.640046 containerd[1565]: time="2025-05-27T17:47:02.639995031Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:47:02.640046 containerd[1565]: time="2025-05-27T17:47:02.640007254Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 17:47:02.640046 containerd[1565]: time="2025-05-27T17:47:02.640013646Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:47:02.640046 containerd[1565]: time="2025-05-27T17:47:02.640020649Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 17:47:02.640046 containerd[1565]: time="2025-05-27T17:47:02.640026229Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 17:47:02.640046 containerd[1565]: time="2025-05-27T17:47:02.640032992Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 17:47:02.640128 containerd[1565]: time="2025-05-27T17:47:02.640078176Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 17:47:02.640128 containerd[1565]: time="2025-05-27T17:47:02.640095910Z" level=info msg="runtime interface created" May 27 17:47:02.640128 containerd[1565]: time="2025-05-27T17:47:02.640100408Z" level=info msg="created NRI interface" May 27 17:47:02.640128 containerd[1565]: time="2025-05-27T17:47:02.640107242Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 17:47:02.640128 containerd[1565]: time="2025-05-27T17:47:02.640116148Z" level=info msg="Connect containerd service" May 27 17:47:02.640196 containerd[1565]: time="2025-05-27T17:47:02.640134031Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 17:47:02.645313 containerd[1565]: time="2025-05-27T17:47:02.645279634Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 17:47:02.677551 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 27 17:47:02.677599 kernel: [drm] features: -context_init May 27 17:47:02.680071 locksmithd[1594]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 27 17:47:02.681566 kernel: [drm] number of scanouts: 1 May 27 17:47:02.685026 kernel: [drm] number of cap sets: 0 May 27 17:47:02.683092 systemd[1]: Finished sshkeys.service. May 27 17:47:02.688936 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 May 27 17:47:02.690909 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
May 27 17:47:02.692901 systemd[1]: Starting issuegen.service - Generate /run/issue... May 27 17:47:02.716358 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:47:02.725616 systemd[1]: issuegen.service: Deactivated successfully. May 27 17:47:02.726576 systemd[1]: Finished issuegen.service - Generate /run/issue. May 27 17:47:02.728002 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 27 17:47:02.753063 systemd-logind[1534]: Watching system buttons on /dev/input/event3 (Power Button) May 27 17:47:02.761485 systemd-logind[1534]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 27 17:47:02.780466 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 27 17:47:02.782213 systemd[1]: Started getty@tty1.service - Getty on tty1. May 27 17:47:02.784755 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 27 17:47:02.784900 systemd[1]: Reached target getty.target - Login Prompts. May 27 17:47:02.792870 containerd[1565]: time="2025-05-27T17:47:02.792845636Z" level=info msg="Start subscribing containerd event" May 27 17:47:02.796143 containerd[1565]: time="2025-05-27T17:47:02.796010483Z" level=info msg="Start recovering state" May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.795986518Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.796593867Z" level=info msg="Start event monitor" May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.797967073Z" level=info msg="Start cni network conf syncer for default" May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.797975109Z" level=info msg="Start streaming server" May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.797982592Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.797988694Z" level=info msg="runtime interface starting up..." May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.797993292Z" level=info msg="starting plugins..." May 27 17:47:02.798027 containerd[1565]: time="2025-05-27T17:47:02.798006538Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 17:47:02.798703 containerd[1565]: time="2025-05-27T17:47:02.798629657Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 17:47:02.798909 containerd[1565]: time="2025-05-27T17:47:02.798862574Z" level=info msg="containerd successfully booted in 0.195647s" May 27 17:47:02.798990 systemd[1]: Started containerd.service - containerd container runtime. May 27 17:47:02.861487 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 17:47:02.862040 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:47:02.863441 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 27 17:47:02.872320 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 17:47:02.912282 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 17:47:03.063815 tar[1550]: linux-amd64/README.md May 27 17:47:03.081535 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 17:47:03.385731 systemd-networkd[1475]: eth1: Gained IPv6LL May 27 17:47:03.386278 systemd-timesyncd[1450]: Network configuration changed, trying to establish connection. 
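[The earlier containerd warning about the version-2 configuration can be addressed as the message itself suggests, by persisting a migrated config. A sketch, with an assumed destination path: on this image containerd reads /usr/share/containerd/config.toml, which sits on the read-only /usr, so a writable override location would be needed in practice:

    # write a migrated copy of the current config (destination path is an assumption)
    containerd config migrate > /etc/containerd/config.toml]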
May 27 17:47:03.388090 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 17:47:03.388605 systemd[1]: Reached target network-online.target - Network is Online. May 27 17:47:03.390429 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:47:03.391749 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 17:47:03.422878 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 17:47:03.961695 systemd-networkd[1475]: eth0: Gained IPv6LL May 27 17:47:03.963402 systemd-timesyncd[1450]: Network configuration changed, trying to establish connection. May 27 17:47:04.153182 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:47:04.153588 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 17:47:04.154434 systemd[1]: Startup finished in 2.956s (kernel) + 7.602s (initrd) + 3.964s (userspace) = 14.522s. May 27 17:47:04.158057 (kubelet)[1703]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:47:04.663635 kubelet[1703]: E0527 17:47:04.663582 1703 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:47:04.665802 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:47:04.665912 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:47:04.666165 systemd[1]: kubelet.service: Consumed 810ms CPU time, 268.6M memory peak. May 27 17:47:14.734246 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 27 17:47:14.736602 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:47:14.834314 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:47:14.841749 (kubelet)[1722]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:47:14.891170 kubelet[1722]: E0527 17:47:14.891115 1722 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:47:14.894097 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:47:14.894285 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:47:14.894763 systemd[1]: kubelet.service: Consumed 136ms CPU time, 110.8M memory peak. May 27 17:47:24.984309 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 27 17:47:24.986071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:47:25.087241 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 27 17:47:25.098741 (kubelet)[1737]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:47:25.127925 kubelet[1737]: E0527 17:47:25.127886 1737 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:47:25.130060 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:47:25.130226 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:47:25.130493 systemd[1]: kubelet.service: Consumed 106ms CPU time, 110.4M memory peak. May 27 17:47:34.960796 systemd-timesyncd[1450]: Contacted time server 129.70.132.33:123 (2.flatcar.pool.ntp.org). May 27 17:47:34.960852 systemd-timesyncd[1450]: Initial clock synchronization to Tue 2025-05-27 17:47:34.960670 UTC. May 27 17:47:34.960943 systemd-resolved[1424]: Clock change detected. Flushing caches. May 27 17:47:35.929931 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 27 17:47:35.931594 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:47:36.053883 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:47:36.056345 (kubelet)[1753]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:47:36.085901 kubelet[1753]: E0527 17:47:36.085851 1753 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:47:36.088042 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:47:36.088150 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:47:36.088412 systemd[1]: kubelet.service: Consumed 115ms CPU time, 110.1M memory peak. May 27 17:47:46.180577 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 27 17:47:46.183510 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:47:46.306861 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:47:46.309256 (kubelet)[1768]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:47:46.333501 kubelet[1768]: E0527 17:47:46.333451 1768 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:47:46.335277 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:47:46.335394 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:47:46.335686 systemd[1]: kubelet.service: Consumed 114ms CPU time, 108.2M memory peak. May 27 17:47:48.366138 update_engine[1538]: I20250527 17:47:48.366024 1538 update_attempter.cc:509] Updating boot flags... 
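[The kubelet failures that repeat throughout this log all trace back to the missing /var/lib/kubelet/config.yaml; that file is normally written later, for example by kubeadm when the node joins a cluster, so the crash loop on a not-yet-provisioned node is expected. For reference, a minimal KubeletConfiguration of the kind the kubelet parses looks roughly like this; the values are illustrative and not taken from this host:

    # /var/lib/kubelet/config.yaml -- minimal sketch, not this node's real config
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    authentication:
      anonymous:
        enabled: false]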
May 27 17:47:56.430665 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. May 27 17:47:56.433919 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:47:56.584734 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:47:56.587066 (kubelet)[1807]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:47:56.616815 kubelet[1807]: E0527 17:47:56.616761 1807 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:47:56.618942 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:47:56.619145 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:47:56.619558 systemd[1]: kubelet.service: Consumed 138ms CPU time, 108.6M memory peak. May 27 17:48:06.679958 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. May 27 17:48:06.681337 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:48:06.787782 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:48:06.790109 (kubelet)[1822]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:48:06.817298 kubelet[1822]: E0527 17:48:06.817208 1822 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:48:06.819934 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:48:06.820050 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:48:06.820291 systemd[1]: kubelet.service: Consumed 106ms CPU time, 108.1M memory peak. May 27 17:48:16.930014 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. May 27 17:48:16.931527 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:48:17.050065 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:48:17.058415 (kubelet)[1837]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:48:17.087658 kubelet[1837]: E0527 17:48:17.087603 1837 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:48:17.089652 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:48:17.089803 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:48:17.090078 systemd[1]: kubelet.service: Consumed 109ms CPU time, 110.5M memory peak. May 27 17:48:27.180429 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. 
May 27 17:48:27.182657 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:48:27.327789 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:48:27.330032 (kubelet)[1852]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:48:27.355244 kubelet[1852]: E0527 17:48:27.355201 1852 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:48:27.357036 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:48:27.357266 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:48:27.357518 systemd[1]: kubelet.service: Consumed 127ms CPU time, 109.9M memory peak. May 27 17:48:37.430564 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. May 27 17:48:37.433115 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:48:37.578977 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:48:37.591437 (kubelet)[1868]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:48:37.620234 kubelet[1868]: E0527 17:48:37.620159 1868 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:48:37.622310 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:48:37.622507 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:48:37.622737 systemd[1]: kubelet.service: Consumed 136ms CPU time, 108.6M memory peak. May 27 17:48:44.512549 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 27 17:48:44.514120 systemd[1]: Started sshd@0-157.180.123.17:22-139.178.89.65:40998.service - OpenSSH per-connection server daemon (139.178.89.65:40998). May 27 17:48:45.497594 sshd[1876]: Accepted publickey for core from 139.178.89.65 port 40998 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:48:45.499390 sshd-session[1876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:48:45.504737 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 17:48:45.505673 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 17:48:45.512447 systemd-logind[1534]: New session 1 of user core. May 27 17:48:45.522410 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 17:48:45.524924 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 17:48:45.536535 (systemd)[1880]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 17:48:45.538721 systemd-logind[1534]: New session c1 of user core. May 27 17:48:45.708836 systemd[1880]: Queued start job for default target default.target. May 27 17:48:45.715467 systemd[1880]: Created slice app.slice - User Application Slice. 
May 27 17:48:45.715501 systemd[1880]: Reached target paths.target - Paths. May 27 17:48:45.715548 systemd[1880]: Reached target timers.target - Timers. May 27 17:48:45.717125 systemd[1880]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 17:48:45.730533 systemd[1880]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 17:48:45.730720 systemd[1880]: Reached target sockets.target - Sockets. May 27 17:48:45.730794 systemd[1880]: Reached target basic.target - Basic System. May 27 17:48:45.730846 systemd[1880]: Reached target default.target - Main User Target. May 27 17:48:45.730882 systemd[1880]: Startup finished in 186ms. May 27 17:48:45.731088 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 17:48:45.745437 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 17:48:46.430962 systemd[1]: Started sshd@1-157.180.123.17:22-139.178.89.65:41012.service - OpenSSH per-connection server daemon (139.178.89.65:41012). May 27 17:48:47.410735 sshd[1891]: Accepted publickey for core from 139.178.89.65 port 41012 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:48:47.412034 sshd-session[1891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:48:47.416726 systemd-logind[1534]: New session 2 of user core. May 27 17:48:47.430451 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 17:48:47.679833 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. May 27 17:48:47.681344 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:48:47.789744 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:48:47.792176 (kubelet)[1902]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:48:47.818414 kubelet[1902]: E0527 17:48:47.818361 1902 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:48:47.820522 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:48:47.820635 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:48:47.820866 systemd[1]: kubelet.service: Consumed 103ms CPU time, 108.3M memory peak. May 27 17:48:48.083585 sshd[1893]: Connection closed by 139.178.89.65 port 41012 May 27 17:48:48.084105 sshd-session[1891]: pam_unix(sshd:session): session closed for user core May 27 17:48:48.086906 systemd[1]: sshd@1-157.180.123.17:22-139.178.89.65:41012.service: Deactivated successfully. May 27 17:48:48.088401 systemd[1]: session-2.scope: Deactivated successfully. May 27 17:48:48.089426 systemd-logind[1534]: Session 2 logged out. Waiting for processes to exit. May 27 17:48:48.090569 systemd-logind[1534]: Removed session 2. May 27 17:48:48.255726 systemd[1]: Started sshd@2-157.180.123.17:22-139.178.89.65:41022.service - OpenSSH per-connection server daemon (139.178.89.65:41022). 
May 27 17:48:49.234794 sshd[1914]: Accepted publickey for core from 139.178.89.65 port 41022 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:48:49.236048 sshd-session[1914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:48:49.240274 systemd-logind[1534]: New session 3 of user core. May 27 17:48:49.255361 systemd[1]: Started session-3.scope - Session 3 of User core. May 27 17:48:49.905874 sshd[1916]: Connection closed by 139.178.89.65 port 41022 May 27 17:48:49.906422 sshd-session[1914]: pam_unix(sshd:session): session closed for user core May 27 17:48:49.909085 systemd[1]: sshd@2-157.180.123.17:22-139.178.89.65:41022.service: Deactivated successfully. May 27 17:48:49.911025 systemd-logind[1534]: Session 3 logged out. Waiting for processes to exit. May 27 17:48:49.911109 systemd[1]: session-3.scope: Deactivated successfully. May 27 17:48:49.912616 systemd-logind[1534]: Removed session 3. May 27 17:48:50.072085 systemd[1]: Started sshd@3-157.180.123.17:22-139.178.89.65:41028.service - OpenSSH per-connection server daemon (139.178.89.65:41028). May 27 17:48:51.060735 sshd[1922]: Accepted publickey for core from 139.178.89.65 port 41028 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:48:51.062110 sshd-session[1922]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:48:51.066184 systemd-logind[1534]: New session 4 of user core. May 27 17:48:51.075412 systemd[1]: Started session-4.scope - Session 4 of User core. May 27 17:48:51.731093 sshd[1924]: Connection closed by 139.178.89.65 port 41028 May 27 17:48:51.731680 sshd-session[1922]: pam_unix(sshd:session): session closed for user core May 27 17:48:51.734710 systemd-logind[1534]: Session 4 logged out. Waiting for processes to exit. May 27 17:48:51.734796 systemd[1]: sshd@3-157.180.123.17:22-139.178.89.65:41028.service: Deactivated successfully. May 27 17:48:51.736039 systemd[1]: session-4.scope: Deactivated successfully. May 27 17:48:51.737582 systemd-logind[1534]: Removed session 4. May 27 17:48:51.897096 systemd[1]: Started sshd@4-157.180.123.17:22-139.178.89.65:41040.service - OpenSSH per-connection server daemon (139.178.89.65:41040). May 27 17:48:52.871029 sshd[1930]: Accepted publickey for core from 139.178.89.65 port 41040 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:48:52.872508 sshd-session[1930]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:48:52.878102 systemd-logind[1534]: New session 5 of user core. May 27 17:48:52.883423 systemd[1]: Started session-5.scope - Session 5 of User core. May 27 17:48:53.391794 sudo[1933]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 27 17:48:53.392026 sudo[1933]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:48:53.406521 sudo[1933]: pam_unix(sudo:session): session closed for user root May 27 17:48:53.563396 sshd[1932]: Connection closed by 139.178.89.65 port 41040 May 27 17:48:53.564149 sshd-session[1930]: pam_unix(sshd:session): session closed for user core May 27 17:48:53.567834 systemd-logind[1534]: Session 5 logged out. Waiting for processes to exit. May 27 17:48:53.567956 systemd[1]: sshd@4-157.180.123.17:22-139.178.89.65:41040.service: Deactivated successfully. May 27 17:48:53.569199 systemd[1]: session-5.scope: Deactivated successfully. May 27 17:48:53.570685 systemd-logind[1534]: Removed session 5. 
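[Editor's note, not part of the captured journal] The sshd and systemd-logind entries here record a series of short sessions for the core user, each opened with the same RSA public key and torn down a second or two later. Purely as an illustration (the journal.txt file name is an assumption), the session lifecycle can be recovered from the logind messages like this:

import re

NEW_RE = re.compile(r"New session (\S+) of user (\S+)\.")
REMOVED_RE = re.compile(r"Removed session (\S+)\.")

def session_events(lines):
    # Yield ("opened"/"closed", session id) from systemd-logind journal lines.
    for line in lines:
        m = NEW_RE.search(line)
        if m:
            yield "opened", m.group(1)
            continue
        m = REMOVED_RE.search(line)
        if m:
            yield "closed", m.group(1)

if __name__ == "__main__":
    with open("journal.txt", encoding="utf-8", errors="replace") as fh:
        for event, sid in session_events(fh):
            print(f"session {sid}: {event}")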
May 27 17:48:53.731469 systemd[1]: Started sshd@5-157.180.123.17:22-139.178.89.65:40796.service - OpenSSH per-connection server daemon (139.178.89.65:40796). May 27 17:48:54.707636 sshd[1939]: Accepted publickey for core from 139.178.89.65 port 40796 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:48:54.708781 sshd-session[1939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:48:54.713343 systemd-logind[1534]: New session 6 of user core. May 27 17:48:54.718374 systemd[1]: Started session-6.scope - Session 6 of User core. May 27 17:48:55.223981 sudo[1943]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 27 17:48:55.224367 sudo[1943]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:48:55.228136 sudo[1943]: pam_unix(sudo:session): session closed for user root May 27 17:48:55.232387 sudo[1942]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 27 17:48:55.232606 sudo[1942]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:48:55.240752 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 27 17:48:55.270926 augenrules[1965]: No rules May 27 17:48:55.271799 systemd[1]: audit-rules.service: Deactivated successfully. May 27 17:48:55.271984 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 27 17:48:55.273000 sudo[1942]: pam_unix(sudo:session): session closed for user root May 27 17:48:55.430365 sshd[1941]: Connection closed by 139.178.89.65 port 40796 May 27 17:48:55.430831 sshd-session[1939]: pam_unix(sshd:session): session closed for user core May 27 17:48:55.433375 systemd[1]: sshd@5-157.180.123.17:22-139.178.89.65:40796.service: Deactivated successfully. May 27 17:48:55.435121 systemd-logind[1534]: Session 6 logged out. Waiting for processes to exit. May 27 17:48:55.435213 systemd[1]: session-6.scope: Deactivated successfully. May 27 17:48:55.436619 systemd-logind[1534]: Removed session 6. May 27 17:48:55.601105 systemd[1]: Started sshd@6-157.180.123.17:22-139.178.89.65:40812.service - OpenSSH per-connection server daemon (139.178.89.65:40812). May 27 17:48:56.585410 sshd[1974]: Accepted publickey for core from 139.178.89.65 port 40812 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:48:56.586573 sshd-session[1974]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:48:56.590839 systemd-logind[1534]: New session 7 of user core. May 27 17:48:56.603407 systemd[1]: Started session-7.scope - Session 7 of User core. May 27 17:48:57.106117 sudo[1977]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 27 17:48:57.106579 sudo[1977]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 27 17:48:57.440541 systemd[1]: Starting docker.service - Docker Application Container Engine... 
May 27 17:48:57.451486 (dockerd)[1995]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 27 17:48:57.607496 dockerd[1995]: time="2025-05-27T17:48:57.607432874Z" level=info msg="Starting up" May 27 17:48:57.608518 dockerd[1995]: time="2025-05-27T17:48:57.608489668Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 27 17:48:57.661238 dockerd[1995]: time="2025-05-27T17:48:57.661189410Z" level=info msg="Loading containers: start." May 27 17:48:57.670247 kernel: Initializing XFRM netlink socket May 27 17:48:57.837116 systemd-networkd[1475]: docker0: Link UP May 27 17:48:57.841486 dockerd[1995]: time="2025-05-27T17:48:57.841453595Z" level=info msg="Loading containers: done." May 27 17:48:57.851442 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck800640183-merged.mount: Deactivated successfully. May 27 17:48:57.852655 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. May 27 17:48:57.853239 dockerd[1995]: time="2025-05-27T17:48:57.853094229Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 27 17:48:57.853239 dockerd[1995]: time="2025-05-27T17:48:57.853147810Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 27 17:48:57.853339 dockerd[1995]: time="2025-05-27T17:48:57.853325574Z" level=info msg="Initializing buildkit" May 27 17:48:57.854022 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:48:57.871648 dockerd[1995]: time="2025-05-27T17:48:57.871579569Z" level=info msg="Completed buildkit initialization" May 27 17:48:57.878688 dockerd[1995]: time="2025-05-27T17:48:57.878662562Z" level=info msg="Daemon has completed initialization" May 27 17:48:57.878828 dockerd[1995]: time="2025-05-27T17:48:57.878767438Z" level=info msg="API listen on /run/docker.sock" May 27 17:48:57.880189 systemd[1]: Started docker.service - Docker Application Container Engine. May 27 17:48:57.951510 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:48:57.958395 (kubelet)[2202]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:48:57.983794 kubelet[2202]: E0527 17:48:57.983701 2202 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:48:57.985588 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:48:57.985703 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:48:57.986103 systemd[1]: kubelet.service: Consumed 94ms CPU time, 110.2M memory peak. May 27 17:48:58.624969 containerd[1565]: time="2025-05-27T17:48:58.624893237Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\"" May 27 17:48:59.213461 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1887363527.mount: Deactivated successfully. 
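[Editor's note, not part of the captured journal] dockerd and containerd write structured key=value records (time=... level=info msg=...) inside the journal lines above. A rough sketch of pulling those fields out follows; it only covers the quoted and bare value forms visible in this log and is not a complete logfmt parser.

import re

# key="quoted value" or key=bareword, as seen in the dockerd/containerd lines.
PAIR_RE = re.compile(r'([\w.-]+)=(?:"((?:[^"\\]|\\.)*)"|(\S+))')

def parse_record(line):
    fields = {}
    for key, quoted, bare in PAIR_RE.findall(line):
        fields[key] = quoted if quoted else bare
    return fields

sample = ('time="2025-05-27T17:48:57.853147810Z" level=info '
          'msg="Docker daemon" storage-driver=overlay2 version=28.0.1')
print(parse_record(sample))
# {'time': '2025-05-27T17:48:57.853147810Z', 'level': 'info',
#  'msg': 'Docker daemon', 'storage-driver': 'overlay2', 'version': '28.0.1'}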
May 27 17:49:00.085610 containerd[1565]: time="2025-05-27T17:49:00.085558037Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:00.086438 containerd[1565]: time="2025-05-27T17:49:00.086290703Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.1: active requests=0, bytes read=30075497" May 27 17:49:00.087119 containerd[1565]: time="2025-05-27T17:49:00.087092829Z" level=info msg="ImageCreate event name:\"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:00.089019 containerd[1565]: time="2025-05-27T17:49:00.088992517Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:00.089741 containerd[1565]: time="2025-05-27T17:49:00.089711125Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.1\" with image id \"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\", size \"30072203\" in 1.464748018s" May 27 17:49:00.089818 containerd[1565]: time="2025-05-27T17:49:00.089804101Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\" returns image reference \"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\"" May 27 17:49:00.090430 containerd[1565]: time="2025-05-27T17:49:00.090406332Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\"" May 27 17:49:01.131528 containerd[1565]: time="2025-05-27T17:49:01.131481330Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:01.132484 containerd[1565]: time="2025-05-27T17:49:01.132292223Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.1: active requests=0, bytes read=26011412" May 27 17:49:01.133191 containerd[1565]: time="2025-05-27T17:49:01.133169681Z" level=info msg="ImageCreate event name:\"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:01.135137 containerd[1565]: time="2025-05-27T17:49:01.135106477Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:01.135883 containerd[1565]: time="2025-05-27T17:49:01.135858881Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.1\" with image id \"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\", size \"27638910\" in 1.045427582s" May 27 17:49:01.135925 containerd[1565]: time="2025-05-27T17:49:01.135886382Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\" returns image reference \"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\"" May 27 17:49:01.136569 
containerd[1565]: time="2025-05-27T17:49:01.136544688Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\"" May 27 17:49:02.220587 containerd[1565]: time="2025-05-27T17:49:02.220534415Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:02.221552 containerd[1565]: time="2025-05-27T17:49:02.221368231Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.1: active requests=0, bytes read=20148982" May 27 17:49:02.222291 containerd[1565]: time="2025-05-27T17:49:02.222269543Z" level=info msg="ImageCreate event name:\"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:02.224370 containerd[1565]: time="2025-05-27T17:49:02.224350490Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:02.224961 containerd[1565]: time="2025-05-27T17:49:02.224941770Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.1\" with image id \"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\", size \"21776498\" in 1.088315981s" May 27 17:49:02.225034 containerd[1565]: time="2025-05-27T17:49:02.225016822Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\" returns image reference \"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\"" May 27 17:49:02.225595 containerd[1565]: time="2025-05-27T17:49:02.225463480Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\"" May 27 17:49:03.117984 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1928478323.mount: Deactivated successfully. 
May 27 17:49:03.380302 containerd[1565]: time="2025-05-27T17:49:03.380184837Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:03.381070 containerd[1565]: time="2025-05-27T17:49:03.381048097Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.1: active requests=0, bytes read=31889103" May 27 17:49:03.382025 containerd[1565]: time="2025-05-27T17:49:03.381976892Z" level=info msg="ImageCreate event name:\"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:03.383406 containerd[1565]: time="2025-05-27T17:49:03.383382861Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:03.383826 containerd[1565]: time="2025-05-27T17:49:03.383802079Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.1\" with image id \"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\", repo tag \"registry.k8s.io/kube-proxy:v1.33.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\", size \"31888094\" in 1.15831814s" May 27 17:49:03.383895 containerd[1565]: time="2025-05-27T17:49:03.383882510Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\" returns image reference \"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\"" May 27 17:49:03.384276 containerd[1565]: time="2025-05-27T17:49:03.384259136Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" May 27 17:49:03.877128 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount73295677.mount: Deactivated successfully. 
May 27 17:49:04.573574 containerd[1565]: time="2025-05-27T17:49:04.573522689Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:04.574547 containerd[1565]: time="2025-05-27T17:49:04.574522547Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942332" May 27 17:49:04.575513 containerd[1565]: time="2025-05-27T17:49:04.575479282Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:04.577714 containerd[1565]: time="2025-05-27T17:49:04.577680885Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:04.578377 containerd[1565]: time="2025-05-27T17:49:04.578357626Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.194077872s" May 27 17:49:04.578502 containerd[1565]: time="2025-05-27T17:49:04.578431675Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\"" May 27 17:49:04.578860 containerd[1565]: time="2025-05-27T17:49:04.578825925Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 27 17:49:05.023105 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount964739197.mount: Deactivated successfully. 
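[Editor's note, not part of the captured journal] Each "Pulled image ... in ..." entry above pairs a byte count with a wall-clock duration, so the effective pull throughput can be read straight off the log: kube-apiserver, for example, at 30,072,203 bytes in 1.464748018 s is roughly 20.5 MB/s. A small sketch with the figures copied from the entries above:

# Sizes (bytes) and durations (s) copied from the "Pulled image ... in ..." lines.
pulls = {
    "kube-apiserver:v1.33.1":          (30072203, 1.464748018),
    "kube-controller-manager:v1.33.1": (27638910, 1.045427582),
    "kube-scheduler:v1.33.1":          (21776498, 1.088315981),
    "kube-proxy:v1.33.1":              (31888094, 1.15831814),
    "coredns/coredns:v1.12.0":         (20939036, 1.194077872),
}

for image, (size_bytes, seconds) in pulls.items():
    print(f"{image:35s} {size_bytes / seconds / 1e6:5.1f} MB/s")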
May 27 17:49:05.028303 containerd[1565]: time="2025-05-27T17:49:05.028213735Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:49:05.029144 containerd[1565]: time="2025-05-27T17:49:05.029105490Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" May 27 17:49:05.029946 containerd[1565]: time="2025-05-27T17:49:05.029885554Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:49:05.032116 containerd[1565]: time="2025-05-27T17:49:05.032064945Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 27 17:49:05.033024 containerd[1565]: time="2025-05-27T17:49:05.032872762Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 453.890344ms" May 27 17:49:05.033024 containerd[1565]: time="2025-05-27T17:49:05.032908889Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 27 17:49:05.033978 containerd[1565]: time="2025-05-27T17:49:05.033910069Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" May 27 17:49:06.655891 containerd[1565]: time="2025-05-27T17:49:06.655818316Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:06.656874 containerd[1565]: time="2025-05-27T17:49:06.656841786Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58142773" May 27 17:49:06.657801 containerd[1565]: time="2025-05-27T17:49:06.657765030Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:06.659698 containerd[1565]: time="2025-05-27T17:49:06.659665107Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:06.660739 containerd[1565]: time="2025-05-27T17:49:06.660562962Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 1.626622296s" May 27 17:49:06.660739 containerd[1565]: time="2025-05-27T17:49:06.660611844Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\"" May 27 17:49:08.179847 systemd[1]: kubelet.service: Scheduled restart job, 
restart counter is at 12. May 27 17:49:08.183341 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:49:08.285324 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:49:08.286447 (kubelet)[2380]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 17:49:08.317729 kubelet[2380]: E0527 17:49:08.317680 2380 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 17:49:08.319543 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 17:49:08.319660 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 17:49:08.319887 systemd[1]: kubelet.service: Consumed 98ms CPU time, 107.9M memory peak. May 27 17:49:08.803489 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:49:08.803621 systemd[1]: kubelet.service: Consumed 98ms CPU time, 107.9M memory peak. May 27 17:49:08.805272 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:49:08.830022 systemd[1]: Reload requested from client PID 2394 ('systemctl') (unit session-7.scope)... May 27 17:49:08.830148 systemd[1]: Reloading... May 27 17:49:08.917648 zram_generator::config[2441]: No configuration found. May 27 17:49:08.983992 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:49:09.081436 systemd[1]: Reloading finished in 250 ms. May 27 17:49:09.131425 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:49:09.136694 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:49:09.137668 systemd[1]: kubelet.service: Deactivated successfully. May 27 17:49:09.138019 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:49:09.138147 systemd[1]: kubelet.service: Consumed 74ms CPU time, 98.3M memory peak. May 27 17:49:09.139851 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:49:09.236368 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:49:09.244590 (kubelet)[2494]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:49:09.288777 kubelet[2494]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:49:09.288777 kubelet[2494]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 17:49:09.288777 kubelet[2494]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
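[Editor's note, not part of the captured journal] From here on most kubelet output uses the klog header: a severity letter and MMDD date (I0527, E0527), the time, the emitting PID, the source file and line, then the message. A small sketch of splitting that header off a line shaped like the certificate_manager error below (the sample message is abbreviated from that entry):

import re

# klog header: <severity><MMDD> <time> <pid> <file>:<line>] <message>
KLOG_RE = re.compile(
    r"^(?P<severity>[IWEF])(?P<date>\d{4}) "
    r"(?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+"
    r"(?P<pid>\d+) (?P<source>[\w./-]+:\d+)\] (?P<message>.*)$"
)

line = ('E0527 17:49:09.919021 2494 certificate_manager.go:596] '
        '"Failed while requesting a signed certificate from the control plane"')
m = KLOG_RE.match(line)
if m:
    print(m.group("severity"), m.group("source"))
    # E certificate_manager.go:596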
May 27 17:49:09.289283 kubelet[2494]: I0527 17:49:09.288857 2494 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:49:09.880271 kubelet[2494]: I0527 17:49:09.879646 2494 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" May 27 17:49:09.880271 kubelet[2494]: I0527 17:49:09.879687 2494 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:49:09.880271 kubelet[2494]: I0527 17:49:09.880097 2494 server.go:956] "Client rotation is on, will bootstrap in background" May 27 17:49:09.917617 kubelet[2494]: I0527 17:49:09.917562 2494 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:49:09.919080 kubelet[2494]: E0527 17:49:09.919021 2494 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://157.180.123.17:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.123.17:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" May 27 17:49:09.943278 kubelet[2494]: I0527 17:49:09.943228 2494 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:49:09.948992 kubelet[2494]: I0527 17:49:09.948924 2494 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 27 17:49:09.955208 kubelet[2494]: I0527 17:49:09.955154 2494 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:49:09.957924 kubelet[2494]: I0527 17:49:09.955183 2494 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-0-0-a-c8f0a3e630","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:49:09.957924 kubelet[2494]: I0527 17:49:09.957901 2494 topology_manager.go:138] "Creating topology manager with none policy" May 27 17:49:09.957924 kubelet[2494]: I0527 
17:49:09.957926 2494 container_manager_linux.go:303] "Creating device plugin manager" May 27 17:49:09.958179 kubelet[2494]: I0527 17:49:09.958020 2494 state_mem.go:36] "Initialized new in-memory state store" May 27 17:49:09.961084 kubelet[2494]: I0527 17:49:09.960780 2494 kubelet.go:480] "Attempting to sync node with API server" May 27 17:49:09.961084 kubelet[2494]: I0527 17:49:09.960845 2494 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:49:09.961084 kubelet[2494]: I0527 17:49:09.960882 2494 kubelet.go:386] "Adding apiserver pod source" May 27 17:49:09.961084 kubelet[2494]: I0527 17:49:09.960897 2494 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:49:09.972970 kubelet[2494]: E0527 17:49:09.972877 2494 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://157.180.123.17:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-0-0-a-c8f0a3e630&limit=500&resourceVersion=0\": dial tcp 157.180.123.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" May 27 17:49:09.977032 kubelet[2494]: I0527 17:49:09.976578 2494 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:49:09.977032 kubelet[2494]: I0527 17:49:09.976968 2494 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" May 27 17:49:09.977690 kubelet[2494]: W0527 17:49:09.977629 2494 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 27 17:49:09.981278 kubelet[2494]: E0527 17:49:09.981190 2494 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://157.180.123.17:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.123.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" May 27 17:49:09.984263 kubelet[2494]: I0527 17:49:09.982811 2494 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 17:49:09.984263 kubelet[2494]: I0527 17:49:09.982872 2494 server.go:1289] "Started kubelet" May 27 17:49:09.994492 kubelet[2494]: I0527 17:49:09.994448 2494 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:49:09.996278 kubelet[2494]: E0527 17:49:09.991303 2494 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.123.17:6443/api/v1/namespaces/default/events\": dial tcp 157.180.123.17:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344-0-0-a-c8f0a3e630.18437391b45562d1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344-0-0-a-c8f0a3e630,UID:ci-4344-0-0-a-c8f0a3e630,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344-0-0-a-c8f0a3e630,},FirstTimestamp:2025-05-27 17:49:09.982831313 +0000 UTC m=+0.734753237,LastTimestamp:2025-05-27 17:49:09.982831313 +0000 UTC m=+0.734753237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-0-0-a-c8f0a3e630,}" May 27 17:49:10.000652 kubelet[2494]: I0527 17:49:10.000608 2494 server.go:180] "Starting to listen" 
address="0.0.0.0" port=10250 May 27 17:49:10.002199 kubelet[2494]: I0527 17:49:10.002173 2494 server.go:317] "Adding debug handlers to kubelet server" May 27 17:49:10.004942 kubelet[2494]: I0527 17:49:10.004905 2494 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 17:49:10.005258 kubelet[2494]: E0527 17:49:10.005103 2494 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" May 27 17:49:10.007093 kubelet[2494]: I0527 17:49:10.007061 2494 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 17:49:10.007164 kubelet[2494]: I0527 17:49:10.007127 2494 reconciler.go:26] "Reconciler: start to sync state" May 27 17:49:10.008036 kubelet[2494]: E0527 17:49:10.007986 2494 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://157.180.123.17:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.123.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" May 27 17:49:10.008122 kubelet[2494]: E0527 17:49:10.008062 2494 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.123.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-0-0-a-c8f0a3e630?timeout=10s\": dial tcp 157.180.123.17:6443: connect: connection refused" interval="200ms" May 27 17:49:10.009015 kubelet[2494]: I0527 17:49:10.008909 2494 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" May 27 17:49:10.010791 kubelet[2494]: I0527 17:49:10.010153 2494 factory.go:223] Registration of the systemd container factory successfully May 27 17:49:10.010791 kubelet[2494]: I0527 17:49:10.010328 2494 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:49:10.011011 kubelet[2494]: I0527 17:49:10.010974 2494 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:49:10.011434 kubelet[2494]: I0527 17:49:10.011422 2494 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:49:10.012858 kubelet[2494]: I0527 17:49:10.012846 2494 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:49:10.014331 kubelet[2494]: E0527 17:49:10.014301 2494 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 17:49:10.015455 kubelet[2494]: I0527 17:49:10.015121 2494 factory.go:223] Registration of the containerd container factory successfully May 27 17:49:10.029040 kubelet[2494]: I0527 17:49:10.028987 2494 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 17:49:10.029040 kubelet[2494]: I0527 17:49:10.029032 2494 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 17:49:10.029040 kubelet[2494]: I0527 17:49:10.029044 2494 state_mem.go:36] "Initialized new in-memory state store" May 27 17:49:10.030603 kubelet[2494]: I0527 17:49:10.030577 2494 policy_none.go:49] "None policy: Start" May 27 17:49:10.030603 kubelet[2494]: I0527 17:49:10.030599 2494 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 17:49:10.031324 kubelet[2494]: I0527 17:49:10.030715 2494 state_mem.go:35] "Initializing new in-memory state store" May 27 17:49:10.033309 kubelet[2494]: I0527 17:49:10.032994 2494 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 27 17:49:10.033309 kubelet[2494]: I0527 17:49:10.033149 2494 status_manager.go:230] "Starting to sync pod status with apiserver" May 27 17:49:10.033309 kubelet[2494]: I0527 17:49:10.033170 2494 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 27 17:49:10.033309 kubelet[2494]: I0527 17:49:10.033175 2494 kubelet.go:2436] "Starting kubelet main sync loop" May 27 17:49:10.033309 kubelet[2494]: E0527 17:49:10.033201 2494 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:49:10.034926 kubelet[2494]: E0527 17:49:10.034899 2494 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://157.180.123.17:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.123.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" May 27 17:49:10.038376 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 27 17:49:10.057547 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 17:49:10.060540 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 17:49:10.070240 kubelet[2494]: E0527 17:49:10.069836 2494 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" May 27 17:49:10.070398 kubelet[2494]: I0527 17:49:10.070375 2494 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:49:10.070447 kubelet[2494]: I0527 17:49:10.070390 2494 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:49:10.071698 kubelet[2494]: I0527 17:49:10.071624 2494 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:49:10.073249 kubelet[2494]: E0527 17:49:10.073196 2494 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 17:49:10.073249 kubelet[2494]: E0527 17:49:10.073244 2494 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344-0-0-a-c8f0a3e630\" not found" May 27 17:49:10.147860 systemd[1]: Created slice kubepods-burstable-pod98f81ad9995f88103edca6af43f4760f.slice - libcontainer container kubepods-burstable-pod98f81ad9995f88103edca6af43f4760f.slice. May 27 17:49:10.152203 kubelet[2494]: W0527 17:49:10.152098 2494 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f81ad9995f88103edca6af43f4760f.slice/cpuset.cpus.effective": read /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f81ad9995f88103edca6af43f4760f.slice/cpuset.cpus.effective: no such device May 27 17:49:10.156959 kubelet[2494]: E0527 17:49:10.156618 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.162118 systemd[1]: Created slice kubepods-burstable-pod4d24da20a65a9f7a3c68a6bf6c0c6572.slice - libcontainer container kubepods-burstable-pod4d24da20a65a9f7a3c68a6bf6c0c6572.slice. May 27 17:49:10.168149 systemd[1]: Created slice kubepods-burstable-pod44b3843146aa4432a4aeed1d6cbc2f4e.slice - libcontainer container kubepods-burstable-pod44b3843146aa4432a4aeed1d6cbc2f4e.slice. May 27 17:49:10.170704 kubelet[2494]: E0527 17:49:10.170449 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.172521 kubelet[2494]: E0527 17:49:10.172450 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.176186 kubelet[2494]: I0527 17:49:10.176149 2494 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.177042 kubelet[2494]: E0527 17:49:10.176994 2494 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.123.17:6443/api/v1/nodes\": dial tcp 157.180.123.17:6443: connect: connection refused" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.211189 kubelet[2494]: E0527 17:49:10.211127 2494 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.123.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-0-0-a-c8f0a3e630?timeout=10s\": dial tcp 157.180.123.17:6443: connect: connection refused" interval="400ms" May 27 17:49:10.308146 kubelet[2494]: I0527 17:49:10.308045 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.308146 kubelet[2494]: I0527 17:49:10.308112 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-kubeconfig\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: 
\"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.308146 kubelet[2494]: I0527 17:49:10.308147 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/44b3843146aa4432a4aeed1d6cbc2f4e-kubeconfig\") pod \"kube-scheduler-ci-4344-0-0-a-c8f0a3e630\" (UID: \"44b3843146aa4432a4aeed1d6cbc2f4e\") " pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.308887 kubelet[2494]: I0527 17:49:10.308173 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/98f81ad9995f88103edca6af43f4760f-k8s-certs\") pod \"kube-apiserver-ci-4344-0-0-a-c8f0a3e630\" (UID: \"98f81ad9995f88103edca6af43f4760f\") " pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.308887 kubelet[2494]: I0527 17:49:10.308202 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/98f81ad9995f88103edca6af43f4760f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-0-0-a-c8f0a3e630\" (UID: \"98f81ad9995f88103edca6af43f4760f\") " pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.308887 kubelet[2494]: I0527 17:49:10.308254 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-ca-certs\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.309315 kubelet[2494]: I0527 17:49:10.309080 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-k8s-certs\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.309315 kubelet[2494]: I0527 17:49:10.309174 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.309315 kubelet[2494]: I0527 17:49:10.309210 2494 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/98f81ad9995f88103edca6af43f4760f-ca-certs\") pod \"kube-apiserver-ci-4344-0-0-a-c8f0a3e630\" (UID: \"98f81ad9995f88103edca6af43f4760f\") " pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.380413 kubelet[2494]: I0527 17:49:10.380348 2494 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.380831 kubelet[2494]: E0527 17:49:10.380759 2494 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.123.17:6443/api/v1/nodes\": dial tcp 157.180.123.17:6443: connect: connection refused" node="ci-4344-0-0-a-c8f0a3e630" May 27 
17:49:10.460138 containerd[1565]: time="2025-05-27T17:49:10.460062324Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344-0-0-a-c8f0a3e630,Uid:98f81ad9995f88103edca6af43f4760f,Namespace:kube-system,Attempt:0,}" May 27 17:49:10.476889 containerd[1565]: time="2025-05-27T17:49:10.476843932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-0-0-a-c8f0a3e630,Uid:4d24da20a65a9f7a3c68a6bf6c0c6572,Namespace:kube-system,Attempt:0,}" May 27 17:49:10.477168 containerd[1565]: time="2025-05-27T17:49:10.476879739Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-0-0-a-c8f0a3e630,Uid:44b3843146aa4432a4aeed1d6cbc2f4e,Namespace:kube-system,Attempt:0,}" May 27 17:49:10.577410 containerd[1565]: time="2025-05-27T17:49:10.577347298Z" level=info msg="connecting to shim 6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448" address="unix:///run/containerd/s/8a9d493ba965878bf575af53024a3b7843e278241b23f5eabb09fd994fb3566d" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:10.579574 containerd[1565]: time="2025-05-27T17:49:10.579531638Z" level=info msg="connecting to shim 701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895" address="unix:///run/containerd/s/d50e86bf2cb21e0261ee68ed0cdef62428dc01dc1d7d1085c2c221529644fdcb" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:10.587800 containerd[1565]: time="2025-05-27T17:49:10.587425675Z" level=info msg="connecting to shim 9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a" address="unix:///run/containerd/s/f5f512b4d59e02c570f1767c952747f19830947c3fa4d7dcdcd6ed449bb88c4a" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:10.612176 kubelet[2494]: E0527 17:49:10.612149 2494 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.123.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-0-0-a-c8f0a3e630?timeout=10s\": dial tcp 157.180.123.17:6443: connect: connection refused" interval="800ms" May 27 17:49:10.649423 systemd[1]: Started cri-containerd-6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448.scope - libcontainer container 6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448. May 27 17:49:10.651017 systemd[1]: Started cri-containerd-701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895.scope - libcontainer container 701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895. May 27 17:49:10.652450 systemd[1]: Started cri-containerd-9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a.scope - libcontainer container 9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a. 
May 27 17:49:10.706089 containerd[1565]: time="2025-05-27T17:49:10.706011269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-0-0-a-c8f0a3e630,Uid:44b3843146aa4432a4aeed1d6cbc2f4e,Namespace:kube-system,Attempt:0,} returns sandbox id \"9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a\"" May 27 17:49:10.706918 containerd[1565]: time="2025-05-27T17:49:10.706849593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-0-0-a-c8f0a3e630,Uid:4d24da20a65a9f7a3c68a6bf6c0c6572,Namespace:kube-system,Attempt:0,} returns sandbox id \"701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895\"" May 27 17:49:10.711271 containerd[1565]: time="2025-05-27T17:49:10.711188355Z" level=info msg="CreateContainer within sandbox \"9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 17:49:10.714133 containerd[1565]: time="2025-05-27T17:49:10.714110280Z" level=info msg="CreateContainer within sandbox \"701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 17:49:10.721184 containerd[1565]: time="2025-05-27T17:49:10.720802080Z" level=info msg="Container 302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:10.721184 containerd[1565]: time="2025-05-27T17:49:10.721054824Z" level=info msg="Container 21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:10.730191 containerd[1565]: time="2025-05-27T17:49:10.730159112Z" level=info msg="CreateContainer within sandbox \"9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6\"" May 27 17:49:10.733301 containerd[1565]: time="2025-05-27T17:49:10.733274950Z" level=info msg="CreateContainer within sandbox \"701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef\"" May 27 17:49:10.734246 containerd[1565]: time="2025-05-27T17:49:10.733588540Z" level=info msg="StartContainer for \"21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef\"" May 27 17:49:10.734246 containerd[1565]: time="2025-05-27T17:49:10.733647049Z" level=info msg="StartContainer for \"302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6\"" May 27 17:49:10.734896 containerd[1565]: time="2025-05-27T17:49:10.734337083Z" level=info msg="connecting to shim 21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef" address="unix:///run/containerd/s/d50e86bf2cb21e0261ee68ed0cdef62428dc01dc1d7d1085c2c221529644fdcb" protocol=ttrpc version=3 May 27 17:49:10.735205 containerd[1565]: time="2025-05-27T17:49:10.735188853Z" level=info msg="connecting to shim 302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6" address="unix:///run/containerd/s/f5f512b4d59e02c570f1767c952747f19830947c3fa4d7dcdcd6ed449bb88c4a" protocol=ttrpc version=3 May 27 17:49:10.736721 containerd[1565]: time="2025-05-27T17:49:10.736704798Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4344-0-0-a-c8f0a3e630,Uid:98f81ad9995f88103edca6af43f4760f,Namespace:kube-system,Attempt:0,} returns sandbox id \"6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448\"" May 27 17:49:10.741591 containerd[1565]: time="2025-05-27T17:49:10.741575340Z" level=info msg="CreateContainer within sandbox \"6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 27 17:49:10.749381 systemd[1]: Started cri-containerd-302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6.scope - libcontainer container 302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6. May 27 17:49:10.752239 containerd[1565]: time="2025-05-27T17:49:10.751984637Z" level=info msg="Container 7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:10.752846 systemd[1]: Started cri-containerd-21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef.scope - libcontainer container 21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef. May 27 17:49:10.767288 containerd[1565]: time="2025-05-27T17:49:10.766922995Z" level=info msg="CreateContainer within sandbox \"6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88\"" May 27 17:49:10.770704 containerd[1565]: time="2025-05-27T17:49:10.770671491Z" level=info msg="StartContainer for \"7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88\"" May 27 17:49:10.774931 containerd[1565]: time="2025-05-27T17:49:10.774905107Z" level=info msg="connecting to shim 7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88" address="unix:///run/containerd/s/8a9d493ba965878bf575af53024a3b7843e278241b23f5eabb09fd994fb3566d" protocol=ttrpc version=3 May 27 17:49:10.785233 kubelet[2494]: I0527 17:49:10.783810 2494 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.785684 kubelet[2494]: E0527 17:49:10.785661 2494 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.123.17:6443/api/v1/nodes\": dial tcp 157.180.123.17:6443: connect: connection refused" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:10.794449 systemd[1]: Started cri-containerd-7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88.scope - libcontainer container 7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88. 
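[Editor's note, not part of the captured journal] The containerd entries around here tie each static control-plane pod to a sandbox ID (RunPodSandbox ... returns sandbox id) and then to a container ID (CreateContainer/StartContainer). As a sketch only, the pod-to-sandbox mapping can be pulled back out of exported log text like this:

import re

NAME_RE = re.compile(r"PodSandboxMetadata\{Name:([^,]+),")
SANDBOX_ID_RE = re.compile(r'returns sandbox id \\?"([0-9a-f]+)\\?"')

def sandbox_ids(lines):
    # Pod name -> sandbox id, taken from "RunPodSandbox ... returns sandbox id" lines.
    mapping = {}
    for line in lines:
        if "returns sandbox id" not in line:
            continue
        name, sid = NAME_RE.search(line), SANDBOX_ID_RE.search(line)
        if name and sid:
            mapping[name.group(1)] = sid.group(1)
    return mapping

# On the entries above this yields (IDs abbreviated here):
#   kube-scheduler-ci-4344-0-0-a-c8f0a3e630          -> 9943345f...
#   kube-controller-manager-ci-4344-0-0-a-c8f0a3e630 -> 701000cf...
#   kube-apiserver-ci-4344-0-0-a-c8f0a3e630          -> 6d5f736f...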
May 27 17:49:10.811185 containerd[1565]: time="2025-05-27T17:49:10.810927027Z" level=info msg="StartContainer for \"21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef\" returns successfully" May 27 17:49:10.850341 containerd[1565]: time="2025-05-27T17:49:10.850214707Z" level=info msg="StartContainer for \"302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6\" returns successfully" May 27 17:49:10.867638 containerd[1565]: time="2025-05-27T17:49:10.867565552Z" level=info msg="StartContainer for \"7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88\" returns successfully" May 27 17:49:10.874533 kubelet[2494]: E0527 17:49:10.874505 2494 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://157.180.123.17:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.123.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" May 27 17:49:11.043276 kubelet[2494]: E0527 17:49:11.043017 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:11.046181 kubelet[2494]: E0527 17:49:11.045638 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:11.047895 kubelet[2494]: E0527 17:49:11.047881 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:11.589166 kubelet[2494]: I0527 17:49:11.588534 2494 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.051246 kubelet[2494]: E0527 17:49:12.051108 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.051493 kubelet[2494]: E0527 17:49:12.051481 2494 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.725959 kubelet[2494]: E0527 17:49:12.725915 2494 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344-0-0-a-c8f0a3e630\" not found" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.810078 kubelet[2494]: I0527 17:49:12.810035 2494 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.839951 kubelet[2494]: E0527 17:49:12.839857 2494 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4344-0-0-a-c8f0a3e630.18437391b45562d1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344-0-0-a-c8f0a3e630,UID:ci-4344-0-0-a-c8f0a3e630,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344-0-0-a-c8f0a3e630,},FirstTimestamp:2025-05-27 17:49:09.982831313 +0000 UTC m=+0.734753237,LastTimestamp:2025-05-27 17:49:09.982831313 +0000 UTC m=+0.734753237,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-0-0-a-c8f0a3e630,}" May 27 17:49:12.909550 kubelet[2494]: I0527 17:49:12.909508 2494 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.916086 kubelet[2494]: E0527 17:49:12.916054 2494 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-0-0-a-c8f0a3e630\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.916086 kubelet[2494]: I0527 17:49:12.916079 2494 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.918100 kubelet[2494]: E0527 17:49:12.918067 2494 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.918100 kubelet[2494]: I0527 17:49:12.918085 2494 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.919390 kubelet[2494]: E0527 17:49:12.919369 2494 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-0-0-a-c8f0a3e630\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:12.978336 kubelet[2494]: I0527 17:49:12.978057 2494 apiserver.go:52] "Watching apiserver" May 27 17:49:13.007904 kubelet[2494]: I0527 17:49:13.007881 2494 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 17:49:14.704416 systemd[1]: Reload requested from client PID 2770 ('systemctl') (unit session-7.scope)... May 27 17:49:14.704434 systemd[1]: Reloading... May 27 17:49:14.774275 zram_generator::config[2814]: No configuration found. May 27 17:49:14.851829 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 17:49:14.955196 systemd[1]: Reloading finished in 250 ms. May 27 17:49:14.980408 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:49:15.005169 systemd[1]: kubelet.service: Deactivated successfully. May 27 17:49:15.005384 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:49:15.005425 systemd[1]: kubelet.service: Consumed 1.055s CPU time, 127.5M memory peak. May 27 17:49:15.007341 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 17:49:15.136529 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 17:49:15.142559 (kubelet)[2865]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 17:49:15.181501 kubelet[2865]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:49:15.181501 kubelet[2865]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
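The "Failed creating a mirror pod ... no PriorityClass with name system-node-critical was found" errors above happen because the kubelet tries to mirror the static control-plane pods before the apiserver's bootstrap controller has created the built-in priority classes; shortly afterwards the same mirror pods are reported as existing. A hedged client-go sketch that simply waits for that PriorityClass to appear is shown below; the kubeconfig path is an assumption for illustration.

```go
// priorityclass_wait.go - client-go sketch: waits for the built-in
// "system-node-critical" PriorityClass that the mirror-pod errors above
// complain about.
package main

import (
	"context"
	"fmt"
	"time"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location; adjust for the actual cluster layout.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	for {
		pc, err := cs.SchedulingV1().PriorityClasses().Get(
			context.TODO(), "system-node-critical", metav1.GetOptions{})
		if err == nil {
			fmt.Printf("%s present with value %d\n", pc.Name, pc.Value)
			return
		}
		fmt.Println("still missing:", err)
		time.Sleep(time.Second)
	}
}
```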
May 27 17:49:15.181501 kubelet[2865]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 17:49:15.182234 kubelet[2865]: I0527 17:49:15.181828 2865 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 17:49:15.189474 kubelet[2865]: I0527 17:49:15.189454 2865 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" May 27 17:49:15.189556 kubelet[2865]: I0527 17:49:15.189549 2865 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 17:49:15.189759 kubelet[2865]: I0527 17:49:15.189748 2865 server.go:956] "Client rotation is on, will bootstrap in background" May 27 17:49:15.190636 kubelet[2865]: I0527 17:49:15.190625 2865 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" May 27 17:49:15.192725 kubelet[2865]: I0527 17:49:15.192400 2865 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 17:49:15.196194 kubelet[2865]: I0527 17:49:15.196184 2865 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 17:49:15.200813 kubelet[2865]: I0527 17:49:15.200792 2865 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 27 17:49:15.201016 kubelet[2865]: I0527 17:49:15.200989 2865 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 17:49:15.201285 kubelet[2865]: I0527 17:49:15.201075 2865 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-0-0-a-c8f0a3e630","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 17:49:15.201396 kubelet[2865]: I0527 17:49:15.201387 2865 topology_manager.go:138] "Creating topology manager with none policy" May 27 
17:49:15.201448 kubelet[2865]: I0527 17:49:15.201441 2865 container_manager_linux.go:303] "Creating device plugin manager" May 27 17:49:15.202506 kubelet[2865]: I0527 17:49:15.202488 2865 state_mem.go:36] "Initialized new in-memory state store" May 27 17:49:15.202763 kubelet[2865]: I0527 17:49:15.202748 2865 kubelet.go:480] "Attempting to sync node with API server" May 27 17:49:15.202820 kubelet[2865]: I0527 17:49:15.202772 2865 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 17:49:15.202820 kubelet[2865]: I0527 17:49:15.202799 2865 kubelet.go:386] "Adding apiserver pod source" May 27 17:49:15.202820 kubelet[2865]: I0527 17:49:15.202814 2865 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 17:49:15.206963 kubelet[2865]: I0527 17:49:15.206513 2865 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 17:49:15.209626 kubelet[2865]: I0527 17:49:15.209293 2865 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" May 27 17:49:15.214411 kubelet[2865]: I0527 17:49:15.214373 2865 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 17:49:15.214411 kubelet[2865]: I0527 17:49:15.214407 2865 server.go:1289] "Started kubelet" May 27 17:49:15.214580 kubelet[2865]: I0527 17:49:15.214535 2865 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 May 27 17:49:15.214633 kubelet[2865]: I0527 17:49:15.214600 2865 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 17:49:15.214834 kubelet[2865]: I0527 17:49:15.214818 2865 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 17:49:15.217243 kubelet[2865]: I0527 17:49:15.216557 2865 server.go:317] "Adding debug handlers to kubelet server" May 27 17:49:15.218968 kubelet[2865]: I0527 17:49:15.218958 2865 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 17:49:15.220616 kubelet[2865]: I0527 17:49:15.220556 2865 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 17:49:15.222494 kubelet[2865]: I0527 17:49:15.222476 2865 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 17:49:15.222568 kubelet[2865]: I0527 17:49:15.222553 2865 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 17:49:15.222632 kubelet[2865]: I0527 17:49:15.222619 2865 reconciler.go:26] "Reconciler: start to sync state" May 27 17:49:15.223979 kubelet[2865]: E0527 17:49:15.223955 2865 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 17:49:15.225023 kubelet[2865]: I0527 17:49:15.225002 2865 factory.go:223] Registration of the systemd container factory successfully May 27 17:49:15.225116 kubelet[2865]: I0527 17:49:15.225094 2865 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 17:49:15.226289 kubelet[2865]: I0527 17:49:15.226270 2865 factory.go:223] Registration of the containerd container factory successfully May 27 17:49:15.229933 kubelet[2865]: I0527 17:49:15.229911 2865 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" May 27 17:49:15.231014 kubelet[2865]: I0527 17:49:15.231002 2865 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" May 27 17:49:15.231073 kubelet[2865]: I0527 17:49:15.231067 2865 status_manager.go:230] "Starting to sync pod status with apiserver" May 27 17:49:15.231125 kubelet[2865]: I0527 17:49:15.231119 2865 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 27 17:49:15.231169 kubelet[2865]: I0527 17:49:15.231163 2865 kubelet.go:2436] "Starting kubelet main sync loop" May 27 17:49:15.231325 kubelet[2865]: E0527 17:49:15.231310 2865 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 17:49:15.272340 kubelet[2865]: I0527 17:49:15.272311 2865 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 17:49:15.272340 kubelet[2865]: I0527 17:49:15.272327 2865 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 17:49:15.272340 kubelet[2865]: I0527 17:49:15.272341 2865 state_mem.go:36] "Initialized new in-memory state store" May 27 17:49:15.272495 kubelet[2865]: I0527 17:49:15.272430 2865 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 17:49:15.272495 kubelet[2865]: I0527 17:49:15.272440 2865 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 17:49:15.272495 kubelet[2865]: I0527 17:49:15.272454 2865 policy_none.go:49] "None policy: Start" May 27 17:49:15.272495 kubelet[2865]: I0527 17:49:15.272462 2865 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 17:49:15.272495 kubelet[2865]: I0527 17:49:15.272469 2865 state_mem.go:35] "Initializing new in-memory state store" May 27 17:49:15.272578 kubelet[2865]: I0527 17:49:15.272532 2865 state_mem.go:75] "Updated machine memory state" May 27 17:49:15.277320 kubelet[2865]: E0527 17:49:15.276480 2865 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" May 27 17:49:15.277320 kubelet[2865]: I0527 17:49:15.276637 2865 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 17:49:15.277320 kubelet[2865]: I0527 17:49:15.276647 2865 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 17:49:15.277320 kubelet[2865]: I0527 17:49:15.277141 2865 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 17:49:15.278791 kubelet[2865]: E0527 17:49:15.278601 2865 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 27 17:49:15.332878 kubelet[2865]: I0527 17:49:15.332823 2865 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.333163 kubelet[2865]: I0527 17:49:15.332826 2865 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.333213 kubelet[2865]: I0527 17:49:15.332972 2865 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.385520 kubelet[2865]: I0527 17:49:15.385492 2865 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.395288 kubelet[2865]: I0527 17:49:15.395257 2865 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.395514 kubelet[2865]: I0527 17:49:15.395491 2865 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.425396 kubelet[2865]: I0527 17:49:15.425353 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-ca-certs\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526468 kubelet[2865]: I0527 17:49:15.526176 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-kubeconfig\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526468 kubelet[2865]: I0527 17:49:15.526209 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526468 kubelet[2865]: I0527 17:49:15.526289 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/98f81ad9995f88103edca6af43f4760f-ca-certs\") pod \"kube-apiserver-ci-4344-0-0-a-c8f0a3e630\" (UID: \"98f81ad9995f88103edca6af43f4760f\") " pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526468 kubelet[2865]: I0527 17:49:15.526321 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/98f81ad9995f88103edca6af43f4760f-k8s-certs\") pod \"kube-apiserver-ci-4344-0-0-a-c8f0a3e630\" (UID: \"98f81ad9995f88103edca6af43f4760f\") " pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526468 kubelet[2865]: I0527 17:49:15.526343 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/98f81ad9995f88103edca6af43f4760f-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-0-0-a-c8f0a3e630\" (UID: 
\"98f81ad9995f88103edca6af43f4760f\") " pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526671 kubelet[2865]: I0527 17:49:15.526358 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526671 kubelet[2865]: I0527 17:49:15.526372 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4d24da20a65a9f7a3c68a6bf6c0c6572-k8s-certs\") pod \"kube-controller-manager-ci-4344-0-0-a-c8f0a3e630\" (UID: \"4d24da20a65a9f7a3c68a6bf6c0c6572\") " pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:15.526671 kubelet[2865]: I0527 17:49:15.526386 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/44b3843146aa4432a4aeed1d6cbc2f4e-kubeconfig\") pod \"kube-scheduler-ci-4344-0-0-a-c8f0a3e630\" (UID: \"44b3843146aa4432a4aeed1d6cbc2f4e\") " pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:16.206501 kubelet[2865]: I0527 17:49:16.206461 2865 apiserver.go:52] "Watching apiserver" May 27 17:49:16.223556 kubelet[2865]: I0527 17:49:16.223496 2865 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 17:49:16.262018 kubelet[2865]: I0527 17:49:16.260057 2865 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:16.266194 kubelet[2865]: E0527 17:49:16.266168 2865 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-0-0-a-c8f0a3e630\" already exists" pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" May 27 17:49:16.290846 kubelet[2865]: I0527 17:49:16.290797 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344-0-0-a-c8f0a3e630" podStartSLOduration=1.2907824030000001 podStartE2EDuration="1.290782403s" podCreationTimestamp="2025-05-27 17:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:49:16.283690429 +0000 UTC m=+1.135569078" watchObservedRunningTime="2025-05-27 17:49:16.290782403 +0000 UTC m=+1.142661053" May 27 17:49:16.297915 kubelet[2865]: I0527 17:49:16.297839 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344-0-0-a-c8f0a3e630" podStartSLOduration=1.2978212 podStartE2EDuration="1.2978212s" podCreationTimestamp="2025-05-27 17:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:49:16.291239557 +0000 UTC m=+1.143118217" watchObservedRunningTime="2025-05-27 17:49:16.2978212 +0000 UTC m=+1.149699851" May 27 17:49:16.298073 kubelet[2865]: I0527 17:49:16.297945 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344-0-0-a-c8f0a3e630" podStartSLOduration=1.29794012 podStartE2EDuration="1.29794012s" podCreationTimestamp="2025-05-27 17:49:15 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:49:16.297622574 +0000 UTC m=+1.149501224" watchObservedRunningTime="2025-05-27 17:49:16.29794012 +0000 UTC m=+1.149818809" May 27 17:49:21.154359 systemd[1]: Created slice kubepods-besteffort-poddedf6950_80e8_4884_b657_78bf3cac35e3.slice - libcontainer container kubepods-besteffort-poddedf6950_80e8_4884_b657_78bf3cac35e3.slice. May 27 17:49:21.162013 kubelet[2865]: I0527 17:49:21.161986 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/dedf6950-80e8-4884-b657-78bf3cac35e3-kube-proxy\") pod \"kube-proxy-8mb8p\" (UID: \"dedf6950-80e8-4884-b657-78bf3cac35e3\") " pod="kube-system/kube-proxy-8mb8p" May 27 17:49:21.162760 kubelet[2865]: I0527 17:49:21.162018 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dedf6950-80e8-4884-b657-78bf3cac35e3-lib-modules\") pod \"kube-proxy-8mb8p\" (UID: \"dedf6950-80e8-4884-b657-78bf3cac35e3\") " pod="kube-system/kube-proxy-8mb8p" May 27 17:49:21.162760 kubelet[2865]: I0527 17:49:21.162033 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/dedf6950-80e8-4884-b657-78bf3cac35e3-xtables-lock\") pod \"kube-proxy-8mb8p\" (UID: \"dedf6950-80e8-4884-b657-78bf3cac35e3\") " pod="kube-system/kube-proxy-8mb8p" May 27 17:49:21.162760 kubelet[2865]: I0527 17:49:21.162045 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwb7f\" (UniqueName: \"kubernetes.io/projected/dedf6950-80e8-4884-b657-78bf3cac35e3-kube-api-access-xwb7f\") pod \"kube-proxy-8mb8p\" (UID: \"dedf6950-80e8-4884-b657-78bf3cac35e3\") " pod="kube-system/kube-proxy-8mb8p" May 27 17:49:21.173670 kubelet[2865]: I0527 17:49:21.173649 2865 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 17:49:21.174062 containerd[1565]: time="2025-05-27T17:49:21.173968224Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 17:49:21.174938 kubelet[2865]: I0527 17:49:21.174275 2865 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 17:49:21.271896 kubelet[2865]: E0527 17:49:21.271862 2865 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found May 27 17:49:21.271896 kubelet[2865]: E0527 17:49:21.271887 2865 projected.go:194] Error preparing data for projected volume kube-api-access-xwb7f for pod kube-system/kube-proxy-8mb8p: configmap "kube-root-ca.crt" not found May 27 17:49:21.272138 kubelet[2865]: E0527 17:49:21.271955 2865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dedf6950-80e8-4884-b657-78bf3cac35e3-kube-api-access-xwb7f podName:dedf6950-80e8-4884-b657-78bf3cac35e3 nodeName:}" failed. No retries permitted until 2025-05-27 17:49:21.771930108 +0000 UTC m=+6.623808768 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-xwb7f" (UniqueName: "kubernetes.io/projected/dedf6950-80e8-4884-b657-78bf3cac35e3-kube-api-access-xwb7f") pod "kube-proxy-8mb8p" (UID: "dedf6950-80e8-4884-b657-78bf3cac35e3") : configmap "kube-root-ca.crt" not found May 27 17:49:21.867983 kubelet[2865]: E0527 17:49:21.867935 2865 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found May 27 17:49:21.867983 kubelet[2865]: E0527 17:49:21.867979 2865 projected.go:194] Error preparing data for projected volume kube-api-access-xwb7f for pod kube-system/kube-proxy-8mb8p: configmap "kube-root-ca.crt" not found May 27 17:49:21.868164 kubelet[2865]: E0527 17:49:21.868047 2865 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dedf6950-80e8-4884-b657-78bf3cac35e3-kube-api-access-xwb7f podName:dedf6950-80e8-4884-b657-78bf3cac35e3 nodeName:}" failed. No retries permitted until 2025-05-27 17:49:22.868030943 +0000 UTC m=+7.719909603 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xwb7f" (UniqueName: "kubernetes.io/projected/dedf6950-80e8-4884-b657-78bf3cac35e3-kube-api-access-xwb7f") pod "kube-proxy-8mb8p" (UID: "dedf6950-80e8-4884-b657-78bf3cac35e3") : configmap "kube-root-ca.crt" not found May 27 17:49:22.345804 systemd[1]: Created slice kubepods-besteffort-pod505c9ee1_1282_4b74_a7e8_80de64140eab.slice - libcontainer container kubepods-besteffort-pod505c9ee1_1282_4b74_a7e8_80de64140eab.slice. May 27 17:49:22.371031 kubelet[2865]: I0527 17:49:22.370936 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtzn2\" (UniqueName: \"kubernetes.io/projected/505c9ee1-1282-4b74-a7e8-80de64140eab-kube-api-access-wtzn2\") pod \"tigera-operator-844669ff44-m9wdq\" (UID: \"505c9ee1-1282-4b74-a7e8-80de64140eab\") " pod="tigera-operator/tigera-operator-844669ff44-m9wdq" May 27 17:49:22.371649 kubelet[2865]: I0527 17:49:22.371284 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/505c9ee1-1282-4b74-a7e8-80de64140eab-var-lib-calico\") pod \"tigera-operator-844669ff44-m9wdq\" (UID: \"505c9ee1-1282-4b74-a7e8-80de64140eab\") " pod="tigera-operator/tigera-operator-844669ff44-m9wdq" May 27 17:49:22.649902 containerd[1565]: time="2025-05-27T17:49:22.649798841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-m9wdq,Uid:505c9ee1-1282-4b74-a7e8-80de64140eab,Namespace:tigera-operator,Attempt:0,}" May 27 17:49:22.670905 containerd[1565]: time="2025-05-27T17:49:22.670749494Z" level=info msg="connecting to shim cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c" address="unix:///run/containerd/s/5b6428d68c592eb8502f9c01ff20cdde4f0e4dddd3534bf0bb1d4edd87de63da" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:22.696420 systemd[1]: Started cri-containerd-cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c.scope - libcontainer container cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c. 
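The MountVolume.SetUp failures for kube-api-access-xwb7f above are rescheduled with a doubling delay: the first retry is held back 500ms, the next 1s, until the kube-root-ca.crt configmap finally exists. The following is a generic doubling-backoff sketch that mirrors that spacing; it is an illustration of the pattern, not the kubelet's actual nestedpendingoperations code.

```go
// backoff_retry.go - doubling-backoff loop mirroring the retry spacing in the
// MountVolume.SetUp failures above (durationBeforeRetry 500ms, then 1s).
package main

import (
	"errors"
	"fmt"
	"time"
)

func retryWithBackoff(op func() error, initial, max time.Duration) {
	delay := initial
	for {
		err := op()
		if err == nil {
			return
		}
		fmt.Printf("failed: %v; no retries permitted for %s\n", err, delay)
		time.Sleep(delay)
		if delay < max {
			delay *= 2 // 500ms -> 1s -> 2s ... capped at max
		}
	}
}

func main() {
	attempts := 0
	retryWithBackoff(func() error {
		attempts++
		if attempts < 3 {
			// Stand-in for the real failure: the kube-root-ca.crt configmap
			// does not exist yet when the projected volume is first mounted.
			return errors.New(`configmap "kube-root-ca.crt" not found`)
		}
		return nil
	}, 500*time.Millisecond, 2*time.Minute)
	fmt.Println("mount succeeded after", attempts, "attempts")
}
```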
May 27 17:49:22.745447 containerd[1565]: time="2025-05-27T17:49:22.745411640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-m9wdq,Uid:505c9ee1-1282-4b74-a7e8-80de64140eab,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c\"" May 27 17:49:22.747801 containerd[1565]: time="2025-05-27T17:49:22.747722663Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 17:49:22.963961 containerd[1565]: time="2025-05-27T17:49:22.963877918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8mb8p,Uid:dedf6950-80e8-4884-b657-78bf3cac35e3,Namespace:kube-system,Attempt:0,}" May 27 17:49:22.986184 containerd[1565]: time="2025-05-27T17:49:22.986094272Z" level=info msg="connecting to shim 78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca" address="unix:///run/containerd/s/c402a6fd2108766a01bd67578f3fc52335b76095cebc99fe80abdc246db2b07c" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:23.016397 systemd[1]: Started cri-containerd-78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca.scope - libcontainer container 78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca. May 27 17:49:23.047094 containerd[1565]: time="2025-05-27T17:49:23.047043704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-8mb8p,Uid:dedf6950-80e8-4884-b657-78bf3cac35e3,Namespace:kube-system,Attempt:0,} returns sandbox id \"78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca\"" May 27 17:49:23.052377 containerd[1565]: time="2025-05-27T17:49:23.052342038Z" level=info msg="CreateContainer within sandbox \"78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 17:49:23.060611 containerd[1565]: time="2025-05-27T17:49:23.060581255Z" level=info msg="Container 415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:23.072355 containerd[1565]: time="2025-05-27T17:49:23.072316260Z" level=info msg="CreateContainer within sandbox \"78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984\"" May 27 17:49:23.073003 containerd[1565]: time="2025-05-27T17:49:23.072961695Z" level=info msg="StartContainer for \"415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984\"" May 27 17:49:23.074807 containerd[1565]: time="2025-05-27T17:49:23.074777334Z" level=info msg="connecting to shim 415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984" address="unix:///run/containerd/s/c402a6fd2108766a01bd67578f3fc52335b76095cebc99fe80abdc246db2b07c" protocol=ttrpc version=3 May 27 17:49:23.093357 systemd[1]: Started cri-containerd-415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984.scope - libcontainer container 415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984. 
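All of the sandbox and container operations above go through containerd's CRI plugin in the "k8s.io" namespace, with the kubelet reaching each container's shim over a ttrpc socket under /run/containerd/s/. As an inspection sketch (not part of the boot flow, and assuming the containerd 1.x Go client import path is available), the same namespace can be listed directly:

```go
// list_k8s_containers.go - containerd Go client sketch: connects to the same
// socket the shims above hang off of and lists containers in the "k8s.io"
// namespace used by the CRI plugin.
package main

import (
	"context"
	"fmt"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	containers, err := client.Containers(ctx)
	if err != nil {
		panic(err)
	}
	for _, c := range containers {
		name := "<unknown image>"
		if img, err := c.Image(ctx); err == nil {
			name = img.Name()
		}
		fmt.Printf("%s  %s\n", c.ID(), name)
	}
}
```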
May 27 17:49:23.121760 containerd[1565]: time="2025-05-27T17:49:23.121717824Z" level=info msg="StartContainer for \"415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984\" returns successfully" May 27 17:49:23.293752 kubelet[2865]: I0527 17:49:23.293421 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-8mb8p" podStartSLOduration=2.2933960349999998 podStartE2EDuration="2.293396035s" podCreationTimestamp="2025-05-27 17:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:49:23.292994552 +0000 UTC m=+8.144873222" watchObservedRunningTime="2025-05-27 17:49:23.293396035 +0000 UTC m=+8.145274715" May 27 17:49:24.799411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1813029478.mount: Deactivated successfully. May 27 17:49:25.393205 containerd[1565]: time="2025-05-27T17:49:25.393154640Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:25.394207 containerd[1565]: time="2025-05-27T17:49:25.394016605Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 27 17:49:25.394921 containerd[1565]: time="2025-05-27T17:49:25.394898048Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:25.396602 containerd[1565]: time="2025-05-27T17:49:25.396565476Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:25.397101 containerd[1565]: time="2025-05-27T17:49:25.397080850Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 2.649327599s" May 27 17:49:25.397236 containerd[1565]: time="2025-05-27T17:49:25.397159876Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 27 17:49:25.401390 containerd[1565]: time="2025-05-27T17:49:25.401362770Z" level=info msg="CreateContainer within sandbox \"cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 17:49:25.407085 containerd[1565]: time="2025-05-27T17:49:25.406487610Z" level=info msg="Container e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:25.416350 containerd[1565]: time="2025-05-27T17:49:25.416312663Z" level=info msg="CreateContainer within sandbox \"cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d\"" May 27 17:49:25.417054 containerd[1565]: time="2025-05-27T17:49:25.417034961Z" level=info msg="StartContainer for \"e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d\"" May 27 17:49:25.418531 containerd[1565]: 
time="2025-05-27T17:49:25.417872671Z" level=info msg="connecting to shim e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d" address="unix:///run/containerd/s/5b6428d68c592eb8502f9c01ff20cdde4f0e4dddd3534bf0bb1d4edd87de63da" protocol=ttrpc version=3 May 27 17:49:25.439331 systemd[1]: Started cri-containerd-e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d.scope - libcontainer container e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d. May 27 17:49:25.463878 containerd[1565]: time="2025-05-27T17:49:25.463790952Z" level=info msg="StartContainer for \"e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d\" returns successfully" May 27 17:49:26.300785 kubelet[2865]: I0527 17:49:26.300733 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-m9wdq" podStartSLOduration=1.649874558 podStartE2EDuration="4.300718543s" podCreationTimestamp="2025-05-27 17:49:22 +0000 UTC" firstStartedPulling="2025-05-27 17:49:22.746907987 +0000 UTC m=+7.598786638" lastFinishedPulling="2025-05-27 17:49:25.397751973 +0000 UTC m=+10.249630623" observedRunningTime="2025-05-27 17:49:26.300049585 +0000 UTC m=+11.151928245" watchObservedRunningTime="2025-05-27 17:49:26.300718543 +0000 UTC m=+11.152597224" May 27 17:49:31.248593 sudo[1977]: pam_unix(sudo:session): session closed for user root May 27 17:49:31.407309 sshd[1976]: Connection closed by 139.178.89.65 port 40812 May 27 17:49:31.409082 sshd-session[1974]: pam_unix(sshd:session): session closed for user core May 27 17:49:31.411586 systemd[1]: sshd@6-157.180.123.17:22-139.178.89.65:40812.service: Deactivated successfully. May 27 17:49:31.414642 systemd[1]: session-7.scope: Deactivated successfully. May 27 17:49:31.414834 systemd[1]: session-7.scope: Consumed 3.740s CPU time, 162.8M memory peak. May 27 17:49:31.419334 systemd-logind[1534]: Session 7 logged out. Waiting for processes to exit. May 27 17:49:31.420356 systemd-logind[1534]: Removed session 7. May 27 17:49:34.002504 systemd[1]: Created slice kubepods-besteffort-pod5d36deb3_015d_49e8_8f48_10eab20e5be4.slice - libcontainer container kubepods-besteffort-pod5d36deb3_015d_49e8_8f48_10eab20e5be4.slice. 
May 27 17:49:34.049529 kubelet[2865]: I0527 17:49:34.049491 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5d36deb3-015d-49e8-8f48-10eab20e5be4-tigera-ca-bundle\") pod \"calico-typha-655b47479b-tsjv7\" (UID: \"5d36deb3-015d-49e8-8f48-10eab20e5be4\") " pod="calico-system/calico-typha-655b47479b-tsjv7" May 27 17:49:34.049529 kubelet[2865]: I0527 17:49:34.049529 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5d36deb3-015d-49e8-8f48-10eab20e5be4-typha-certs\") pod \"calico-typha-655b47479b-tsjv7\" (UID: \"5d36deb3-015d-49e8-8f48-10eab20e5be4\") " pod="calico-system/calico-typha-655b47479b-tsjv7" May 27 17:49:34.049883 kubelet[2865]: I0527 17:49:34.049556 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k4rc\" (UniqueName: \"kubernetes.io/projected/5d36deb3-015d-49e8-8f48-10eab20e5be4-kube-api-access-9k4rc\") pod \"calico-typha-655b47479b-tsjv7\" (UID: \"5d36deb3-015d-49e8-8f48-10eab20e5be4\") " pod="calico-system/calico-typha-655b47479b-tsjv7" May 27 17:49:34.306117 containerd[1565]: time="2025-05-27T17:49:34.306025427Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-655b47479b-tsjv7,Uid:5d36deb3-015d-49e8-8f48-10eab20e5be4,Namespace:calico-system,Attempt:0,}" May 27 17:49:34.321004 containerd[1565]: time="2025-05-27T17:49:34.320954868Z" level=info msg="connecting to shim 79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8" address="unix:///run/containerd/s/9d8b0bd7012e6a2ff6f38f14166929a653c2867b9485fb3e5d8e1df0b467bfc8" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:34.339431 systemd[1]: Started cri-containerd-79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8.scope - libcontainer container 79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8. May 27 17:49:34.377362 containerd[1565]: time="2025-05-27T17:49:34.377324927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-655b47479b-tsjv7,Uid:5d36deb3-015d-49e8-8f48-10eab20e5be4,Namespace:calico-system,Attempt:0,} returns sandbox id \"79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8\"" May 27 17:49:34.378585 containerd[1565]: time="2025-05-27T17:49:34.378563586Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 17:49:34.430552 systemd[1]: Created slice kubepods-besteffort-podf324b5b7_da06_4933_b02b_55d111252b1c.slice - libcontainer container kubepods-besteffort-podf324b5b7_da06_4933_b02b_55d111252b1c.slice. 
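The "Created slice kubepods-besteffort-pod..." entries above reflect the systemd cgroup driver selected in the NodeConfig earlier (CgroupDriver "systemd", CgroupRoot "/"): the pod's UID has its dashes replaced with underscores and is embedded in a QoS-class-specific slice name. A small sketch of that observed naming convention, using the calico-node pod UID from the log (the helper name is my own, not kubelet source):

```go
// pod_slice_name.go - derives the systemd slice name for a besteffort pod the
// way the "Created slice kubepods-besteffort-pod..." entries above show:
// dashes in the pod UID become underscores inside the slice name.
package main

import (
	"fmt"
	"strings"
)

func besteffortPodSlice(podUID string) string {
	return "kubepods-besteffort-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
}

func main() {
	// UID of calico-node-qb8hf from the log.
	fmt.Println(besteffortPodSlice("f324b5b7-da06-4933-b02b-55d111252b1c"))
	// -> kubepods-besteffort-podf324b5b7_da06_4933_b02b_55d111252b1c.slice
}
```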
May 27 17:49:34.452126 kubelet[2865]: I0527 17:49:34.452094 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-lib-modules\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.452126 kubelet[2865]: I0527 17:49:34.452129 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-var-lib-calico\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453316 kubelet[2865]: I0527 17:49:34.452144 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f324b5b7-da06-4933-b02b-55d111252b1c-node-certs\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453316 kubelet[2865]: I0527 17:49:34.452155 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-xtables-lock\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453316 kubelet[2865]: I0527 17:49:34.452168 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-cni-net-dir\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453316 kubelet[2865]: I0527 17:49:34.452182 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-flexvol-driver-host\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453316 kubelet[2865]: I0527 17:49:34.452197 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-var-run-calico\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453410 kubelet[2865]: I0527 17:49:34.452211 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-cni-log-dir\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453410 kubelet[2865]: I0527 17:49:34.452297 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-cni-bin-dir\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453410 kubelet[2865]: I0527 17:49:34.452313 2865 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f324b5b7-da06-4933-b02b-55d111252b1c-policysync\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453410 kubelet[2865]: I0527 17:49:34.452327 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28lv8\" (UniqueName: \"kubernetes.io/projected/f324b5b7-da06-4933-b02b-55d111252b1c-kube-api-access-28lv8\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.453410 kubelet[2865]: I0527 17:49:34.452342 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f324b5b7-da06-4933-b02b-55d111252b1c-tigera-ca-bundle\") pod \"calico-node-qb8hf\" (UID: \"f324b5b7-da06-4933-b02b-55d111252b1c\") " pod="calico-system/calico-node-qb8hf" May 27 17:49:34.561543 kubelet[2865]: E0527 17:49:34.558612 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.561543 kubelet[2865]: W0527 17:49:34.558637 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.562071 kubelet[2865]: E0527 17:49:34.562038 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.568877 kubelet[2865]: E0527 17:49:34.568777 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.568877 kubelet[2865]: W0527 17:49:34.568813 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.568877 kubelet[2865]: E0527 17:49:34.568833 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.676743 kubelet[2865]: E0527 17:49:34.676474 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zw22" podUID="6cefb86f-fbd3-4c4c-8534-44d8ba742df1" May 27 17:49:34.735030 kubelet[2865]: E0527 17:49:34.734642 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.735030 kubelet[2865]: W0527 17:49:34.735019 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.735030 kubelet[2865]: E0527 17:49:34.735035 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.735737 kubelet[2865]: E0527 17:49:34.735252 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.735737 kubelet[2865]: W0527 17:49:34.735260 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.735737 kubelet[2865]: E0527 17:49:34.735268 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.735737 kubelet[2865]: E0527 17:49:34.735419 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.735737 kubelet[2865]: W0527 17:49:34.735426 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.735737 kubelet[2865]: E0527 17:49:34.735433 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.736111 containerd[1565]: time="2025-05-27T17:49:34.735615419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qb8hf,Uid:f324b5b7-da06-4933-b02b-55d111252b1c,Namespace:calico-system,Attempt:0,}" May 27 17:49:34.736487 kubelet[2865]: E0527 17:49:34.736043 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.736487 kubelet[2865]: W0527 17:49:34.736061 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.736487 kubelet[2865]: E0527 17:49:34.736069 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.736487 kubelet[2865]: E0527 17:49:34.736195 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.736487 kubelet[2865]: W0527 17:49:34.736201 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.736487 kubelet[2865]: E0527 17:49:34.736239 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.736487 kubelet[2865]: E0527 17:49:34.736433 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.736487 kubelet[2865]: W0527 17:49:34.736440 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.736487 kubelet[2865]: E0527 17:49:34.736447 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.736941 kubelet[2865]: E0527 17:49:34.736690 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.736941 kubelet[2865]: W0527 17:49:34.736698 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.736941 kubelet[2865]: E0527 17:49:34.736705 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.736941 kubelet[2865]: E0527 17:49:34.736889 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.736941 kubelet[2865]: W0527 17:49:34.736896 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.736941 kubelet[2865]: E0527 17:49:34.736904 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.737344 kubelet[2865]: E0527 17:49:34.737301 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.737344 kubelet[2865]: W0527 17:49:34.737313 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.737344 kubelet[2865]: E0527 17:49:34.737320 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.737974 kubelet[2865]: E0527 17:49:34.737952 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.737974 kubelet[2865]: W0527 17:49:34.737966 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.737974 kubelet[2865]: E0527 17:49:34.737974 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.738614 kubelet[2865]: E0527 17:49:34.738597 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.738614 kubelet[2865]: W0527 17:49:34.738611 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.738709 kubelet[2865]: E0527 17:49:34.738620 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.739435 kubelet[2865]: E0527 17:49:34.739409 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.739435 kubelet[2865]: W0527 17:49:34.739421 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.739435 kubelet[2865]: E0527 17:49:34.739429 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.740182 kubelet[2865]: E0527 17:49:34.740147 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.740182 kubelet[2865]: W0527 17:49:34.740161 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.740182 kubelet[2865]: E0527 17:49:34.740168 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.741063 kubelet[2865]: E0527 17:49:34.741049 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.741105 kubelet[2865]: W0527 17:49:34.741079 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.741105 kubelet[2865]: E0527 17:49:34.741088 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.741339 kubelet[2865]: E0527 17:49:34.741311 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.741617 kubelet[2865]: W0527 17:49:34.741322 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.741669 kubelet[2865]: E0527 17:49:34.741630 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.741806 kubelet[2865]: E0527 17:49:34.741746 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.741806 kubelet[2865]: W0527 17:49:34.741771 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.741806 kubelet[2865]: E0527 17:49:34.741779 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.741918 kubelet[2865]: E0527 17:49:34.741901 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.741942 kubelet[2865]: W0527 17:49:34.741924 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.741942 kubelet[2865]: E0527 17:49:34.741933 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.742050 kubelet[2865]: E0527 17:49:34.742029 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.742050 kubelet[2865]: W0527 17:49:34.742044 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.742050 kubelet[2865]: E0527 17:49:34.742050 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.742335 kubelet[2865]: E0527 17:49:34.742190 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.742335 kubelet[2865]: W0527 17:49:34.742198 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.742335 kubelet[2865]: E0527 17:49:34.742204 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.742589 kubelet[2865]: E0527 17:49:34.742576 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.742589 kubelet[2865]: W0527 17:49:34.742590 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.742695 kubelet[2865]: E0527 17:49:34.742598 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.755245 kubelet[2865]: E0527 17:49:34.755161 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.755245 kubelet[2865]: W0527 17:49:34.755178 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.755245 kubelet[2865]: E0527 17:49:34.755191 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.755526 kubelet[2865]: I0527 17:49:34.755214 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6cefb86f-fbd3-4c4c-8534-44d8ba742df1-registration-dir\") pod \"csi-node-driver-5zw22\" (UID: \"6cefb86f-fbd3-4c4c-8534-44d8ba742df1\") " pod="calico-system/csi-node-driver-5zw22" May 27 17:49:34.756196 kubelet[2865]: E0527 17:49:34.756162 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.756196 kubelet[2865]: W0527 17:49:34.756177 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.756196 kubelet[2865]: E0527 17:49:34.756186 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.756352 kubelet[2865]: I0527 17:49:34.756325 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6cefb86f-fbd3-4c4c-8534-44d8ba742df1-varrun\") pod \"csi-node-driver-5zw22\" (UID: \"6cefb86f-fbd3-4c4c-8534-44d8ba742df1\") " pod="calico-system/csi-node-driver-5zw22" May 27 17:49:34.757251 kubelet[2865]: E0527 17:49:34.757014 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.757251 kubelet[2865]: W0527 17:49:34.757031 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.757251 kubelet[2865]: E0527 17:49:34.757045 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.757675 kubelet[2865]: E0527 17:49:34.757638 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.757675 kubelet[2865]: W0527 17:49:34.757652 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.757820 kubelet[2865]: E0527 17:49:34.757797 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.758594 kubelet[2865]: E0527 17:49:34.758572 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.758594 kubelet[2865]: W0527 17:49:34.758585 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.758594 kubelet[2865]: E0527 17:49:34.758595 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.759283 kubelet[2865]: I0527 17:49:34.759256 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6cefb86f-fbd3-4c4c-8534-44d8ba742df1-socket-dir\") pod \"csi-node-driver-5zw22\" (UID: \"6cefb86f-fbd3-4c4c-8534-44d8ba742df1\") " pod="calico-system/csi-node-driver-5zw22" May 27 17:49:34.759532 kubelet[2865]: E0527 17:49:34.759513 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.759532 kubelet[2865]: W0527 17:49:34.759528 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.759580 kubelet[2865]: E0527 17:49:34.759537 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.760368 kubelet[2865]: E0527 17:49:34.760352 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.760368 kubelet[2865]: W0527 17:49:34.760366 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.760426 kubelet[2865]: E0527 17:49:34.760374 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.761152 kubelet[2865]: E0527 17:49:34.761129 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.761152 kubelet[2865]: W0527 17:49:34.761146 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.761213 kubelet[2865]: E0527 17:49:34.761158 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.761762 kubelet[2865]: I0527 17:49:34.761725 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trfp7\" (UniqueName: \"kubernetes.io/projected/6cefb86f-fbd3-4c4c-8534-44d8ba742df1-kube-api-access-trfp7\") pod \"csi-node-driver-5zw22\" (UID: \"6cefb86f-fbd3-4c4c-8534-44d8ba742df1\") " pod="calico-system/csi-node-driver-5zw22" May 27 17:49:34.763041 kubelet[2865]: E0527 17:49:34.763018 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.763097 kubelet[2865]: W0527 17:49:34.763063 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.763097 kubelet[2865]: E0527 17:49:34.763074 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.763234 kubelet[2865]: E0527 17:49:34.763212 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.763273 kubelet[2865]: W0527 17:49:34.763241 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.763273 kubelet[2865]: E0527 17:49:34.763248 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.763635 kubelet[2865]: E0527 17:49:34.763617 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.763635 kubelet[2865]: W0527 17:49:34.763633 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.763703 kubelet[2865]: E0527 17:49:34.763646 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.763963 kubelet[2865]: I0527 17:49:34.763945 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6cefb86f-fbd3-4c4c-8534-44d8ba742df1-kubelet-dir\") pod \"csi-node-driver-5zw22\" (UID: \"6cefb86f-fbd3-4c4c-8534-44d8ba742df1\") " pod="calico-system/csi-node-driver-5zw22" May 27 17:49:34.764557 kubelet[2865]: E0527 17:49:34.764540 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.764557 kubelet[2865]: W0527 17:49:34.764554 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.764722 kubelet[2865]: E0527 17:49:34.764563 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.765317 kubelet[2865]: E0527 17:49:34.765281 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.765317 kubelet[2865]: W0527 17:49:34.765294 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.765317 kubelet[2865]: E0527 17:49:34.765303 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.766472 kubelet[2865]: E0527 17:49:34.766348 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.766472 kubelet[2865]: W0527 17:49:34.766360 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.766472 kubelet[2865]: E0527 17:49:34.766374 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.766773 kubelet[2865]: E0527 17:49:34.766758 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.768757 kubelet[2865]: W0527 17:49:34.768634 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.768757 kubelet[2865]: E0527 17:49:34.768650 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.769491 containerd[1565]: time="2025-05-27T17:49:34.769370215Z" level=info msg="connecting to shim 74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3" address="unix:///run/containerd/s/45e12a65ccb6700c521ca9ba5f05b34ba305e3a09fc648a07aa60fc2ca3276b0" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:34.794452 systemd[1]: Started cri-containerd-74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3.scope - libcontainer container 74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3. May 27 17:49:34.865520 kubelet[2865]: E0527 17:49:34.865429 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.865520 kubelet[2865]: W0527 17:49:34.865458 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.865520 kubelet[2865]: E0527 17:49:34.865480 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.866504 containerd[1565]: time="2025-05-27T17:49:34.866439776Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-qb8hf,Uid:f324b5b7-da06-4933-b02b-55d111252b1c,Namespace:calico-system,Attempt:0,} returns sandbox id \"74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3\"" May 27 17:49:34.868061 kubelet[2865]: E0527 17:49:34.868042 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.868061 kubelet[2865]: W0527 17:49:34.868057 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.868274 kubelet[2865]: E0527 17:49:34.868066 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.868403 kubelet[2865]: E0527 17:49:34.868389 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.868403 kubelet[2865]: W0527 17:49:34.868396 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.868455 kubelet[2865]: E0527 17:49:34.868404 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.869120 kubelet[2865]: E0527 17:49:34.869101 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.869120 kubelet[2865]: W0527 17:49:34.869112 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.869120 kubelet[2865]: E0527 17:49:34.869120 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.870842 kubelet[2865]: E0527 17:49:34.870824 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.870842 kubelet[2865]: W0527 17:49:34.870838 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.870909 kubelet[2865]: E0527 17:49:34.870846 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.871422 kubelet[2865]: E0527 17:49:34.871404 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.871422 kubelet[2865]: W0527 17:49:34.871418 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.871476 kubelet[2865]: E0527 17:49:34.871426 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.871555 kubelet[2865]: E0527 17:49:34.871540 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.871555 kubelet[2865]: W0527 17:49:34.871551 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.871632 kubelet[2865]: E0527 17:49:34.871558 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.871706 kubelet[2865]: E0527 17:49:34.871691 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.871706 kubelet[2865]: W0527 17:49:34.871703 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.871749 kubelet[2865]: E0527 17:49:34.871710 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.872409 kubelet[2865]: E0527 17:49:34.872392 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.872409 kubelet[2865]: W0527 17:49:34.872405 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.872461 kubelet[2865]: E0527 17:49:34.872412 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.873323 kubelet[2865]: E0527 17:49:34.873298 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.873323 kubelet[2865]: W0527 17:49:34.873320 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.873372 kubelet[2865]: E0527 17:49:34.873332 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.873479 kubelet[2865]: E0527 17:49:34.873464 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.873479 kubelet[2865]: W0527 17:49:34.873476 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.873521 kubelet[2865]: E0527 17:49:34.873483 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.873628 kubelet[2865]: E0527 17:49:34.873611 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.873628 kubelet[2865]: W0527 17:49:34.873623 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.873683 kubelet[2865]: E0527 17:49:34.873634 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.873764 kubelet[2865]: E0527 17:49:34.873747 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.873764 kubelet[2865]: W0527 17:49:34.873759 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.873818 kubelet[2865]: E0527 17:49:34.873766 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.873896 kubelet[2865]: E0527 17:49:34.873871 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.873896 kubelet[2865]: W0527 17:49:34.873884 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.873896 kubelet[2865]: E0527 17:49:34.873890 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.875281 kubelet[2865]: E0527 17:49:34.875261 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.875281 kubelet[2865]: W0527 17:49:34.875275 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.875281 kubelet[2865]: E0527 17:49:34.875282 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.875603 kubelet[2865]: E0527 17:49:34.875408 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.875603 kubelet[2865]: W0527 17:49:34.875415 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.875603 kubelet[2865]: E0527 17:49:34.875421 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.875603 kubelet[2865]: E0527 17:49:34.875555 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.875603 kubelet[2865]: W0527 17:49:34.875574 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.875603 kubelet[2865]: E0527 17:49:34.875581 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.876044 kubelet[2865]: E0527 17:49:34.875674 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.876044 kubelet[2865]: W0527 17:49:34.875680 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.876044 kubelet[2865]: E0527 17:49:34.875685 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.876044 kubelet[2865]: E0527 17:49:34.875871 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.876044 kubelet[2865]: W0527 17:49:34.875881 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.876044 kubelet[2865]: E0527 17:49:34.875893 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.876289 kubelet[2865]: E0527 17:49:34.876264 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.876289 kubelet[2865]: W0527 17:49:34.876278 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.876289 kubelet[2865]: E0527 17:49:34.876285 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.876414 kubelet[2865]: E0527 17:49:34.876386 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.876414 kubelet[2865]: W0527 17:49:34.876392 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.876414 kubelet[2865]: E0527 17:49:34.876400 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.876508 kubelet[2865]: E0527 17:49:34.876483 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.876508 kubelet[2865]: W0527 17:49:34.876497 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.876508 kubelet[2865]: E0527 17:49:34.876503 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.877165 kubelet[2865]: E0527 17:49:34.877144 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.877165 kubelet[2865]: W0527 17:49:34.877160 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.877165 kubelet[2865]: E0527 17:49:34.877167 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.877830 kubelet[2865]: E0527 17:49:34.877813 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.877830 kubelet[2865]: W0527 17:49:34.877826 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.877882 kubelet[2865]: E0527 17:49:34.877834 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:34.878120 kubelet[2865]: E0527 17:49:34.878101 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.878120 kubelet[2865]: W0527 17:49:34.878115 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.878172 kubelet[2865]: E0527 17:49:34.878122 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:34.886381 kubelet[2865]: E0527 17:49:34.886360 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:34.886381 kubelet[2865]: W0527 17:49:34.886377 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:34.886459 kubelet[2865]: E0527 17:49:34.886388 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:36.073950 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2307338512.mount: Deactivated successfully. May 27 17:49:36.232600 kubelet[2865]: E0527 17:49:36.232532 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zw22" podUID="6cefb86f-fbd3-4c4c-8534-44d8ba742df1" May 27 17:49:37.045245 containerd[1565]: time="2025-05-27T17:49:37.044562870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:37.046090 containerd[1565]: time="2025-05-27T17:49:37.046063068Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 27 17:49:37.046872 containerd[1565]: time="2025-05-27T17:49:37.046844840Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:37.049416 containerd[1565]: time="2025-05-27T17:49:37.049392234Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:37.050661 containerd[1565]: time="2025-05-27T17:49:37.050612942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 2.672021964s" May 27 17:49:37.050707 containerd[1565]: time="2025-05-27T17:49:37.050662794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 27 17:49:37.052898 containerd[1565]: time="2025-05-27T17:49:37.052738853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 17:49:37.067027 containerd[1565]: time="2025-05-27T17:49:37.066999352Z" level=info msg="CreateContainer within sandbox \"79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 17:49:37.097565 containerd[1565]: time="2025-05-27T17:49:37.096557436Z" level=info msg="Container 36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:37.098241 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1516550010.mount: Deactivated successfully. May 27 17:49:37.103268 containerd[1565]: time="2025-05-27T17:49:37.103240365Z" level=info msg="CreateContainer within sandbox \"79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78\"" May 27 17:49:37.103759 containerd[1565]: time="2025-05-27T17:49:37.103659584Z" level=info msg="StartContainer for \"36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78\"" May 27 17:49:37.104526 containerd[1565]: time="2025-05-27T17:49:37.104502340Z" level=info msg="connecting to shim 36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78" address="unix:///run/containerd/s/9d8b0bd7012e6a2ff6f38f14166929a653c2867b9485fb3e5d8e1df0b467bfc8" protocol=ttrpc version=3 May 27 17:49:37.131336 systemd[1]: Started cri-containerd-36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78.scope - libcontainer container 36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78. May 27 17:49:37.185328 containerd[1565]: time="2025-05-27T17:49:37.185132758Z" level=info msg="StartContainer for \"36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78\" returns successfully" May 27 17:49:37.353991 kubelet[2865]: I0527 17:49:37.353860 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-655b47479b-tsjv7" podStartSLOduration=1.679527442 podStartE2EDuration="4.353800839s" podCreationTimestamp="2025-05-27 17:49:33 +0000 UTC" firstStartedPulling="2025-05-27 17:49:34.378153205 +0000 UTC m=+19.230031855" lastFinishedPulling="2025-05-27 17:49:37.052426602 +0000 UTC m=+21.904305252" observedRunningTime="2025-05-27 17:49:37.353087172 +0000 UTC m=+22.204965823" watchObservedRunningTime="2025-05-27 17:49:37.353800839 +0000 UTC m=+22.205679489" May 27 17:49:37.361857 kubelet[2865]: E0527 17:49:37.361834 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.361857 kubelet[2865]: W0527 17:49:37.361851 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.362138 kubelet[2865]: E0527 17:49:37.362090 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.362440 kubelet[2865]: E0527 17:49:37.362425 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.362440 kubelet[2865]: W0527 17:49:37.362437 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.362505 kubelet[2865]: E0527 17:49:37.362446 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
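The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling), the same interval containerd reported as "in 2.672021964s" a moment earlier. A quick recomputation from the values printed in the entry:

```go
// Recomputes the durations reported by pod_startup_latency_tracker above,
// using the wall-clock values and the monotonic offsets (m=+...) it printed.
package main

import (
	"fmt"
	"time"
)

func main() {
	created, _ := time.Parse(time.RFC3339Nano, "2025-05-27T17:49:33Z")
	running, _ := time.Parse(time.RFC3339Nano, "2025-05-27T17:49:37.353800839Z")

	// Monotonic offsets in seconds, as printed after "m=+".
	const firstStartedPulling = 19.230031855
	const lastFinishedPulling = 21.904305252

	e2e := running.Sub(created)
	pulling := time.Duration((lastFinishedPulling - firstStartedPulling) * float64(time.Second))

	fmt.Println("podStartE2EDuration:", e2e)         // 4.353800839s
	fmt.Println("image pulling:", pulling)           // ~2.674273397s
	fmt.Println("podStartSLOduration:", e2e-pulling) // ~1.679527442s (pull time excluded)
}
```

The recomputed SLO figure matches the logged podStartSLOduration=1.679527442 exactly, confirming that the tracker excludes image-pulling time from the SLO duration.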
Error: unexpected end of JSON input" May 27 17:49:37.362698 kubelet[2865]: E0527 17:49:37.362682 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.362698 kubelet[2865]: W0527 17:49:37.362694 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.362776 kubelet[2865]: E0527 17:49:37.362702 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.362975 kubelet[2865]: E0527 17:49:37.362942 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.362975 kubelet[2865]: W0527 17:49:37.362954 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.362975 kubelet[2865]: E0527 17:49:37.362962 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.363138 kubelet[2865]: E0527 17:49:37.363113 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.363138 kubelet[2865]: W0527 17:49:37.363124 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.363138 kubelet[2865]: E0527 17:49:37.363132 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.363290 kubelet[2865]: E0527 17:49:37.363276 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.363290 kubelet[2865]: W0527 17:49:37.363284 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.363345 kubelet[2865]: E0527 17:49:37.363291 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.363451 kubelet[2865]: E0527 17:49:37.363433 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.363451 kubelet[2865]: W0527 17:49:37.363441 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.363451 kubelet[2865]: E0527 17:49:37.363448 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:37.363599 kubelet[2865]: E0527 17:49:37.363584 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.363599 kubelet[2865]: W0527 17:49:37.363596 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.363599 kubelet[2865]: E0527 17:49:37.363603 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.363812 kubelet[2865]: E0527 17:49:37.363764 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.363812 kubelet[2865]: W0527 17:49:37.363773 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.363812 kubelet[2865]: E0527 17:49:37.363780 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.363974 kubelet[2865]: E0527 17:49:37.363901 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.363974 kubelet[2865]: W0527 17:49:37.363910 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.363974 kubelet[2865]: E0527 17:49:37.363938 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.364142 kubelet[2865]: E0527 17:49:37.364118 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.364142 kubelet[2865]: W0527 17:49:37.364130 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.364142 kubelet[2865]: E0527 17:49:37.364137 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.364332 kubelet[2865]: E0527 17:49:37.364315 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.364332 kubelet[2865]: W0527 17:49:37.364326 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.364332 kubelet[2865]: E0527 17:49:37.364333 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:37.364487 kubelet[2865]: E0527 17:49:37.364468 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.364487 kubelet[2865]: W0527 17:49:37.364478 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.364487 kubelet[2865]: E0527 17:49:37.364485 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.364634 kubelet[2865]: E0527 17:49:37.364601 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.364634 kubelet[2865]: W0527 17:49:37.364607 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.364634 kubelet[2865]: E0527 17:49:37.364613 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.364798 kubelet[2865]: E0527 17:49:37.364744 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.364798 kubelet[2865]: W0527 17:49:37.364755 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.364798 kubelet[2865]: E0527 17:49:37.364765 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.390172 kubelet[2865]: E0527 17:49:37.390142 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.390172 kubelet[2865]: W0527 17:49:37.390164 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.390272 kubelet[2865]: E0527 17:49:37.390176 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.391093 kubelet[2865]: E0527 17:49:37.391056 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.391093 kubelet[2865]: W0527 17:49:37.391066 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.391093 kubelet[2865]: E0527 17:49:37.391075 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:37.391387 kubelet[2865]: E0527 17:49:37.391360 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.391387 kubelet[2865]: W0527 17:49:37.391369 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.391387 kubelet[2865]: E0527 17:49:37.391377 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.391724 kubelet[2865]: E0527 17:49:37.391706 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.391724 kubelet[2865]: W0527 17:49:37.391718 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.391798 kubelet[2865]: E0527 17:49:37.391726 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.391972 kubelet[2865]: E0527 17:49:37.391918 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.391972 kubelet[2865]: W0527 17:49:37.391965 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.392032 kubelet[2865]: E0527 17:49:37.391982 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.392295 kubelet[2865]: E0527 17:49:37.392279 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.392295 kubelet[2865]: W0527 17:49:37.392290 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.392360 kubelet[2865]: E0527 17:49:37.392299 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.392882 kubelet[2865]: E0527 17:49:37.392861 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.393204 kubelet[2865]: W0527 17:49:37.392872 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.393204 kubelet[2865]: E0527 17:49:37.393183 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:37.393528 kubelet[2865]: E0527 17:49:37.393513 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.393528 kubelet[2865]: W0527 17:49:37.393525 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.393579 kubelet[2865]: E0527 17:49:37.393534 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.394554 kubelet[2865]: E0527 17:49:37.394536 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.394554 kubelet[2865]: W0527 17:49:37.394549 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.394623 kubelet[2865]: E0527 17:49:37.394557 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.394894 kubelet[2865]: E0527 17:49:37.394877 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.394894 kubelet[2865]: W0527 17:49:37.394889 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.395297 kubelet[2865]: E0527 17:49:37.394898 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.395297 kubelet[2865]: E0527 17:49:37.395002 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.395297 kubelet[2865]: W0527 17:49:37.395008 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.395297 kubelet[2865]: E0527 17:49:37.395016 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.395490 kubelet[2865]: E0527 17:49:37.395474 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.395490 kubelet[2865]: W0527 17:49:37.395487 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.395490 kubelet[2865]: E0527 17:49:37.395496 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:37.396579 kubelet[2865]: E0527 17:49:37.396512 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.396579 kubelet[2865]: W0527 17:49:37.396525 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.396579 kubelet[2865]: E0527 17:49:37.396534 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.397024 kubelet[2865]: E0527 17:49:37.397011 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.397061 kubelet[2865]: W0527 17:49:37.397023 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.397061 kubelet[2865]: E0527 17:49:37.397044 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.397312 kubelet[2865]: E0527 17:49:37.397302 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.397312 kubelet[2865]: W0527 17:49:37.397312 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.397432 kubelet[2865]: E0527 17:49:37.397320 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.397648 kubelet[2865]: E0527 17:49:37.397621 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.397648 kubelet[2865]: W0527 17:49:37.397634 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.397648 kubelet[2865]: E0527 17:49:37.397641 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:37.397904 kubelet[2865]: E0527 17:49:37.397885 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.397904 kubelet[2865]: W0527 17:49:37.397898 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.397980 kubelet[2865]: E0527 17:49:37.397906 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:37.399343 kubelet[2865]: E0527 17:49:37.399289 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:37.399343 kubelet[2865]: W0527 17:49:37.399300 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:37.399343 kubelet[2865]: E0527 17:49:37.399308 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.232135 kubelet[2865]: E0527 17:49:38.232093 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zw22" podUID="6cefb86f-fbd3-4c4c-8534-44d8ba742df1" May 27 17:49:38.344427 kubelet[2865]: I0527 17:49:38.344393 2865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:49:38.369413 kubelet[2865]: E0527 17:49:38.369376 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.369413 kubelet[2865]: W0527 17:49:38.369399 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.369413 kubelet[2865]: E0527 17:49:38.369418 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.369777 kubelet[2865]: E0527 17:49:38.369541 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.369777 kubelet[2865]: W0527 17:49:38.369548 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.369777 kubelet[2865]: E0527 17:49:38.369557 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.369777 kubelet[2865]: E0527 17:49:38.369666 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.369777 kubelet[2865]: W0527 17:49:38.369673 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.369777 kubelet[2865]: E0527 17:49:38.369680 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:38.369905 kubelet[2865]: E0527 17:49:38.369801 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.369905 kubelet[2865]: W0527 17:49:38.369808 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.369905 kubelet[2865]: E0527 17:49:38.369815 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.369966 kubelet[2865]: E0527 17:49:38.369917 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.369966 kubelet[2865]: W0527 17:49:38.369924 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.369966 kubelet[2865]: E0527 17:49:38.369931 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.370035 kubelet[2865]: E0527 17:49:38.370021 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.370035 kubelet[2865]: W0527 17:49:38.370033 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.370107 kubelet[2865]: E0527 17:49:38.370040 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.370172 kubelet[2865]: E0527 17:49:38.370147 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.370172 kubelet[2865]: W0527 17:49:38.370163 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.370172 kubelet[2865]: E0527 17:49:38.370170 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.370322 kubelet[2865]: E0527 17:49:38.370312 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.370322 kubelet[2865]: W0527 17:49:38.370321 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.370397 kubelet[2865]: E0527 17:49:38.370329 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:38.370542 kubelet[2865]: E0527 17:49:38.370519 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.370542 kubelet[2865]: W0527 17:49:38.370534 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.370622 kubelet[2865]: E0527 17:49:38.370545 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.370695 kubelet[2865]: E0527 17:49:38.370681 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.370695 kubelet[2865]: W0527 17:49:38.370692 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.370768 kubelet[2865]: E0527 17:49:38.370700 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.370844 kubelet[2865]: E0527 17:49:38.370822 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.370844 kubelet[2865]: W0527 17:49:38.370840 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.370897 kubelet[2865]: E0527 17:49:38.370848 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.370994 kubelet[2865]: E0527 17:49:38.370974 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.370994 kubelet[2865]: W0527 17:49:38.370987 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.371068 kubelet[2865]: E0527 17:49:38.370997 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.371144 kubelet[2865]: E0527 17:49:38.371126 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.371144 kubelet[2865]: W0527 17:49:38.371138 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.371200 kubelet[2865]: E0527 17:49:38.371145 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:38.371292 kubelet[2865]: E0527 17:49:38.371273 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.371292 kubelet[2865]: W0527 17:49:38.371286 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.371357 kubelet[2865]: E0527 17:49:38.371293 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.371452 kubelet[2865]: E0527 17:49:38.371433 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.371452 kubelet[2865]: W0527 17:49:38.371446 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.371518 kubelet[2865]: E0527 17:49:38.371456 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.401905 kubelet[2865]: E0527 17:49:38.401872 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.401905 kubelet[2865]: W0527 17:49:38.401891 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.401905 kubelet[2865]: E0527 17:49:38.401907 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.402106 kubelet[2865]: E0527 17:49:38.402078 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.402141 kubelet[2865]: W0527 17:49:38.402092 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.402141 kubelet[2865]: E0527 17:49:38.402129 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.402362 kubelet[2865]: E0527 17:49:38.402342 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.402362 kubelet[2865]: W0527 17:49:38.402354 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.402362 kubelet[2865]: E0527 17:49:38.402363 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:38.402593 kubelet[2865]: E0527 17:49:38.402575 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.402593 kubelet[2865]: W0527 17:49:38.402588 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.402672 kubelet[2865]: E0527 17:49:38.402596 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.402802 kubelet[2865]: E0527 17:49:38.402783 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.402802 kubelet[2865]: W0527 17:49:38.402796 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.402802 kubelet[2865]: E0527 17:49:38.402804 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.402938 kubelet[2865]: E0527 17:49:38.402918 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.402938 kubelet[2865]: W0527 17:49:38.402930 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.403019 kubelet[2865]: E0527 17:49:38.402938 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.403176 kubelet[2865]: E0527 17:49:38.403152 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.403176 kubelet[2865]: W0527 17:49:38.403167 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.403296 kubelet[2865]: E0527 17:49:38.403186 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.403735 kubelet[2865]: E0527 17:49:38.403677 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.403735 kubelet[2865]: W0527 17:49:38.403690 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.403735 kubelet[2865]: E0527 17:49:38.403698 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:38.403856 kubelet[2865]: E0527 17:49:38.403837 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.403856 kubelet[2865]: W0527 17:49:38.403846 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.403856 kubelet[2865]: E0527 17:49:38.403854 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.404050 kubelet[2865]: E0527 17:49:38.403975 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.404050 kubelet[2865]: W0527 17:49:38.403981 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.404050 kubelet[2865]: E0527 17:49:38.403989 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.404172 kubelet[2865]: E0527 17:49:38.404111 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.404172 kubelet[2865]: W0527 17:49:38.404118 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.404172 kubelet[2865]: E0527 17:49:38.404125 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.404399 kubelet[2865]: E0527 17:49:38.404266 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.404399 kubelet[2865]: W0527 17:49:38.404273 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.404399 kubelet[2865]: E0527 17:49:38.404281 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.404399 kubelet[2865]: E0527 17:49:38.404396 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.404520 kubelet[2865]: W0527 17:49:38.404402 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.404520 kubelet[2865]: E0527 17:49:38.404411 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:38.404588 kubelet[2865]: E0527 17:49:38.404564 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.404588 kubelet[2865]: W0527 17:49:38.404571 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.404588 kubelet[2865]: E0527 17:49:38.404579 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.404884 kubelet[2865]: E0527 17:49:38.404859 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.404884 kubelet[2865]: W0527 17:49:38.404875 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.404884 kubelet[2865]: E0527 17:49:38.404886 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.405056 kubelet[2865]: E0527 17:49:38.405039 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.405056 kubelet[2865]: W0527 17:49:38.405050 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.405112 kubelet[2865]: E0527 17:49:38.405059 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.405281 kubelet[2865]: E0527 17:49:38.405268 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.405281 kubelet[2865]: W0527 17:49:38.405280 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.405353 kubelet[2865]: E0527 17:49:38.405288 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 17:49:38.405572 kubelet[2865]: E0527 17:49:38.405551 2865 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 17:49:38.405572 kubelet[2865]: W0527 17:49:38.405565 2865 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 17:49:38.405572 kubelet[2865]: E0527 17:49:38.405574 2865 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 17:49:38.919926 containerd[1565]: time="2025-05-27T17:49:38.919536790Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:38.920922 containerd[1565]: time="2025-05-27T17:49:38.920868956Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 27 17:49:38.921176 containerd[1565]: time="2025-05-27T17:49:38.921153685Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:38.922996 containerd[1565]: time="2025-05-27T17:49:38.922948171Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:38.923381 containerd[1565]: time="2025-05-27T17:49:38.923283063Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 1.870511579s" May 27 17:49:38.923381 containerd[1565]: time="2025-05-27T17:49:38.923309523Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 27 17:49:38.927163 containerd[1565]: time="2025-05-27T17:49:38.927135714Z" level=info msg="CreateContainer within sandbox \"74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 17:49:38.934923 containerd[1565]: time="2025-05-27T17:49:38.932322375Z" level=info msg="Container cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:38.953773 containerd[1565]: time="2025-05-27T17:49:38.953739401Z" level=info msg="CreateContainer within sandbox \"74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7\"" May 27 17:49:38.954255 containerd[1565]: time="2025-05-27T17:49:38.954132220Z" level=info msg="StartContainer for \"cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7\"" May 27 17:49:38.955856 containerd[1565]: time="2025-05-27T17:49:38.955391813Z" level=info msg="connecting to shim cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7" address="unix:///run/containerd/s/45e12a65ccb6700c521ca9ba5f05b34ba305e3a09fc648a07aa60fc2ca3276b0" protocol=ttrpc version=3 May 27 17:49:38.975357 systemd[1]: Started cri-containerd-cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7.scope - libcontainer container cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7. 
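The repeated kubelet errors above (driver-call.go and plugins.go) are all one failure mode: on every FlexVolume probe the kubelet tries to execute /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init, the binary does not exist yet (it is the driver the flexvol-driver init container started just above is meant to install), so the captured output is empty and decoding it as JSON fails. A minimal Go sketch, not the kubelet's actual code, showing why empty output yields exactly "unexpected end of JSON input":

```go
package main

import (
	"encoding/json"
	"fmt"
)

// DriverStatus stands in for the JSON status object a FlexVolume driver is
// expected to print on stdout; the field names here are only illustrative.
type DriverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

func main() {
	// The driver executable was never found, so the captured output is empty.
	output := []byte("")

	var st DriverStatus
	if err := json.Unmarshal(output, &st); err != nil {
		// Prints: unmarshal of driver output failed: unexpected end of JSON input
		fmt.Println("unmarshal of driver output failed:", err)
	}
}
```

The JSON error is therefore a symptom; the actionable message is the driver-call.go:149 warning about the missing executable, and the probe is rerun on every plugin rescan, which is why the same three lines repeat.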
May 27 17:49:39.004963 containerd[1565]: time="2025-05-27T17:49:39.004925159Z" level=info msg="StartContainer for \"cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7\" returns successfully" May 27 17:49:39.013782 systemd[1]: cri-containerd-cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7.scope: Deactivated successfully. May 27 17:49:39.031066 containerd[1565]: time="2025-05-27T17:49:39.031000053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7\" id:\"cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7\" pid:3575 exited_at:{seconds:1748368179 nanos:16748599}" May 27 17:49:39.045431 containerd[1565]: time="2025-05-27T17:49:39.045393321Z" level=info msg="received exit event container_id:\"cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7\" id:\"cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7\" pid:3575 exited_at:{seconds:1748368179 nanos:16748599}" May 27 17:49:39.065856 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7-rootfs.mount: Deactivated successfully. May 27 17:49:39.349341 containerd[1565]: time="2025-05-27T17:49:39.349191822Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 17:49:40.231946 kubelet[2865]: E0527 17:49:40.231911 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zw22" podUID="6cefb86f-fbd3-4c4c-8534-44d8ba742df1" May 27 17:49:42.231732 kubelet[2865]: E0527 17:49:42.231638 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-5zw22" podUID="6cefb86f-fbd3-4c4c-8534-44d8ba742df1" May 27 17:49:43.438468 containerd[1565]: time="2025-05-27T17:49:43.438420752Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:43.443797 containerd[1565]: time="2025-05-27T17:49:43.439384796Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 27 17:49:43.443797 containerd[1565]: time="2025-05-27T17:49:43.440610567Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:43.443797 containerd[1565]: time="2025-05-27T17:49:43.443610598Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 4.094347965s" May 27 17:49:43.443797 containerd[1565]: time="2025-05-27T17:49:43.443631968Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 27 17:49:43.445129 containerd[1565]: time="2025-05-27T17:49:43.445105109Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:43.447906 containerd[1565]: time="2025-05-27T17:49:43.447880884Z" level=info msg="CreateContainer within sandbox \"74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 17:49:43.457958 containerd[1565]: time="2025-05-27T17:49:43.457612220Z" level=info msg="Container 3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:43.468761 containerd[1565]: time="2025-05-27T17:49:43.468734675Z" level=info msg="CreateContainer within sandbox \"74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241\"" May 27 17:49:43.469292 containerd[1565]: time="2025-05-27T17:49:43.469271884Z" level=info msg="StartContainer for \"3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241\"" May 27 17:49:43.470568 containerd[1565]: time="2025-05-27T17:49:43.470542188Z" level=info msg="connecting to shim 3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241" address="unix:///run/containerd/s/45e12a65ccb6700c521ca9ba5f05b34ba305e3a09fc648a07aa60fc2ca3276b0" protocol=ttrpc version=3 May 27 17:49:43.489334 systemd[1]: Started cri-containerd-3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241.scope - libcontainer container 3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241. May 27 17:49:43.519094 containerd[1565]: time="2025-05-27T17:49:43.519062525Z" level=info msg="StartContainer for \"3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241\" returns successfully" May 27 17:49:43.848947 systemd[1]: cri-containerd-3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241.scope: Deactivated successfully. May 27 17:49:43.849146 systemd[1]: cri-containerd-3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241.scope: Consumed 303ms CPU time, 166.6M memory peak, 11.6M read from disk, 170.9M written to disk. May 27 17:49:43.851362 containerd[1565]: time="2025-05-27T17:49:43.851287569Z" level=info msg="received exit event container_id:\"3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241\" id:\"3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241\" pid:3630 exited_at:{seconds:1748368183 nanos:850745490}" May 27 17:49:43.854060 containerd[1565]: time="2025-05-27T17:49:43.854035782Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241\" id:\"3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241\" pid:3630 exited_at:{seconds:1748368183 nanos:850745490}" May 27 17:49:43.880315 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241-rootfs.mount: Deactivated successfully. May 27 17:49:43.916737 kubelet[2865]: I0527 17:49:43.916631 2865 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 17:49:43.971161 systemd[1]: Created slice kubepods-burstable-podb12086d3_36c9_4de5_998b_0494bb3bd5ec.slice - libcontainer container kubepods-burstable-podb12086d3_36c9_4de5_998b_0494bb3bd5ec.slice. 
May 27 17:49:43.977320 systemd[1]: Created slice kubepods-besteffort-pod27ab50e6_a59e_432e_b7e7_3f9bf2c59af6.slice - libcontainer container kubepods-besteffort-pod27ab50e6_a59e_432e_b7e7_3f9bf2c59af6.slice. May 27 17:49:43.988464 systemd[1]: Created slice kubepods-burstable-pod1513fc78_c3ee_47f4_9d52_8cedb71687f3.slice - libcontainer container kubepods-burstable-pod1513fc78_c3ee_47f4_9d52_8cedb71687f3.slice. May 27 17:49:43.997361 systemd[1]: Created slice kubepods-besteffort-podfad65c3f_de87_4e29_9edf_763061c577ea.slice - libcontainer container kubepods-besteffort-podfad65c3f_de87_4e29_9edf_763061c577ea.slice. May 27 17:49:44.004371 systemd[1]: Created slice kubepods-besteffort-pod901c1e3b_28f7_45f0_9531_bd718bf1ff98.slice - libcontainer container kubepods-besteffort-pod901c1e3b_28f7_45f0_9531_bd718bf1ff98.slice. May 27 17:49:44.011492 systemd[1]: Created slice kubepods-besteffort-pod8b985871_4559_42e8_9cf3_7b26e6ec2b9f.slice - libcontainer container kubepods-besteffort-pod8b985871_4559_42e8_9cf3_7b26e6ec2b9f.slice. May 27 17:49:44.017534 systemd[1]: Created slice kubepods-besteffort-podbb9f0f7c_2db0_4832_a710_1a0c49d448f1.slice - libcontainer container kubepods-besteffort-podbb9f0f7c_2db0_4832_a710_1a0c49d448f1.slice. May 27 17:49:44.039430 kubelet[2865]: I0527 17:49:44.039374 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-ca-bundle\") pod \"whisker-6bd7fbb46-ck2w7\" (UID: \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\") " pod="calico-system/whisker-6bd7fbb46-ck2w7" May 27 17:49:44.039430 kubelet[2865]: I0527 17:49:44.039410 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz8mz\" (UniqueName: \"kubernetes.io/projected/1513fc78-c3ee-47f4-9d52-8cedb71687f3-kube-api-access-hz8mz\") pod \"coredns-674b8bbfcf-6fkcq\" (UID: \"1513fc78-c3ee-47f4-9d52-8cedb71687f3\") " pod="kube-system/coredns-674b8bbfcf-6fkcq" May 27 17:49:44.039430 kubelet[2865]: I0527 17:49:44.039424 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fad65c3f-de87-4e29-9edf-763061c577ea-calico-apiserver-certs\") pod \"calico-apiserver-9f4d94d96-jn9qm\" (UID: \"fad65c3f-de87-4e29-9edf-763061c577ea\") " pod="calico-apiserver/calico-apiserver-9f4d94d96-jn9qm" May 27 17:49:44.039430 kubelet[2865]: I0527 17:49:44.039437 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td6ct\" (UniqueName: \"kubernetes.io/projected/fad65c3f-de87-4e29-9edf-763061c577ea-kube-api-access-td6ct\") pod \"calico-apiserver-9f4d94d96-jn9qm\" (UID: \"fad65c3f-de87-4e29-9edf-763061c577ea\") " pod="calico-apiserver/calico-apiserver-9f4d94d96-jn9qm" May 27 17:49:44.040647 kubelet[2865]: I0527 17:49:44.039455 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzsz\" (UniqueName: \"kubernetes.io/projected/901c1e3b-28f7-45f0-9531-bd718bf1ff98-kube-api-access-rtzsz\") pod \"whisker-6bd7fbb46-ck2w7\" (UID: \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\") " pod="calico-system/whisker-6bd7fbb46-ck2w7" May 27 17:49:44.040871 kubelet[2865]: I0527 17:49:44.040674 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/8b985871-4559-42e8-9cf3-7b26e6ec2b9f-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-fq45k\" (UID: \"8b985871-4559-42e8-9cf3-7b26e6ec2b9f\") " pod="calico-system/goldmane-78d55f7ddc-fq45k" May 27 17:49:44.040871 kubelet[2865]: I0527 17:49:44.040732 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b12086d3-36c9-4de5-998b-0494bb3bd5ec-config-volume\") pod \"coredns-674b8bbfcf-qsxzp\" (UID: \"b12086d3-36c9-4de5-998b-0494bb3bd5ec\") " pod="kube-system/coredns-674b8bbfcf-qsxzp" May 27 17:49:44.040871 kubelet[2865]: I0527 17:49:44.040753 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ab50e6-a59e-432e-b7e7-3f9bf2c59af6-tigera-ca-bundle\") pod \"calico-kube-controllers-cb6f4656f-6r5lk\" (UID: \"27ab50e6-a59e-432e-b7e7-3f9bf2c59af6\") " pod="calico-system/calico-kube-controllers-cb6f4656f-6r5lk" May 27 17:49:44.040871 kubelet[2865]: I0527 17:49:44.040856 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmb5r\" (UniqueName: \"kubernetes.io/projected/27ab50e6-a59e-432e-b7e7-3f9bf2c59af6-kube-api-access-jmb5r\") pod \"calico-kube-controllers-cb6f4656f-6r5lk\" (UID: \"27ab50e6-a59e-432e-b7e7-3f9bf2c59af6\") " pod="calico-system/calico-kube-controllers-cb6f4656f-6r5lk" May 27 17:49:44.040988 kubelet[2865]: I0527 17:49:44.040874 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1513fc78-c3ee-47f4-9d52-8cedb71687f3-config-volume\") pod \"coredns-674b8bbfcf-6fkcq\" (UID: \"1513fc78-c3ee-47f4-9d52-8cedb71687f3\") " pod="kube-system/coredns-674b8bbfcf-6fkcq" May 27 17:49:44.041009 kubelet[2865]: I0527 17:49:44.040990 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b985871-4559-42e8-9cf3-7b26e6ec2b9f-config\") pod \"goldmane-78d55f7ddc-fq45k\" (UID: \"8b985871-4559-42e8-9cf3-7b26e6ec2b9f\") " pod="calico-system/goldmane-78d55f7ddc-fq45k" May 27 17:49:44.041144 kubelet[2865]: I0527 17:49:44.041024 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx2l8\" (UniqueName: \"kubernetes.io/projected/bb9f0f7c-2db0-4832-a710-1a0c49d448f1-kube-api-access-rx2l8\") pod \"calico-apiserver-9f4d94d96-v64xk\" (UID: \"bb9f0f7c-2db0-4832-a710-1a0c49d448f1\") " pod="calico-apiserver/calico-apiserver-9f4d94d96-v64xk" May 27 17:49:44.041170 kubelet[2865]: I0527 17:49:44.041156 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpm55\" (UniqueName: \"kubernetes.io/projected/b12086d3-36c9-4de5-998b-0494bb3bd5ec-kube-api-access-jpm55\") pod \"coredns-674b8bbfcf-qsxzp\" (UID: \"b12086d3-36c9-4de5-998b-0494bb3bd5ec\") " pod="kube-system/coredns-674b8bbfcf-qsxzp" May 27 17:49:44.041833 kubelet[2865]: I0527 17:49:44.041176 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8b985871-4559-42e8-9cf3-7b26e6ec2b9f-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-fq45k\" (UID: \"8b985871-4559-42e8-9cf3-7b26e6ec2b9f\") " pod="calico-system/goldmane-78d55f7ddc-fq45k" May 27 17:49:44.042647 
kubelet[2865]: I0527 17:49:44.042602 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bb9f0f7c-2db0-4832-a710-1a0c49d448f1-calico-apiserver-certs\") pod \"calico-apiserver-9f4d94d96-v64xk\" (UID: \"bb9f0f7c-2db0-4832-a710-1a0c49d448f1\") " pod="calico-apiserver/calico-apiserver-9f4d94d96-v64xk" May 27 17:49:44.042718 kubelet[2865]: I0527 17:49:44.042678 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-backend-key-pair\") pod \"whisker-6bd7fbb46-ck2w7\" (UID: \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\") " pod="calico-system/whisker-6bd7fbb46-ck2w7" May 27 17:49:44.042789 kubelet[2865]: I0527 17:49:44.042744 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs4qx\" (UniqueName: \"kubernetes.io/projected/8b985871-4559-42e8-9cf3-7b26e6ec2b9f-kube-api-access-xs4qx\") pod \"goldmane-78d55f7ddc-fq45k\" (UID: \"8b985871-4559-42e8-9cf3-7b26e6ec2b9f\") " pod="calico-system/goldmane-78d55f7ddc-fq45k" May 27 17:49:44.236848 systemd[1]: Created slice kubepods-besteffort-pod6cefb86f_fbd3_4c4c_8534_44d8ba742df1.slice - libcontainer container kubepods-besteffort-pod6cefb86f_fbd3_4c4c_8534_44d8ba742df1.slice. May 27 17:49:44.245648 containerd[1565]: time="2025-05-27T17:49:44.245600300Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zw22,Uid:6cefb86f-fbd3-4c4c-8534-44d8ba742df1,Namespace:calico-system,Attempt:0,}" May 27 17:49:44.286750 containerd[1565]: time="2025-05-27T17:49:44.286707259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cb6f4656f-6r5lk,Uid:27ab50e6-a59e-432e-b7e7-3f9bf2c59af6,Namespace:calico-system,Attempt:0,}" May 27 17:49:44.287020 containerd[1565]: time="2025-05-27T17:49:44.287001838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsxzp,Uid:b12086d3-36c9-4de5-998b-0494bb3bd5ec,Namespace:kube-system,Attempt:0,}" May 27 17:49:44.295141 containerd[1565]: time="2025-05-27T17:49:44.295037801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6fkcq,Uid:1513fc78-c3ee-47f4-9d52-8cedb71687f3,Namespace:kube-system,Attempt:0,}" May 27 17:49:44.301238 containerd[1565]: time="2025-05-27T17:49:44.301120632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-jn9qm,Uid:fad65c3f-de87-4e29-9edf-763061c577ea,Namespace:calico-apiserver,Attempt:0,}" May 27 17:49:44.307762 containerd[1565]: time="2025-05-27T17:49:44.307720825Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6bd7fbb46-ck2w7,Uid:901c1e3b-28f7-45f0-9531-bd718bf1ff98,Namespace:calico-system,Attempt:0,}" May 27 17:49:44.315757 containerd[1565]: time="2025-05-27T17:49:44.315700605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-fq45k,Uid:8b985871-4559-42e8-9cf3-7b26e6ec2b9f,Namespace:calico-system,Attempt:0,}" May 27 17:49:44.324664 containerd[1565]: time="2025-05-27T17:49:44.324583035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-v64xk,Uid:bb9f0f7c-2db0-4832-a710-1a0c49d448f1,Namespace:calico-apiserver,Attempt:0,}" May 27 17:49:44.371692 containerd[1565]: time="2025-05-27T17:49:44.371655144Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" 
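The reconciler_common entries above each log a volume UniqueName of the form <plugin>/<pod-UID>-<volume-name>, which makes it easy to see which plugin (configmap, secret, projected) backs each mount once the node reports ready. A small Go helper, purely for reading these log lines and not part of Kubernetes, splitting a few of the names seen above:

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// A few UniqueName values copied from the VerifyControllerAttachedVolume
	// entries above; the full set in the log is longer.
	uniqueNames := []string{
		"kubernetes.io/configmap/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-ca-bundle",
		"kubernetes.io/secret/fad65c3f-de87-4e29-9edf-763061c577ea-calico-apiserver-certs",
		"kubernetes.io/projected/1513fc78-c3ee-47f4-9d52-8cedb71687f3-kube-api-access-hz8mz",
	}

	// Each name reads "<plugin>/<pod-UID>-<volume-name>"; the plugin prefix itself
	// contains a slash ("kubernetes.io/configmap"), so split on the last slash.
	for _, n := range uniqueNames {
		i := strings.LastIndex(n, "/")
		fmt.Printf("plugin=%-25s pod-uid+volume=%s\n", n[:i], n[i+1:])
	}
}
```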
May 27 17:49:44.487803 containerd[1565]: time="2025-05-27T17:49:44.487031367Z" level=error msg="Failed to destroy network for sandbox \"ae7d23e452d35d8311a491c73fc79b5e74d5f4573da3d1bac65196e2a14d5e06\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.489271 containerd[1565]: time="2025-05-27T17:49:44.488982899Z" level=error msg="Failed to destroy network for sandbox \"ebecf9f9ab944f56f85a876fee4f25348002f0fd2c6453a6e0b34eb5fa5435f1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.490231 systemd[1]: run-netns-cni\x2d2952d362\x2d3e7d\x2dad23\x2d0cc6\x2dd8855af188aa.mount: Deactivated successfully. May 27 17:49:44.493665 systemd[1]: run-netns-cni\x2dc1dd00ca\x2de92f\x2d3830\x2df801\x2d16f8dffc9ccd.mount: Deactivated successfully. May 27 17:49:44.498003 containerd[1565]: time="2025-05-27T17:49:44.497860440Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zw22,Uid:6cefb86f-fbd3-4c4c-8534-44d8ba742df1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae7d23e452d35d8311a491c73fc79b5e74d5f4573da3d1bac65196e2a14d5e06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.498474 kubelet[2865]: E0527 17:49:44.498432 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae7d23e452d35d8311a491c73fc79b5e74d5f4573da3d1bac65196e2a14d5e06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.498591 kubelet[2865]: E0527 17:49:44.498572 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae7d23e452d35d8311a491c73fc79b5e74d5f4573da3d1bac65196e2a14d5e06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5zw22" May 27 17:49:44.498815 kubelet[2865]: E0527 17:49:44.498663 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae7d23e452d35d8311a491c73fc79b5e74d5f4573da3d1bac65196e2a14d5e06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-5zw22" May 27 17:49:44.498815 kubelet[2865]: E0527 17:49:44.498782 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-5zw22_calico-system(6cefb86f-fbd3-4c4c-8534-44d8ba742df1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-5zw22_calico-system(6cefb86f-fbd3-4c4c-8534-44d8ba742df1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"ae7d23e452d35d8311a491c73fc79b5e74d5f4573da3d1bac65196e2a14d5e06\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-5zw22" podUID="6cefb86f-fbd3-4c4c-8534-44d8ba742df1" May 27 17:49:44.500503 containerd[1565]: time="2025-05-27T17:49:44.500465017Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsxzp,Uid:b12086d3-36c9-4de5-998b-0494bb3bd5ec,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebecf9f9ab944f56f85a876fee4f25348002f0fd2c6453a6e0b34eb5fa5435f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.500791 kubelet[2865]: E0527 17:49:44.500686 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebecf9f9ab944f56f85a876fee4f25348002f0fd2c6453a6e0b34eb5fa5435f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.500791 kubelet[2865]: E0527 17:49:44.500712 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebecf9f9ab944f56f85a876fee4f25348002f0fd2c6453a6e0b34eb5fa5435f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qsxzp" May 27 17:49:44.500791 kubelet[2865]: E0527 17:49:44.500727 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebecf9f9ab944f56f85a876fee4f25348002f0fd2c6453a6e0b34eb5fa5435f1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-qsxzp" May 27 17:49:44.500878 kubelet[2865]: E0527 17:49:44.500753 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-qsxzp_kube-system(b12086d3-36c9-4de5-998b-0494bb3bd5ec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-qsxzp_kube-system(b12086d3-36c9-4de5-998b-0494bb3bd5ec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ebecf9f9ab944f56f85a876fee4f25348002f0fd2c6453a6e0b34eb5fa5435f1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-qsxzp" podUID="b12086d3-36c9-4de5-998b-0494bb3bd5ec" May 27 17:49:44.503916 containerd[1565]: time="2025-05-27T17:49:44.503592968Z" level=error msg="Failed to destroy network for sandbox \"6dd29227c99714c1e638ae94b876bd3b57ccfbc96ed27859cbeb7dd14b1706b0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 
17:49:44.505834 systemd[1]: run-netns-cni\x2dd8ba4e2b\x2dafa5\x2dd229\x2db5e4\x2de8f825436197.mount: Deactivated successfully. May 27 17:49:44.507899 containerd[1565]: time="2025-05-27T17:49:44.507409931Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6fkcq,Uid:1513fc78-c3ee-47f4-9d52-8cedb71687f3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dd29227c99714c1e638ae94b876bd3b57ccfbc96ed27859cbeb7dd14b1706b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.507973 kubelet[2865]: E0527 17:49:44.507601 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dd29227c99714c1e638ae94b876bd3b57ccfbc96ed27859cbeb7dd14b1706b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.507973 kubelet[2865]: E0527 17:49:44.507643 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dd29227c99714c1e638ae94b876bd3b57ccfbc96ed27859cbeb7dd14b1706b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-6fkcq" May 27 17:49:44.507973 kubelet[2865]: E0527 17:49:44.507658 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6dd29227c99714c1e638ae94b876bd3b57ccfbc96ed27859cbeb7dd14b1706b0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-6fkcq" May 27 17:49:44.508037 kubelet[2865]: E0527 17:49:44.507704 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-6fkcq_kube-system(1513fc78-c3ee-47f4-9d52-8cedb71687f3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-6fkcq_kube-system(1513fc78-c3ee-47f4-9d52-8cedb71687f3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6dd29227c99714c1e638ae94b876bd3b57ccfbc96ed27859cbeb7dd14b1706b0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-6fkcq" podUID="1513fc78-c3ee-47f4-9d52-8cedb71687f3" May 27 17:49:44.524070 containerd[1565]: time="2025-05-27T17:49:44.523957545Z" level=error msg="Failed to destroy network for sandbox \"9b43ea5cec71cd5e32b0fd6a7ada46eb4c8c255013ef0f101a6e98e3b8c2e6ac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.525648 systemd[1]: run-netns-cni\x2dfd3f21c9\x2db274\x2dfa67\x2d2a8f\x2db0216a8b4ff4.mount: Deactivated successfully. 
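Every RunPodSandbox attempt above, for both the add call and the cleanup delete, fails on the same precondition: the Calico CNI plugin stats /var/lib/calico/nodename, a file the calico/node container writes once it is running (as the hint text in the errors says), and that file is not there yet because ghcr.io/flatcar/calico/node:v3.30.0 was only requested in the PullImage line at 17:49:44 above. A minimal Go sketch of that check, illustrative only and not Calico's actual code, with the hint text copied from the messages above:

```go
package main

import (
	"errors"
	"fmt"
	"os"
)

const nodenameFile = "/var/lib/calico/nodename"

// checkNodename mimics the precondition seen in the errors above: the CNI
// plugin refuses to proceed until calico/node has written its nodename file.
func checkNodename() error {
	if _, err := os.Stat(nodenameFile); err != nil {
		return fmt.Errorf("%w: check that the calico/node container is running and has mounted /var/lib/calico/", err)
	}
	return nil
}

func main() {
	if err := checkNodename(); err != nil {
		// On a node where calico/node has not started yet this prints roughly:
		// stat /var/lib/calico/nodename: no such file or directory: check that the calico/node ...
		fmt.Println(err)
		if errors.Is(err, os.ErrNotExist) {
			fmt.Println("calico/node is not ready on this node yet")
		}
	}
}
```

The kubelet keeps retrying the affected pods, which is why each of them reports the same CreatePodSandboxError and why the csi-node-driver-5zw22 pod shows the identical "Error syncing pod, skipping" message at 17:49:38, 17:49:40 and 17:49:42 above.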
May 27 17:49:44.535018 containerd[1565]: time="2025-05-27T17:49:44.534971552Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-v64xk,Uid:bb9f0f7c-2db0-4832-a710-1a0c49d448f1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b43ea5cec71cd5e32b0fd6a7ada46eb4c8c255013ef0f101a6e98e3b8c2e6ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.535362 kubelet[2865]: E0527 17:49:44.535174 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b43ea5cec71cd5e32b0fd6a7ada46eb4c8c255013ef0f101a6e98e3b8c2e6ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.535362 kubelet[2865]: E0527 17:49:44.535237 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b43ea5cec71cd5e32b0fd6a7ada46eb4c8c255013ef0f101a6e98e3b8c2e6ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f4d94d96-v64xk" May 27 17:49:44.535362 kubelet[2865]: E0527 17:49:44.535256 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b43ea5cec71cd5e32b0fd6a7ada46eb4c8c255013ef0f101a6e98e3b8c2e6ac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f4d94d96-v64xk" May 27 17:49:44.536686 kubelet[2865]: E0527 17:49:44.535871 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9f4d94d96-v64xk_calico-apiserver(bb9f0f7c-2db0-4832-a710-1a0c49d448f1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9f4d94d96-v64xk_calico-apiserver(bb9f0f7c-2db0-4832-a710-1a0c49d448f1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b43ea5cec71cd5e32b0fd6a7ada46eb4c8c255013ef0f101a6e98e3b8c2e6ac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9f4d94d96-v64xk" podUID="bb9f0f7c-2db0-4832-a710-1a0c49d448f1" May 27 17:49:44.538872 containerd[1565]: time="2025-05-27T17:49:44.538841093Z" level=error msg="Failed to destroy network for sandbox \"097d032f6f0cf8b8d2438cb8f315a420063f7e80826c2f9d4c694747601183bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.540083 containerd[1565]: time="2025-05-27T17:49:44.540055914Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-jn9qm,Uid:fad65c3f-de87-4e29-9edf-763061c577ea,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"097d032f6f0cf8b8d2438cb8f315a420063f7e80826c2f9d4c694747601183bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.540341 kubelet[2865]: E0527 17:49:44.540200 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"097d032f6f0cf8b8d2438cb8f315a420063f7e80826c2f9d4c694747601183bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.540341 kubelet[2865]: E0527 17:49:44.540248 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"097d032f6f0cf8b8d2438cb8f315a420063f7e80826c2f9d4c694747601183bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f4d94d96-jn9qm" May 27 17:49:44.540341 kubelet[2865]: E0527 17:49:44.540263 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"097d032f6f0cf8b8d2438cb8f315a420063f7e80826c2f9d4c694747601183bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9f4d94d96-jn9qm" May 27 17:49:44.540473 kubelet[2865]: E0527 17:49:44.540303 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9f4d94d96-jn9qm_calico-apiserver(fad65c3f-de87-4e29-9edf-763061c577ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9f4d94d96-jn9qm_calico-apiserver(fad65c3f-de87-4e29-9edf-763061c577ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"097d032f6f0cf8b8d2438cb8f315a420063f7e80826c2f9d4c694747601183bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9f4d94d96-jn9qm" podUID="fad65c3f-de87-4e29-9edf-763061c577ea" May 27 17:49:44.541716 containerd[1565]: time="2025-05-27T17:49:44.541198050Z" level=error msg="Failed to destroy network for sandbox \"140363783045de48d1c9404b48971adaa6778c4eb6c0d34765b582623dee5876\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.542572 containerd[1565]: time="2025-05-27T17:49:44.542540719Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-fq45k,Uid:8b985871-4559-42e8-9cf3-7b26e6ec2b9f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"140363783045de48d1c9404b48971adaa6778c4eb6c0d34765b582623dee5876\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 27 17:49:44.542867 kubelet[2865]: E0527 17:49:44.542735 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"140363783045de48d1c9404b48971adaa6778c4eb6c0d34765b582623dee5876\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.542867 kubelet[2865]: E0527 17:49:44.542763 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"140363783045de48d1c9404b48971adaa6778c4eb6c0d34765b582623dee5876\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-fq45k" May 27 17:49:44.542867 kubelet[2865]: E0527 17:49:44.542777 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"140363783045de48d1c9404b48971adaa6778c4eb6c0d34765b582623dee5876\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-fq45k" May 27 17:49:44.542972 kubelet[2865]: E0527 17:49:44.542811 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-fq45k_calico-system(8b985871-4559-42e8-9cf3-7b26e6ec2b9f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-fq45k_calico-system(8b985871-4559-42e8-9cf3-7b26e6ec2b9f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"140363783045de48d1c9404b48971adaa6778c4eb6c0d34765b582623dee5876\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:49:44.544085 containerd[1565]: time="2025-05-27T17:49:44.543324006Z" level=error msg="Failed to destroy network for sandbox \"d52577df27b4c0592dc262f686a1876663b31b8484316518d84e0ef68a9d5ce9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.544606 containerd[1565]: time="2025-05-27T17:49:44.544582568Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6bd7fbb46-ck2w7,Uid:901c1e3b-28f7-45f0-9531-bd718bf1ff98,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d52577df27b4c0592dc262f686a1876663b31b8484316518d84e0ef68a9d5ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.545045 kubelet[2865]: E0527 17:49:44.545003 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d52577df27b4c0592dc262f686a1876663b31b8484316518d84e0ef68a9d5ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.545045 kubelet[2865]: E0527 17:49:44.545035 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d52577df27b4c0592dc262f686a1876663b31b8484316518d84e0ef68a9d5ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6bd7fbb46-ck2w7" May 27 17:49:44.545103 kubelet[2865]: E0527 17:49:44.545047 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d52577df27b4c0592dc262f686a1876663b31b8484316518d84e0ef68a9d5ce9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6bd7fbb46-ck2w7" May 27 17:49:44.545212 kubelet[2865]: E0527 17:49:44.545188 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6bd7fbb46-ck2w7_calico-system(901c1e3b-28f7-45f0-9531-bd718bf1ff98)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6bd7fbb46-ck2w7_calico-system(901c1e3b-28f7-45f0-9531-bd718bf1ff98)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d52577df27b4c0592dc262f686a1876663b31b8484316518d84e0ef68a9d5ce9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6bd7fbb46-ck2w7" podUID="901c1e3b-28f7-45f0-9531-bd718bf1ff98" May 27 17:49:44.545382 containerd[1565]: time="2025-05-27T17:49:44.545365114Z" level=error msg="Failed to destroy network for sandbox \"a9f2f7c54f8b78acfdaa6b3d13af779670fd197f53ccf78da5d567e3a0753933\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.546552 containerd[1565]: time="2025-05-27T17:49:44.546530683Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cb6f4656f-6r5lk,Uid:27ab50e6-a59e-432e-b7e7-3f9bf2c59af6,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9f2f7c54f8b78acfdaa6b3d13af779670fd197f53ccf78da5d567e3a0753933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.546842 kubelet[2865]: E0527 17:49:44.546826 2865 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9f2f7c54f8b78acfdaa6b3d13af779670fd197f53ccf78da5d567e3a0753933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 17:49:44.546924 kubelet[2865]: E0527 17:49:44.546910 2865 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"a9f2f7c54f8b78acfdaa6b3d13af779670fd197f53ccf78da5d567e3a0753933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cb6f4656f-6r5lk" May 27 17:49:44.547013 kubelet[2865]: E0527 17:49:44.546991 2865 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a9f2f7c54f8b78acfdaa6b3d13af779670fd197f53ccf78da5d567e3a0753933\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cb6f4656f-6r5lk" May 27 17:49:44.547057 kubelet[2865]: E0527 17:49:44.547031 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cb6f4656f-6r5lk_calico-system(27ab50e6-a59e-432e-b7e7-3f9bf2c59af6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cb6f4656f-6r5lk_calico-system(27ab50e6-a59e-432e-b7e7-3f9bf2c59af6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a9f2f7c54f8b78acfdaa6b3d13af779670fd197f53ccf78da5d567e3a0753933\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cb6f4656f-6r5lk" podUID="27ab50e6-a59e-432e-b7e7-3f9bf2c59af6" May 27 17:49:45.456982 systemd[1]: run-netns-cni\x2d7fbdb52d\x2dc840\x2d3ec2\x2d6174\x2dcf52db599f58.mount: Deactivated successfully. May 27 17:49:45.457058 systemd[1]: run-netns-cni\x2dc1d009ad\x2d11c7\x2dd724\x2d4183\x2d9a1b416e0eac.mount: Deactivated successfully. May 27 17:49:45.457102 systemd[1]: run-netns-cni\x2da44328af\x2d2dbc\x2d4039\x2d3c56\x2d0790aaff5af3.mount: Deactivated successfully. May 27 17:49:45.457140 systemd[1]: run-netns-cni\x2d7c3d314c\x2db4cd\x2d0b39\x2d2a5f\x2d584134b1fa00.mount: Deactivated successfully. May 27 17:49:51.440848 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1340342861.mount: Deactivated successfully. 
May 27 17:49:51.554131 containerd[1565]: time="2025-05-27T17:49:51.551447244Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 27 17:49:51.631728 containerd[1565]: time="2025-05-27T17:49:51.631650127Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:51.666860 containerd[1565]: time="2025-05-27T17:49:51.666806860Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:51.668474 containerd[1565]: time="2025-05-27T17:49:51.668421349Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:51.668871 containerd[1565]: time="2025-05-27T17:49:51.668724744Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 7.296635942s" May 27 17:49:51.668871 containerd[1565]: time="2025-05-27T17:49:51.668753387Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 27 17:49:51.719284 containerd[1565]: time="2025-05-27T17:49:51.719180553Z" level=info msg="CreateContainer within sandbox \"74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 17:49:51.772612 containerd[1565]: time="2025-05-27T17:49:51.772566541Z" level=info msg="Container 70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:51.776393 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1242342146.mount: Deactivated successfully. May 27 17:49:51.788725 containerd[1565]: time="2025-05-27T17:49:51.788641010Z" level=info msg="CreateContainer within sandbox \"74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\"" May 27 17:49:51.789300 containerd[1565]: time="2025-05-27T17:49:51.789206544Z" level=info msg="StartContainer for \"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\"" May 27 17:49:51.801061 containerd[1565]: time="2025-05-27T17:49:51.800998926Z" level=info msg="connecting to shim 70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0" address="unix:///run/containerd/s/45e12a65ccb6700c521ca9ba5f05b34ba305e3a09fc648a07aa60fc2ca3276b0" protocol=ttrpc version=3 May 27 17:49:51.900384 systemd[1]: Started cri-containerd-70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0.scope - libcontainer container 70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0. May 27 17:49:51.941346 containerd[1565]: time="2025-05-27T17:49:51.941318617Z" level=info msg="StartContainer for \"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" returns successfully" May 27 17:49:52.029071 kernel: wireguard: WireGuard 1.0.0 loaded. 
See www.wireguard.com for information. May 27 17:49:52.029906 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 27 17:49:52.301904 kubelet[2865]: I0527 17:49:52.301858 2865 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtzsz\" (UniqueName: \"kubernetes.io/projected/901c1e3b-28f7-45f0-9531-bd718bf1ff98-kube-api-access-rtzsz\") pod \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\" (UID: \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\") " May 27 17:49:52.301904 kubelet[2865]: I0527 17:49:52.301898 2865 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-backend-key-pair\") pod \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\" (UID: \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\") " May 27 17:49:52.303891 kubelet[2865]: I0527 17:49:52.301933 2865 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-ca-bundle\") pod \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\" (UID: \"901c1e3b-28f7-45f0-9531-bd718bf1ff98\") " May 27 17:49:52.303891 kubelet[2865]: I0527 17:49:52.303595 2865 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "901c1e3b-28f7-45f0-9531-bd718bf1ff98" (UID: "901c1e3b-28f7-45f0-9531-bd718bf1ff98"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 17:49:52.309782 kubelet[2865]: I0527 17:49:52.309764 2865 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "901c1e3b-28f7-45f0-9531-bd718bf1ff98" (UID: "901c1e3b-28f7-45f0-9531-bd718bf1ff98"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 17:49:52.310618 kubelet[2865]: I0527 17:49:52.310576 2865 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901c1e3b-28f7-45f0-9531-bd718bf1ff98-kube-api-access-rtzsz" (OuterVolumeSpecName: "kube-api-access-rtzsz") pod "901c1e3b-28f7-45f0-9531-bd718bf1ff98" (UID: "901c1e3b-28f7-45f0-9531-bd718bf1ff98"). InnerVolumeSpecName "kube-api-access-rtzsz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 17:49:52.405975 kubelet[2865]: I0527 17:49:52.405747 2865 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtzsz\" (UniqueName: \"kubernetes.io/projected/901c1e3b-28f7-45f0-9531-bd718bf1ff98-kube-api-access-rtzsz\") on node \"ci-4344-0-0-a-c8f0a3e630\" DevicePath \"\"" May 27 17:49:52.405975 kubelet[2865]: I0527 17:49:52.405770 2865 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-backend-key-pair\") on node \"ci-4344-0-0-a-c8f0a3e630\" DevicePath \"\"" May 27 17:49:52.405975 kubelet[2865]: I0527 17:49:52.405780 2865 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/901c1e3b-28f7-45f0-9531-bd718bf1ff98-whisker-ca-bundle\") on node \"ci-4344-0-0-a-c8f0a3e630\" DevicePath \"\"" May 27 17:49:52.408451 systemd[1]: Removed slice kubepods-besteffort-pod901c1e3b_28f7_45f0_9531_bd718bf1ff98.slice - libcontainer container kubepods-besteffort-pod901c1e3b_28f7_45f0_9531_bd718bf1ff98.slice. May 27 17:49:52.422849 kubelet[2865]: I0527 17:49:52.421018 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-qb8hf" podStartSLOduration=1.614147386 podStartE2EDuration="18.421005039s" podCreationTimestamp="2025-05-27 17:49:34 +0000 UTC" firstStartedPulling="2025-05-27 17:49:34.87044912 +0000 UTC m=+19.722327770" lastFinishedPulling="2025-05-27 17:49:51.677306774 +0000 UTC m=+36.529185423" observedRunningTime="2025-05-27 17:49:52.41964795 +0000 UTC m=+37.271526620" watchObservedRunningTime="2025-05-27 17:49:52.421005039 +0000 UTC m=+37.272883689" May 27 17:49:52.442420 systemd[1]: var-lib-kubelet-pods-901c1e3b\x2d28f7\x2d45f0\x2d9531\x2dbd718bf1ff98-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 27 17:49:52.442937 systemd[1]: var-lib-kubelet-pods-901c1e3b\x2d28f7\x2d45f0\x2d9531\x2dbd718bf1ff98-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drtzsz.mount: Deactivated successfully. May 27 17:49:52.500676 systemd[1]: Created slice kubepods-besteffort-podba03e825_a044_4684_ba50_40a1a4351879.slice - libcontainer container kubepods-besteffort-podba03e825_a044_4684_ba50_40a1a4351879.slice. 
May 27 17:49:52.607249 kubelet[2865]: I0527 17:49:52.607106 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba03e825-a044-4684-ba50-40a1a4351879-whisker-ca-bundle\") pod \"whisker-7c68764d8d-q7mcw\" (UID: \"ba03e825-a044-4684-ba50-40a1a4351879\") " pod="calico-system/whisker-7c68764d8d-q7mcw" May 27 17:49:52.607249 kubelet[2865]: I0527 17:49:52.607149 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ba03e825-a044-4684-ba50-40a1a4351879-whisker-backend-key-pair\") pod \"whisker-7c68764d8d-q7mcw\" (UID: \"ba03e825-a044-4684-ba50-40a1a4351879\") " pod="calico-system/whisker-7c68764d8d-q7mcw" May 27 17:49:52.607249 kubelet[2865]: I0527 17:49:52.607168 2865 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdf7c\" (UniqueName: \"kubernetes.io/projected/ba03e825-a044-4684-ba50-40a1a4351879-kube-api-access-xdf7c\") pod \"whisker-7c68764d8d-q7mcw\" (UID: \"ba03e825-a044-4684-ba50-40a1a4351879\") " pod="calico-system/whisker-7c68764d8d-q7mcw" May 27 17:49:52.806602 containerd[1565]: time="2025-05-27T17:49:52.806551632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c68764d8d-q7mcw,Uid:ba03e825-a044-4684-ba50-40a1a4351879,Namespace:calico-system,Attempt:0,}" May 27 17:49:53.075491 systemd-networkd[1475]: cali602b0aed4d8: Link UP May 27 17:49:53.075653 systemd-networkd[1475]: cali602b0aed4d8: Gained carrier May 27 17:49:53.089207 containerd[1565]: 2025-05-27 17:49:52.837 [INFO][3964] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:53.089207 containerd[1565]: 2025-05-27 17:49:52.865 [INFO][3964] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0 whisker-7c68764d8d- calico-system ba03e825-a044-4684-ba50-40a1a4351879 918 0 2025-05-27 17:49:52 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7c68764d8d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 whisker-7c68764d8d-q7mcw eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali602b0aed4d8 [] [] }} ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-" May 27 17:49:53.089207 containerd[1565]: 2025-05-27 17:49:52.865 [INFO][3964] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" May 27 17:49:53.089207 containerd[1565]: 2025-05-27 17:49:53.017 [INFO][3973] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" HandleID="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.020 [INFO][3973] 
ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" HandleID="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003359b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-a-c8f0a3e630", "pod":"whisker-7c68764d8d-q7mcw", "timestamp":"2025-05-27 17:49:53.01785319 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.020 [INFO][3973] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.020 [INFO][3973] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.021 [INFO][3973] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.036 [INFO][3973] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.045 [INFO][3973] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.050 [INFO][3973] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.052 [INFO][3973] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089411 containerd[1565]: 2025-05-27 17:49:53.054 [INFO][3973] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089574 containerd[1565]: 2025-05-27 17:49:53.054 [INFO][3973] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089574 containerd[1565]: 2025-05-27 17:49:53.056 [INFO][3973] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34 May 27 17:49:53.089574 containerd[1565]: 2025-05-27 17:49:53.060 [INFO][3973] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089574 containerd[1565]: 2025-05-27 17:49:53.064 [INFO][3973] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.129/26] block=192.168.8.128/26 handle="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089574 containerd[1565]: 2025-05-27 17:49:53.064 [INFO][3973] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.129/26] handle="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:53.089574 containerd[1565]: 
2025-05-27 17:49:53.064 [INFO][3973] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:49:53.089574 containerd[1565]: 2025-05-27 17:49:53.064 [INFO][3973] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.129/26] IPv6=[] ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" HandleID="k8s-pod-network.d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" May 27 17:49:53.090423 containerd[1565]: 2025-05-27 17:49:53.067 [INFO][3964] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0", GenerateName:"whisker-7c68764d8d-", Namespace:"calico-system", SelfLink:"", UID:"ba03e825-a044-4684-ba50-40a1a4351879", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c68764d8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"whisker-7c68764d8d-q7mcw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali602b0aed4d8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:53.090423 containerd[1565]: 2025-05-27 17:49:53.067 [INFO][3964] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.129/32] ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" May 27 17:49:53.090512 containerd[1565]: 2025-05-27 17:49:53.067 [INFO][3964] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali602b0aed4d8 ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" May 27 17:49:53.090512 containerd[1565]: 2025-05-27 17:49:53.076 [INFO][3964] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" May 27 17:49:53.090560 containerd[1565]: 2025-05-27 17:49:53.076 [INFO][3964] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0", GenerateName:"whisker-7c68764d8d-", Namespace:"calico-system", SelfLink:"", UID:"ba03e825-a044-4684-ba50-40a1a4351879", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c68764d8d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34", Pod:"whisker-7c68764d8d-q7mcw", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.8.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali602b0aed4d8", MAC:"ca:ce:6c:c8:e5:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:53.090606 containerd[1565]: 2025-05-27 17:49:53.085 [INFO][3964] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" Namespace="calico-system" Pod="whisker-7c68764d8d-q7mcw" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-whisker--7c68764d8d--q7mcw-eth0" May 27 17:49:53.234998 kubelet[2865]: I0527 17:49:53.234817 2865 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="901c1e3b-28f7-45f0-9531-bd718bf1ff98" path="/var/lib/kubelet/pods/901c1e3b-28f7-45f0-9531-bd718bf1ff98/volumes" May 27 17:49:53.261989 containerd[1565]: time="2025-05-27T17:49:53.261939651Z" level=info msg="connecting to shim d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34" address="unix:///run/containerd/s/29dbb227989cc56360bc4bdf7fa76c32b75208f7adf45bfa7dbfa0716f07e7f1" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:53.286356 systemd[1]: Started cri-containerd-d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34.scope - libcontainer container d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34. 
May 27 17:49:53.376623 containerd[1565]: time="2025-05-27T17:49:53.376077284Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c68764d8d-q7mcw,Uid:ba03e825-a044-4684-ba50-40a1a4351879,Namespace:calico-system,Attempt:0,} returns sandbox id \"d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34\"" May 27 17:49:53.389250 containerd[1565]: time="2025-05-27T17:49:53.389068446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:49:53.412310 kubelet[2865]: I0527 17:49:53.410440 2865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:49:53.773802 containerd[1565]: time="2025-05-27T17:49:53.773691230Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:49:53.777086 containerd[1565]: time="2025-05-27T17:49:53.775204230Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:49:53.777208 containerd[1565]: time="2025-05-27T17:49:53.776106092Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:49:53.778548 kubelet[2865]: E0527 17:49:53.778443 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:49:53.778700 kubelet[2865]: E0527 17:49:53.778533 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:49:53.779917 kubelet[2865]: E0527 17:49:53.779845 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:679f3a53bfbe4d5cacd8f8aefe926c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:49:53.782350 containerd[1565]: time="2025-05-27T17:49:53.782322116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:49:54.109394 containerd[1565]: time="2025-05-27T17:49:54.109265189Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:49:54.110419 containerd[1565]: time="2025-05-27T17:49:54.110369327Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:49:54.110492 containerd[1565]: time="2025-05-27T17:49:54.110450829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:49:54.110664 kubelet[2865]: E0527 17:49:54.110616 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:49:54.110783 kubelet[2865]: E0527 17:49:54.110674 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:49:54.111090 kubelet[2865]: E0527 17:49:54.111022 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:49:54.112978 kubelet[2865]: E0527 17:49:54.112920 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:49:54.389189 systemd-networkd[1475]: cali602b0aed4d8: Gained IPv6LL May 27 17:49:54.414406 kubelet[2865]: E0527 17:49:54.414357 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:49:55.235027 containerd[1565]: time="2025-05-27T17:49:55.234946572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-jn9qm,Uid:fad65c3f-de87-4e29-9edf-763061c577ea,Namespace:calico-apiserver,Attempt:0,}" May 27 17:49:55.367278 systemd-networkd[1475]: calif9074794094: Link UP May 27 17:49:55.368716 systemd-networkd[1475]: calif9074794094: Gained carrier May 27 17:49:55.383452 containerd[1565]: 2025-05-27 17:49:55.280 [INFO][4147] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:55.383452 containerd[1565]: 2025-05-27 17:49:55.301 [INFO][4147] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0 calico-apiserver-9f4d94d96- calico-apiserver fad65c3f-de87-4e29-9edf-763061c577ea 839 0 2025-05-27 17:49:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9f4d94d96 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 calico-apiserver-9f4d94d96-jn9qm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif9074794094 [] [] }} ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" 
Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-" May 27 17:49:55.383452 containerd[1565]: 2025-05-27 17:49:55.301 [INFO][4147] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" May 27 17:49:55.383452 containerd[1565]: 2025-05-27 17:49:55.327 [INFO][4159] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" HandleID="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.327 [INFO][4159] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" HandleID="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000233050), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-0-0-a-c8f0a3e630", "pod":"calico-apiserver-9f4d94d96-jn9qm", "timestamp":"2025-05-27 17:49:55.327741846 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.327 [INFO][4159] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.327 [INFO][4159] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.328 [INFO][4159] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.335 [INFO][4159] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.340 [INFO][4159] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.345 [INFO][4159] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.348 [INFO][4159] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384183 containerd[1565]: 2025-05-27 17:49:55.350 [INFO][4159] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384512 containerd[1565]: 2025-05-27 17:49:55.350 [INFO][4159] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384512 containerd[1565]: 2025-05-27 17:49:55.351 [INFO][4159] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f May 27 17:49:55.384512 containerd[1565]: 2025-05-27 17:49:55.356 [INFO][4159] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384512 containerd[1565]: 2025-05-27 17:49:55.361 [INFO][4159] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.130/26] block=192.168.8.128/26 handle="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384512 containerd[1565]: 2025-05-27 17:49:55.361 [INFO][4159] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.130/26] handle="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:55.384512 containerd[1565]: 2025-05-27 17:49:55.361 [INFO][4159] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:49:55.384512 containerd[1565]: 2025-05-27 17:49:55.362 [INFO][4159] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.130/26] IPv6=[] ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" HandleID="k8s-pod-network.90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" May 27 17:49:55.384626 containerd[1565]: 2025-05-27 17:49:55.365 [INFO][4147] cni-plugin/k8s.go 418: Populated endpoint ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0", GenerateName:"calico-apiserver-9f4d94d96-", Namespace:"calico-apiserver", SelfLink:"", UID:"fad65c3f-de87-4e29-9edf-763061c577ea", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f4d94d96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"calico-apiserver-9f4d94d96-jn9qm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9074794094", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:55.384676 containerd[1565]: 2025-05-27 17:49:55.365 [INFO][4147] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.130/32] ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" May 27 17:49:55.384676 containerd[1565]: 2025-05-27 17:49:55.365 [INFO][4147] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif9074794094 ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" May 27 17:49:55.384676 containerd[1565]: 2025-05-27 17:49:55.366 [INFO][4147] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" May 27 17:49:55.384727 containerd[1565]: 2025-05-27 17:49:55.368 [INFO][4147] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0", GenerateName:"calico-apiserver-9f4d94d96-", Namespace:"calico-apiserver", SelfLink:"", UID:"fad65c3f-de87-4e29-9edf-763061c577ea", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f4d94d96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f", Pod:"calico-apiserver-9f4d94d96-jn9qm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif9074794094", MAC:"4a:be:63:df:a1:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:55.384799 containerd[1565]: 2025-05-27 17:49:55.378 [INFO][4147] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-jn9qm" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--jn9qm-eth0" May 27 17:49:55.413407 containerd[1565]: time="2025-05-27T17:49:55.413296792Z" level=info msg="connecting to shim 90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f" address="unix:///run/containerd/s/8e46eda52a721690ce87a7629dcc5c428cd775b29ac4062ee9267de528a5443d" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:55.433357 systemd[1]: Started cri-containerd-90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f.scope - libcontainer container 90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f. 
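Annotation: the "connecting to shim ... address=unix:///run/containerd/s/..." message and the transient cri-containerd-<id>.scope unit show containerd launching the pod sandbox under its shim (ttrpc, version 3) and systemd tracking it as a scope. A hedged sketch of inspecting that container from Go with the containerd client follows; it assumes the github.com/containerd/containerd client library, the conventional /run/containerd/containerd.sock socket path and the k8s.io namespace, none of which are shown verbatim in this log.

package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

// Look up the sandbox container from the log by ID and print its task status.
func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	id := "90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f"

	container, err := client.LoadContainer(ctx, id)
	if err != nil {
		log.Fatal(err)
	}
	task, err := container.Task(ctx, nil)
	if err != nil {
		log.Fatal(err)
	}
	status, err := task.Status(ctx)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("task status:", status.Status)
}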
May 27 17:49:55.480096 containerd[1565]: time="2025-05-27T17:49:55.479904234Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-jn9qm,Uid:fad65c3f-de87-4e29-9edf-763061c577ea,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f\"" May 27 17:49:55.481847 containerd[1565]: time="2025-05-27T17:49:55.481723626Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 17:49:56.232888 containerd[1565]: time="2025-05-27T17:49:56.232816987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-fq45k,Uid:8b985871-4559-42e8-9cf3-7b26e6ec2b9f,Namespace:calico-system,Attempt:0,}" May 27 17:49:56.319519 systemd-networkd[1475]: cali7a160990215: Link UP May 27 17:49:56.319660 systemd-networkd[1475]: cali7a160990215: Gained carrier May 27 17:49:56.332265 containerd[1565]: 2025-05-27 17:49:56.256 [INFO][4237] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:56.332265 containerd[1565]: 2025-05-27 17:49:56.265 [INFO][4237] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0 goldmane-78d55f7ddc- calico-system 8b985871-4559-42e8-9cf3-7b26e6ec2b9f 841 0 2025-05-27 17:49:33 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 goldmane-78d55f7ddc-fq45k eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7a160990215 [] [] }} ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-" May 27 17:49:56.332265 containerd[1565]: 2025-05-27 17:49:56.265 [INFO][4237] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" May 27 17:49:56.332265 containerd[1565]: 2025-05-27 17:49:56.286 [INFO][4248] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" HandleID="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.287 [INFO][4248] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" HandleID="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002332c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-a-c8f0a3e630", "pod":"goldmane-78d55f7ddc-fq45k", "timestamp":"2025-05-27 17:49:56.286783878 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.289 [INFO][4248] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.289 [INFO][4248] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.289 [INFO][4248] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.295 [INFO][4248] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.300 [INFO][4248] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.304 [INFO][4248] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.306 [INFO][4248] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332633 containerd[1565]: 2025-05-27 17:49:56.307 [INFO][4248] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332806 containerd[1565]: 2025-05-27 17:49:56.308 [INFO][4248] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332806 containerd[1565]: 2025-05-27 17:49:56.309 [INFO][4248] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac May 27 17:49:56.332806 containerd[1565]: 2025-05-27 17:49:56.312 [INFO][4248] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332806 containerd[1565]: 2025-05-27 17:49:56.315 [INFO][4248] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.131/26] block=192.168.8.128/26 handle="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332806 containerd[1565]: 2025-05-27 17:49:56.315 [INFO][4248] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.131/26] handle="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:56.332806 containerd[1565]: 2025-05-27 17:49:56.316 [INFO][4248] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:49:56.332806 containerd[1565]: 2025-05-27 17:49:56.316 [INFO][4248] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.131/26] IPv6=[] ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" HandleID="k8s-pod-network.1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" May 27 17:49:56.332964 containerd[1565]: 2025-05-27 17:49:56.318 [INFO][4237] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"8b985871-4559-42e8-9cf3-7b26e6ec2b9f", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"goldmane-78d55f7ddc-fq45k", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a160990215", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:56.333037 containerd[1565]: 2025-05-27 17:49:56.318 [INFO][4237] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.131/32] ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" May 27 17:49:56.333037 containerd[1565]: 2025-05-27 17:49:56.318 [INFO][4237] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7a160990215 ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" May 27 17:49:56.333037 containerd[1565]: 2025-05-27 17:49:56.319 [INFO][4237] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" May 27 17:49:56.333263 containerd[1565]: 2025-05-27 17:49:56.319 [INFO][4237] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" 
Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"8b985871-4559-42e8-9cf3-7b26e6ec2b9f", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac", Pod:"goldmane-78d55f7ddc-fq45k", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.8.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7a160990215", MAC:"f6:e8:40:c0:0d:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:56.333357 containerd[1565]: 2025-05-27 17:49:56.330 [INFO][4237] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fq45k" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-goldmane--78d55f7ddc--fq45k-eth0" May 27 17:49:56.349959 containerd[1565]: time="2025-05-27T17:49:56.349483811Z" level=info msg="connecting to shim 1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac" address="unix:///run/containerd/s/c69a5dcf1333f614f9493d7cdb064bf4d89249e558d56761f89a3ac1dabda1a5" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:56.378335 systemd[1]: Started cri-containerd-1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac.scope - libcontainer container 1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac. 
May 27 17:49:56.414682 containerd[1565]: time="2025-05-27T17:49:56.414609919Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-fq45k,Uid:8b985871-4559-42e8-9cf3-7b26e6ec2b9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac\"" May 27 17:49:57.010445 systemd-networkd[1475]: calif9074794094: Gained IPv6LL May 27 17:49:57.195079 kubelet[2865]: I0527 17:49:57.195038 2865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:49:57.234373 containerd[1565]: time="2025-05-27T17:49:57.234030337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cb6f4656f-6r5lk,Uid:27ab50e6-a59e-432e-b7e7-3f9bf2c59af6,Namespace:calico-system,Attempt:0,}" May 27 17:49:57.248734 containerd[1565]: time="2025-05-27T17:49:57.248696204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsxzp,Uid:b12086d3-36c9-4de5-998b-0494bb3bd5ec,Namespace:kube-system,Attempt:0,}" May 27 17:49:57.249467 containerd[1565]: time="2025-05-27T17:49:57.249441625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zw22,Uid:6cefb86f-fbd3-4c4c-8534-44d8ba742df1,Namespace:calico-system,Attempt:0,}" May 27 17:49:57.400402 systemd-networkd[1475]: califd9e1c01f10: Link UP May 27 17:49:57.402109 systemd-networkd[1475]: califd9e1c01f10: Gained carrier May 27 17:49:57.405214 containerd[1565]: time="2025-05-27T17:49:57.405165719Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"d755fd2c8cb9b38d039cea1c8e1a77629afd34287da5a4f2fe6e0c5fb4aaa22a\" pid:4356 exit_status:1 exited_at:{seconds:1748368197 nanos:396249382}" May 27 17:49:57.418380 containerd[1565]: 2025-05-27 17:49:57.267 [INFO][4326] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:57.418380 containerd[1565]: 2025-05-27 17:49:57.284 [INFO][4326] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0 calico-kube-controllers-cb6f4656f- calico-system 27ab50e6-a59e-432e-b7e7-3f9bf2c59af6 844 0 2025-05-27 17:49:34 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:cb6f4656f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 calico-kube-controllers-cb6f4656f-6r5lk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] califd9e1c01f10 [] [] }} ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-" May 27 17:49:57.418380 containerd[1565]: 2025-05-27 17:49:57.284 [INFO][4326] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" May 27 17:49:57.418380 containerd[1565]: 2025-05-27 17:49:57.322 [INFO][4380] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 
IPv6=0 ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" HandleID="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.324 [INFO][4380] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" HandleID="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-a-c8f0a3e630", "pod":"calico-kube-controllers-cb6f4656f-6r5lk", "timestamp":"2025-05-27 17:49:57.322484109 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.325 [INFO][4380] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.325 [INFO][4380] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.326 [INFO][4380] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.344 [INFO][4380] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.351 [INFO][4380] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.363 [INFO][4380] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.367 [INFO][4380] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418576 containerd[1565]: 2025-05-27 17:49:57.370 [INFO][4380] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418736 containerd[1565]: 2025-05-27 17:49:57.371 [INFO][4380] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418736 containerd[1565]: 2025-05-27 17:49:57.373 [INFO][4380] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38 May 27 17:49:57.418736 containerd[1565]: 2025-05-27 17:49:57.379 [INFO][4380] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418736 containerd[1565]: 2025-05-27 17:49:57.387 [INFO][4380] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.132/26] block=192.168.8.128/26 
handle="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418736 containerd[1565]: 2025-05-27 17:49:57.388 [INFO][4380] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.132/26] handle="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.418736 containerd[1565]: 2025-05-27 17:49:57.388 [INFO][4380] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 17:49:57.418736 containerd[1565]: 2025-05-27 17:49:57.388 [INFO][4380] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.132/26] IPv6=[] ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" HandleID="k8s-pod-network.f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" May 27 17:49:57.419306 containerd[1565]: 2025-05-27 17:49:57.392 [INFO][4326] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0", GenerateName:"calico-kube-controllers-cb6f4656f-", Namespace:"calico-system", SelfLink:"", UID:"27ab50e6-a59e-432e-b7e7-3f9bf2c59af6", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cb6f4656f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"calico-kube-controllers-cb6f4656f-6r5lk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califd9e1c01f10", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:57.419359 containerd[1565]: 2025-05-27 17:49:57.392 [INFO][4326] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.132/32] ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" May 27 17:49:57.419359 containerd[1565]: 2025-05-27 17:49:57.393 [INFO][4326] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califd9e1c01f10 ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" 
Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" May 27 17:49:57.419359 containerd[1565]: 2025-05-27 17:49:57.401 [INFO][4326] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" May 27 17:49:57.419410 containerd[1565]: 2025-05-27 17:49:57.401 [INFO][4326] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0", GenerateName:"calico-kube-controllers-cb6f4656f-", Namespace:"calico-system", SelfLink:"", UID:"27ab50e6-a59e-432e-b7e7-3f9bf2c59af6", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cb6f4656f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38", Pod:"calico-kube-controllers-cb6f4656f-6r5lk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.8.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"califd9e1c01f10", MAC:"82:0a:81:ca:ae:1a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:57.419453 containerd[1565]: 2025-05-27 17:49:57.416 [INFO][4326] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" Namespace="calico-system" Pod="calico-kube-controllers-cb6f4656f-6r5lk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--kube--controllers--cb6f4656f--6r5lk-eth0" May 27 17:49:57.447602 containerd[1565]: time="2025-05-27T17:49:57.447285800Z" level=info msg="connecting to shim f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38" address="unix:///run/containerd/s/03a36c6892702378f76d4e6c9339c2df5ebf5af778a45cee9e012041a4d36021" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:57.488460 systemd[1]: Started cri-containerd-f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38.scope - libcontainer container f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38. 
May 27 17:49:57.509016 systemd-networkd[1475]: caliabcb73d4400: Link UP May 27 17:49:57.513973 systemd-networkd[1475]: caliabcb73d4400: Gained carrier May 27 17:49:57.530098 containerd[1565]: 2025-05-27 17:49:57.310 [INFO][4351] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:57.530098 containerd[1565]: 2025-05-27 17:49:57.323 [INFO][4351] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0 csi-node-driver- calico-system 6cefb86f-fbd3-4c4c-8534-44d8ba742df1 740 0 2025-05-27 17:49:34 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 csi-node-driver-5zw22 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliabcb73d4400 [] [] }} ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-" May 27 17:49:57.530098 containerd[1565]: 2025-05-27 17:49:57.323 [INFO][4351] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" May 27 17:49:57.530098 containerd[1565]: 2025-05-27 17:49:57.375 [INFO][4395] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" HandleID="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.376 [INFO][4395] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" HandleID="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9260), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-0-0-a-c8f0a3e630", "pod":"csi-node-driver-5zw22", "timestamp":"2025-05-27 17:49:57.375804398 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.376 [INFO][4395] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.388 [INFO][4395] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.388 [INFO][4395] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.447 [INFO][4395] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.456 [INFO][4395] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.468 [INFO][4395] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.470 [INFO][4395] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530430 containerd[1565]: 2025-05-27 17:49:57.474 [INFO][4395] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530604 containerd[1565]: 2025-05-27 17:49:57.474 [INFO][4395] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530604 containerd[1565]: 2025-05-27 17:49:57.477 [INFO][4395] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28 May 27 17:49:57.530604 containerd[1565]: 2025-05-27 17:49:57.483 [INFO][4395] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530604 containerd[1565]: 2025-05-27 17:49:57.494 [INFO][4395] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.133/26] block=192.168.8.128/26 handle="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530604 containerd[1565]: 2025-05-27 17:49:57.494 [INFO][4395] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.133/26] handle="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.530604 containerd[1565]: 2025-05-27 17:49:57.494 [INFO][4395] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:49:57.530604 containerd[1565]: 2025-05-27 17:49:57.494 [INFO][4395] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.133/26] IPv6=[] ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" HandleID="k8s-pod-network.0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" May 27 17:49:57.530713 containerd[1565]: 2025-05-27 17:49:57.497 [INFO][4351] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6cefb86f-fbd3-4c4c-8534-44d8ba742df1", ResourceVersion:"740", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"csi-node-driver-5zw22", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliabcb73d4400", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:57.530782 containerd[1565]: 2025-05-27 17:49:57.497 [INFO][4351] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.133/32] ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" May 27 17:49:57.530782 containerd[1565]: 2025-05-27 17:49:57.498 [INFO][4351] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliabcb73d4400 ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" May 27 17:49:57.530782 containerd[1565]: 2025-05-27 17:49:57.514 [INFO][4351] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" May 27 17:49:57.530935 containerd[1565]: 2025-05-27 17:49:57.515 [INFO][4351] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6cefb86f-fbd3-4c4c-8534-44d8ba742df1", ResourceVersion:"740", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28", Pod:"csi-node-driver-5zw22", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.8.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliabcb73d4400", MAC:"e6:b2:b8:cb:28:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:57.530979 containerd[1565]: 2025-05-27 17:49:57.526 [INFO][4351] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" Namespace="calico-system" Pod="csi-node-driver-5zw22" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-csi--node--driver--5zw22-eth0" May 27 17:49:57.539886 containerd[1565]: time="2025-05-27T17:49:57.539719604Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"51cf776257c199b57a331f6c0bb1e1a2e880cd4a4621f33fb03ef22f2d493567\" pid:4429 exit_status:1 exited_at:{seconds:1748368197 nanos:539522859}" May 27 17:49:57.560895 containerd[1565]: time="2025-05-27T17:49:57.560743108Z" level=info msg="connecting to shim 0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28" address="unix:///run/containerd/s/da5f70ebad77abf285ef40b81453bb5a29a4090c0d74da4b3a9687671089e4e1" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:57.595550 systemd[1]: Started cri-containerd-0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28.scope - libcontainer container 0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28. 
May 27 17:49:57.612048 systemd-networkd[1475]: cali3711e5cd13e: Link UP May 27 17:49:57.613026 systemd-networkd[1475]: cali3711e5cd13e: Gained carrier May 27 17:49:57.618669 containerd[1565]: time="2025-05-27T17:49:57.618628609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cb6f4656f-6r5lk,Uid:27ab50e6-a59e-432e-b7e7-3f9bf2c59af6,Namespace:calico-system,Attempt:0,} returns sandbox id \"f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38\"" May 27 17:49:57.635084 containerd[1565]: 2025-05-27 17:49:57.305 [INFO][4358] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:57.635084 containerd[1565]: 2025-05-27 17:49:57.321 [INFO][4358] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0 coredns-674b8bbfcf- kube-system b12086d3-36c9-4de5-998b-0494bb3bd5ec 836 0 2025-05-27 17:49:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 coredns-674b8bbfcf-qsxzp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3711e5cd13e [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-" May 27 17:49:57.635084 containerd[1565]: 2025-05-27 17:49:57.321 [INFO][4358] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" May 27 17:49:57.635084 containerd[1565]: 2025-05-27 17:49:57.383 [INFO][4394] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" HandleID="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.384 [INFO][4394] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" HandleID="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f960), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-0-0-a-c8f0a3e630", "pod":"coredns-674b8bbfcf-qsxzp", "timestamp":"2025-05-27 17:49:57.383943203 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.384 [INFO][4394] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.494 [INFO][4394] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.494 [INFO][4394] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.545 [INFO][4394] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.555 [INFO][4394] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.569 [INFO][4394] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.575 [INFO][4394] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635301 containerd[1565]: 2025-05-27 17:49:57.577 [INFO][4394] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635488 containerd[1565]: 2025-05-27 17:49:57.577 [INFO][4394] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635488 containerd[1565]: 2025-05-27 17:49:57.581 [INFO][4394] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34 May 27 17:49:57.635488 containerd[1565]: 2025-05-27 17:49:57.594 [INFO][4394] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635488 containerd[1565]: 2025-05-27 17:49:57.602 [INFO][4394] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.134/26] block=192.168.8.128/26 handle="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635488 containerd[1565]: 2025-05-27 17:49:57.602 [INFO][4394] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.134/26] handle="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:57.635488 containerd[1565]: 2025-05-27 17:49:57.602 [INFO][4394] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:49:57.635488 containerd[1565]: 2025-05-27 17:49:57.602 [INFO][4394] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.134/26] IPv6=[] ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" HandleID="k8s-pod-network.656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" May 27 17:49:57.635598 containerd[1565]: 2025-05-27 17:49:57.604 [INFO][4358] cni-plugin/k8s.go 418: Populated endpoint ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"b12086d3-36c9-4de5-998b-0494bb3bd5ec", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"coredns-674b8bbfcf-qsxzp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3711e5cd13e", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:57.635598 containerd[1565]: 2025-05-27 17:49:57.605 [INFO][4358] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.134/32] ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" May 27 17:49:57.635598 containerd[1565]: 2025-05-27 17:49:57.605 [INFO][4358] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3711e5cd13e ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" May 27 17:49:57.635598 containerd[1565]: 2025-05-27 17:49:57.615 [INFO][4358] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" May 27 17:49:57.635598 containerd[1565]: 2025-05-27 17:49:57.616 [INFO][4358] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"b12086d3-36c9-4de5-998b-0494bb3bd5ec", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34", Pod:"coredns-674b8bbfcf-qsxzp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3711e5cd13e", MAC:"7e:d1:89:40:47:7a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:57.635598 containerd[1565]: 2025-05-27 17:49:57.630 [INFO][4358] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" Namespace="kube-system" Pod="coredns-674b8bbfcf-qsxzp" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--qsxzp-eth0" May 27 17:49:57.644609 containerd[1565]: time="2025-05-27T17:49:57.644519369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-5zw22,Uid:6cefb86f-fbd3-4c4c-8534-44d8ba742df1,Namespace:calico-system,Attempt:0,} returns sandbox id \"0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28\"" May 27 17:49:57.683763 containerd[1565]: time="2025-05-27T17:49:57.683727962Z" level=info msg="connecting to shim 656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34" address="unix:///run/containerd/s/6529242d3e536ed157e9c7f2e864158893bf427fca6fefba1b643b8ee0abf498" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:57.703351 systemd[1]: Started cri-containerd-656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34.scope - libcontainer container 
656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34. May 27 17:49:57.740671 containerd[1565]: time="2025-05-27T17:49:57.740627423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-qsxzp,Uid:b12086d3-36c9-4de5-998b-0494bb3bd5ec,Namespace:kube-system,Attempt:0,} returns sandbox id \"656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34\"" May 27 17:49:57.745273 containerd[1565]: time="2025-05-27T17:49:57.745242921Z" level=info msg="CreateContainer within sandbox \"656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:49:57.766110 containerd[1565]: time="2025-05-27T17:49:57.766085476Z" level=info msg="Container 5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:57.770684 containerd[1565]: time="2025-05-27T17:49:57.770658133Z" level=info msg="CreateContainer within sandbox \"656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919\"" May 27 17:49:57.771602 containerd[1565]: time="2025-05-27T17:49:57.771584249Z" level=info msg="StartContainer for \"5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919\"" May 27 17:49:57.773075 containerd[1565]: time="2025-05-27T17:49:57.772959805Z" level=info msg="connecting to shim 5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919" address="unix:///run/containerd/s/6529242d3e536ed157e9c7f2e864158893bf427fca6fefba1b643b8ee0abf498" protocol=ttrpc version=3 May 27 17:49:57.792366 systemd[1]: Started cri-containerd-5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919.scope - libcontainer container 5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919. 
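Annotation: in the coredns WorkloadEndpoint dump the named ports are printed in hexadecimal: Port:0x35 is 53 (the dns and dns-tcp ports) and Port:0x23c1 is 9153 (the metrics port). A trivial check:

package main

import "fmt"

func main() {
	// Named ports from the coredns WorkloadEndpoint dump above, printed in hex there.
	fmt.Println(0x35)   // 53   (dns, dns-tcp)
	fmt.Println(0x23c1) // 9153 (metrics)
}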
May 27 17:49:57.825321 containerd[1565]: time="2025-05-27T17:49:57.825275751Z" level=info msg="StartContainer for \"5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919\" returns successfully" May 27 17:49:58.289456 systemd-networkd[1475]: cali7a160990215: Gained IPv6LL May 27 17:49:58.448368 kubelet[2865]: I0527 17:49:58.448140 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-qsxzp" podStartSLOduration=36.448126732 podStartE2EDuration="36.448126732s" podCreationTimestamp="2025-05-27 17:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:49:58.446993819 +0000 UTC m=+43.298872468" watchObservedRunningTime="2025-05-27 17:49:58.448126732 +0000 UTC m=+43.300005382" May 27 17:49:58.827699 containerd[1565]: time="2025-05-27T17:49:58.827655267Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:58.828573 containerd[1565]: time="2025-05-27T17:49:58.828496887Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431" May 27 17:49:58.829279 containerd[1565]: time="2025-05-27T17:49:58.829250061Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:58.831389 containerd[1565]: time="2025-05-27T17:49:58.830770617Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:49:58.831389 containerd[1565]: time="2025-05-27T17:49:58.831131290Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 3.349385343s" May 27 17:49:58.831389 containerd[1565]: time="2025-05-27T17:49:58.831165192Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 17:49:58.832605 containerd[1565]: time="2025-05-27T17:49:58.832586544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:49:58.834550 containerd[1565]: time="2025-05-27T17:49:58.834524849Z" level=info msg="CreateContainer within sandbox \"90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:49:58.841806 containerd[1565]: time="2025-05-27T17:49:58.841317967Z" level=info msg="Container a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:58.854048 containerd[1565]: time="2025-05-27T17:49:58.854023434Z" level=info msg="CreateContainer within sandbox \"90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139\"" May 27 17:49:58.855194 containerd[1565]: 
time="2025-05-27T17:49:58.854414742Z" level=info msg="StartContainer for \"a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139\"" May 27 17:49:58.855194 containerd[1565]: time="2025-05-27T17:49:58.855128233Z" level=info msg="connecting to shim a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139" address="unix:///run/containerd/s/8e46eda52a721690ce87a7629dcc5c428cd775b29ac4062ee9267de528a5443d" protocol=ttrpc version=3 May 27 17:49:58.898430 systemd[1]: Started cri-containerd-a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139.scope - libcontainer container a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139. May 27 17:49:58.929406 systemd-networkd[1475]: cali3711e5cd13e: Gained IPv6LL May 27 17:49:58.946574 containerd[1565]: time="2025-05-27T17:49:58.946532909Z" level=info msg="StartContainer for \"a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139\" returns successfully" May 27 17:49:59.129718 containerd[1565]: time="2025-05-27T17:49:59.129631198Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:49:59.131382 containerd[1565]: time="2025-05-27T17:49:59.131359312Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:49:59.131891 kubelet[2865]: E0527 17:49:59.131683 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:49:59.131891 kubelet[2865]: E0527 17:49:59.131733 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:49:59.132017 containerd[1565]: time="2025-05-27T17:49:59.131488552Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:49:59.132726 containerd[1565]: time="2025-05-27T17:49:59.132192406Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 17:49:59.149486 kubelet[2865]: E0527 17:49:59.149433 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs4qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fq45k_calico-system(8b985871-4559-42e8-9cf3-7b26e6ec2b9f): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:49:59.154069 kubelet[2865]: E0527 17:49:59.154033 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:49:59.241764 containerd[1565]: time="2025-05-27T17:49:59.241736877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6fkcq,Uid:1513fc78-c3ee-47f4-9d52-8cedb71687f3,Namespace:kube-system,Attempt:0,}" May 27 17:49:59.242435 containerd[1565]: time="2025-05-27T17:49:59.242397137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-v64xk,Uid:bb9f0f7c-2db0-4832-a710-1a0c49d448f1,Namespace:calico-apiserver,Attempt:0,}" May 27 17:49:59.249339 systemd-networkd[1475]: califd9e1c01f10: Gained IPv6LL May 27 17:49:59.369584 systemd-networkd[1475]: caliae64cf244bc: Link UP May 27 17:49:59.370759 systemd-networkd[1475]: caliae64cf244bc: Gained carrier May 27 17:49:59.377280 systemd-networkd[1475]: caliabcb73d4400: Gained IPv6LL May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.283 [INFO][4717] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.303 [INFO][4717] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0 calico-apiserver-9f4d94d96- calico-apiserver bb9f0f7c-2db0-4832-a710-1a0c49d448f1 843 0 2025-05-27 17:49:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9f4d94d96 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 calico-apiserver-9f4d94d96-v64xk eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliae64cf244bc [] [] }} ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.303 [INFO][4717] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.329 [INFO][4733] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" HandleID="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.329 [INFO][4733] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" HandleID="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c9020), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-0-0-a-c8f0a3e630", 
"pod":"calico-apiserver-9f4d94d96-v64xk", "timestamp":"2025-05-27 17:49:59.329046063 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.329 [INFO][4733] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.329 [INFO][4733] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.329 [INFO][4733] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.337 [INFO][4733] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.341 [INFO][4733] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.346 [INFO][4733] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.347 [INFO][4733] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.350 [INFO][4733] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.350 [INFO][4733] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.351 [INFO][4733] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.357 [INFO][4733] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.362 [INFO][4733] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.135/26] block=192.168.8.128/26 handle="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.362 [INFO][4733] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.135/26] handle="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.363 [INFO][4733] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:49:59.390265 containerd[1565]: 2025-05-27 17:49:59.363 [INFO][4733] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.135/26] IPv6=[] ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" HandleID="k8s-pod-network.d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" May 27 17:49:59.391595 containerd[1565]: 2025-05-27 17:49:59.365 [INFO][4717] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0", GenerateName:"calico-apiserver-9f4d94d96-", Namespace:"calico-apiserver", SelfLink:"", UID:"bb9f0f7c-2db0-4832-a710-1a0c49d448f1", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f4d94d96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"calico-apiserver-9f4d94d96-v64xk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliae64cf244bc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:59.391595 containerd[1565]: 2025-05-27 17:49:59.365 [INFO][4717] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.135/32] ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" May 27 17:49:59.391595 containerd[1565]: 2025-05-27 17:49:59.365 [INFO][4717] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliae64cf244bc ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" May 27 17:49:59.391595 containerd[1565]: 2025-05-27 17:49:59.371 [INFO][4717] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" May 27 17:49:59.391595 containerd[1565]: 2025-05-27 17:49:59.372 [INFO][4717] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0", GenerateName:"calico-apiserver-9f4d94d96-", Namespace:"calico-apiserver", SelfLink:"", UID:"bb9f0f7c-2db0-4832-a710-1a0c49d448f1", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9f4d94d96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd", Pod:"calico-apiserver-9f4d94d96-v64xk", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.8.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliae64cf244bc", MAC:"d6:21:9f:62:b0:84", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:59.391595 containerd[1565]: 2025-05-27 17:49:59.385 [INFO][4717] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" Namespace="calico-apiserver" Pod="calico-apiserver-9f4d94d96-v64xk" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-calico--apiserver--9f4d94d96--v64xk-eth0" May 27 17:49:59.414101 containerd[1565]: time="2025-05-27T17:49:59.413521586Z" level=info msg="connecting to shim d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd" address="unix:///run/containerd/s/89d62ef9aaead31d16cab7b3edd306148fb65f6831c6e222a58f9ca26b908df4" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:59.433513 systemd[1]: Started cri-containerd-d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd.scope - libcontainer container d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd. 
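The goldmane pull failure a few entries back (and the back-off entry that follows) comes down to one HTTP exchange: before fetching the manifest, containerd asks the registry's token endpoint for an anonymous bearer token scoped to the repository, and ghcr.io answers 403 Forbidden. The short Go probe below replays that request against the exact URL recorded in the log; it is purely diagnostic and pulls nothing, and the expected healthy outcome described in the comments is an assumption about the standard token flow rather than something shown in this log.

package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
)

func main() {
	// Token URL exactly as recorded by containerd in the log above.
	url := "https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io"

	resp, err := http.Get(url)
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(io.LimitReader(resp.Body, 512))
	// A working anonymous flow returns 200 with a JSON body containing "token";
	// the cluster above instead saw "403 Forbidden", so every pull of this
	// repository fails before a single manifest byte is transferred.
	fmt.Println("status:", resp.Status)
	fmt.Println("body:", string(body))
}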
May 27 17:49:59.447827 kubelet[2865]: E0527 17:49:59.447782 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:49:59.460928 kubelet[2865]: I0527 17:49:59.460795 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9f4d94d96-jn9qm" podStartSLOduration=25.110073866 podStartE2EDuration="28.460782006s" podCreationTimestamp="2025-05-27 17:49:31 +0000 UTC" firstStartedPulling="2025-05-27 17:49:55.481316898 +0000 UTC m=+40.333195547" lastFinishedPulling="2025-05-27 17:49:58.832025036 +0000 UTC m=+43.683903687" observedRunningTime="2025-05-27 17:49:59.460275711 +0000 UTC m=+44.312154361" watchObservedRunningTime="2025-05-27 17:49:59.460782006 +0000 UTC m=+44.312660656" May 27 17:49:59.499167 systemd-networkd[1475]: cali055d8f4640b: Link UP May 27 17:49:59.500278 systemd-networkd[1475]: cali055d8f4640b: Gained carrier May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.278 [INFO][4708] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.302 [INFO][4708] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0 coredns-674b8bbfcf- kube-system 1513fc78-c3ee-47f4-9d52-8cedb71687f3 842 0 2025-05-27 17:49:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-0-0-a-c8f0a3e630 coredns-674b8bbfcf-6fkcq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali055d8f4640b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.302 [INFO][4708] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.336 [INFO][4731] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" HandleID="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.336 [INFO][4731] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" 
HandleID="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9040), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-0-0-a-c8f0a3e630", "pod":"coredns-674b8bbfcf-6fkcq", "timestamp":"2025-05-27 17:49:59.336851622 +0000 UTC"}, Hostname:"ci-4344-0-0-a-c8f0a3e630", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.336 [INFO][4731] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.363 [INFO][4731] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.363 [INFO][4731] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-0-0-a-c8f0a3e630' May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.443 [INFO][4731] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.458 [INFO][4731] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.469 [INFO][4731] ipam/ipam.go 511: Trying affinity for 192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.472 [INFO][4731] ipam/ipam.go 158: Attempting to load block cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.477 [INFO][4731] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.8.128/26 host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.477 [INFO][4731] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.8.128/26 handle="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.479 [INFO][4731] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.483 [INFO][4731] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.8.128/26 handle="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.489 [INFO][4731] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.8.136/26] block=192.168.8.128/26 handle="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.489 [INFO][4731] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.8.136/26] handle="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" host="ci-4344-0-0-a-c8f0a3e630" May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.489 [INFO][4731] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 27 17:49:59.517395 containerd[1565]: 2025-05-27 17:49:59.489 [INFO][4731] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.8.136/26] IPv6=[] ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" HandleID="k8s-pod-network.dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Workload="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" May 27 17:49:59.518672 containerd[1565]: 2025-05-27 17:49:59.491 [INFO][4708] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"1513fc78-c3ee-47f4-9d52-8cedb71687f3", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"", Pod:"coredns-674b8bbfcf-6fkcq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali055d8f4640b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:59.518672 containerd[1565]: 2025-05-27 17:49:59.492 [INFO][4708] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.8.136/32] ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" May 27 17:49:59.518672 containerd[1565]: 2025-05-27 17:49:59.492 [INFO][4708] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali055d8f4640b ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" May 27 17:49:59.518672 containerd[1565]: 2025-05-27 17:49:59.498 [INFO][4708] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" May 27 17:49:59.518672 containerd[1565]: 2025-05-27 17:49:59.500 [INFO][4708] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"1513fc78-c3ee-47f4-9d52-8cedb71687f3", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 17, 49, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-0-0-a-c8f0a3e630", ContainerID:"dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d", Pod:"coredns-674b8bbfcf-6fkcq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.8.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali055d8f4640b", MAC:"f6:0a:56:8c:eb:0d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 17:49:59.518672 containerd[1565]: 2025-05-27 17:49:59.512 [INFO][4708] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" Namespace="kube-system" Pod="coredns-674b8bbfcf-6fkcq" WorkloadEndpoint="ci--4344--0--0--a--c8f0a3e630-k8s-coredns--674b8bbfcf--6fkcq-eth0" May 27 17:49:59.535964 containerd[1565]: time="2025-05-27T17:49:59.535923247Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9f4d94d96-v64xk,Uid:bb9f0f7c-2db0-4832-a710-1a0c49d448f1,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd\"" May 27 17:49:59.546944 containerd[1565]: time="2025-05-27T17:49:59.546919868Z" level=info msg="CreateContainer within sandbox \"d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 17:49:59.553787 containerd[1565]: time="2025-05-27T17:49:59.553763633Z" level=info msg="connecting to shim dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d" 
address="unix:///run/containerd/s/9274756ec7a029b1587b7283dbfe7f4c5c7d473759902de4d580f2059f8c30bf" namespace=k8s.io protocol=ttrpc version=3 May 27 17:49:59.557360 containerd[1565]: time="2025-05-27T17:49:59.557327992Z" level=info msg="Container a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:59.561928 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1620777923.mount: Deactivated successfully. May 27 17:49:59.579244 containerd[1565]: time="2025-05-27T17:49:59.578750448Z" level=info msg="CreateContainer within sandbox \"d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7\"" May 27 17:49:59.581013 containerd[1565]: time="2025-05-27T17:49:59.580973625Z" level=info msg="StartContainer for \"a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7\"" May 27 17:49:59.582550 containerd[1565]: time="2025-05-27T17:49:59.582293647Z" level=info msg="connecting to shim a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7" address="unix:///run/containerd/s/89d62ef9aaead31d16cab7b3edd306148fb65f6831c6e222a58f9ca26b908df4" protocol=ttrpc version=3 May 27 17:49:59.597442 systemd[1]: Started cri-containerd-dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d.scope - libcontainer container dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d. May 27 17:49:59.610319 systemd[1]: Started cri-containerd-a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7.scope - libcontainer container a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7. May 27 17:49:59.660300 containerd[1565]: time="2025-05-27T17:49:59.659565693Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-6fkcq,Uid:1513fc78-c3ee-47f4-9d52-8cedb71687f3,Namespace:kube-system,Attempt:0,} returns sandbox id \"dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d\"" May 27 17:49:59.665577 containerd[1565]: time="2025-05-27T17:49:59.665555135Z" level=info msg="CreateContainer within sandbox \"dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 17:49:59.673247 containerd[1565]: time="2025-05-27T17:49:59.672833180Z" level=info msg="Container 32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba: CDI devices from CRI Config.CDIDevices: []" May 27 17:49:59.679669 containerd[1565]: time="2025-05-27T17:49:59.679602966Z" level=info msg="CreateContainer within sandbox \"dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba\"" May 27 17:49:59.680872 containerd[1565]: time="2025-05-27T17:49:59.680195202Z" level=info msg="StartContainer for \"32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba\"" May 27 17:49:59.681645 containerd[1565]: time="2025-05-27T17:49:59.681616091Z" level=info msg="connecting to shim 32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba" address="unix:///run/containerd/s/9274756ec7a029b1587b7283dbfe7f4c5c7d473759902de4d580f2059f8c30bf" protocol=ttrpc version=3 May 27 17:49:59.702573 systemd[1]: Started cri-containerd-32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba.scope - libcontainer container 
32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba. May 27 17:49:59.716628 containerd[1565]: time="2025-05-27T17:49:59.716603579Z" level=info msg="StartContainer for \"a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7\" returns successfully" May 27 17:49:59.746160 containerd[1565]: time="2025-05-27T17:49:59.746111107Z" level=info msg="StartContainer for \"32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba\" returns successfully" May 27 17:50:00.451292 kubelet[2865]: I0527 17:50:00.451245 2865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:50:00.459914 kubelet[2865]: I0527 17:50:00.459875 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-6fkcq" podStartSLOduration=38.459863204 podStartE2EDuration="38.459863204s" podCreationTimestamp="2025-05-27 17:49:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:50:00.459466424 +0000 UTC m=+45.311345073" watchObservedRunningTime="2025-05-27 17:50:00.459863204 +0000 UTC m=+45.311741853" May 27 17:50:00.488250 kubelet[2865]: I0527 17:50:00.487692 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-9f4d94d96-v64xk" podStartSLOduration=29.487676417 podStartE2EDuration="29.487676417s" podCreationTimestamp="2025-05-27 17:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 17:50:00.486552659 +0000 UTC m=+45.338431309" watchObservedRunningTime="2025-05-27 17:50:00.487676417 +0000 UTC m=+45.339555067" May 27 17:50:01.169542 systemd-networkd[1475]: caliae64cf244bc: Gained IPv6LL May 27 17:50:01.361333 systemd-networkd[1475]: cali055d8f4640b: Gained IPv6LL May 27 17:50:01.453015 kubelet[2865]: I0527 17:50:01.452919 2865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:50:01.623650 kubelet[2865]: I0527 17:50:01.623592 2865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:50:02.228161 systemd-networkd[1475]: vxlan.calico: Link UP May 27 17:50:02.228171 systemd-networkd[1475]: vxlan.calico: Gained carrier May 27 17:50:03.170410 containerd[1565]: time="2025-05-27T17:50:03.170345545Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:03.177938 containerd[1565]: time="2025-05-27T17:50:03.177835070Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512" May 27 17:50:03.265158 containerd[1565]: time="2025-05-27T17:50:03.265127428Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:03.269729 containerd[1565]: time="2025-05-27T17:50:03.268205823Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:03.271064 containerd[1565]: time="2025-05-27T17:50:03.271046473Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id 
\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 4.138781943s" May 27 17:50:03.271142 containerd[1565]: time="2025-05-27T17:50:03.271128406Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\"" May 27 17:50:03.346205 containerd[1565]: time="2025-05-27T17:50:03.345978768Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 27 17:50:03.433976 containerd[1565]: time="2025-05-27T17:50:03.433948199Z" level=info msg="CreateContainer within sandbox \"f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 27 17:50:03.461878 containerd[1565]: time="2025-05-27T17:50:03.460958761Z" level=info msg="Container 72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948: CDI devices from CRI Config.CDIDevices: []" May 27 17:50:03.466872 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3024148189.mount: Deactivated successfully. May 27 17:50:03.479698 containerd[1565]: time="2025-05-27T17:50:03.479675126Z" level=info msg="CreateContainer within sandbox \"f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\"" May 27 17:50:03.480238 containerd[1565]: time="2025-05-27T17:50:03.480183184Z" level=info msg="StartContainer for \"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\"" May 27 17:50:03.483935 containerd[1565]: time="2025-05-27T17:50:03.483910198Z" level=info msg="connecting to shim 72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948" address="unix:///run/containerd/s/03a36c6892702378f76d4e6c9339c2df5ebf5af778a45cee9e012041a4d36021" protocol=ttrpc version=3 May 27 17:50:03.571441 systemd[1]: Started cri-containerd-72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948.scope - libcontainer container 72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948. 
May 27 17:50:03.622531 containerd[1565]: time="2025-05-27T17:50:03.621592564Z" level=info msg="StartContainer for \"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" returns successfully" May 27 17:50:04.178598 systemd-networkd[1475]: vxlan.calico: Gained IPv6LL May 27 17:50:04.542499 kubelet[2865]: I0527 17:50:04.542439 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-cb6f4656f-6r5lk" podStartSLOduration=24.836967113 podStartE2EDuration="30.534972947s" podCreationTimestamp="2025-05-27 17:49:34 +0000 UTC" firstStartedPulling="2025-05-27 17:49:57.621204843 +0000 UTC m=+42.473083493" lastFinishedPulling="2025-05-27 17:50:03.319210678 +0000 UTC m=+48.171089327" observedRunningTime="2025-05-27 17:50:04.533740598 +0000 UTC m=+49.385619267" watchObservedRunningTime="2025-05-27 17:50:04.534972947 +0000 UTC m=+49.386851607" May 27 17:50:04.606673 containerd[1565]: time="2025-05-27T17:50:04.606632428Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"d58c30d4a48ab1a9d012d923a4b23f734e685b3d9582940afbcc6d615b29025a\" pid:5165 exited_at:{seconds:1748368204 nanos:606010798}" May 27 17:50:05.708105 containerd[1565]: time="2025-05-27T17:50:05.708042355Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:05.709052 containerd[1565]: time="2025-05-27T17:50:05.709001975Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390" May 27 17:50:05.710104 containerd[1565]: time="2025-05-27T17:50:05.710038079Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:05.711443 containerd[1565]: time="2025-05-27T17:50:05.711405952Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:05.711812 containerd[1565]: time="2025-05-27T17:50:05.711783477Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 2.365773632s" May 27 17:50:05.711853 containerd[1565]: time="2025-05-27T17:50:05.711812101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\"" May 27 17:50:05.715752 containerd[1565]: time="2025-05-27T17:50:05.715719893Z" level=info msg="CreateContainer within sandbox \"0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 27 17:50:05.734238 containerd[1565]: time="2025-05-27T17:50:05.734188894Z" level=info msg="Container f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040: CDI devices from CRI Config.CDIDevices: []" May 27 17:50:05.737638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount600699377.mount: Deactivated successfully. 
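The startup-latency entry for calico-kube-controllers just above is internally consistent with the SLO definition that excludes image-pull time: pull time = lastFinishedPulling - firstStartedPulling = 17:50:03.319210678 - 17:49:57.621204843 = 5.698005835s, and podStartE2EDuration minus that pull time is 30.534972947s - 5.698005835s = 24.836967112s, which matches the reported podStartSLOduration of 24.836967113 up to a nanosecond of rounding. The coredns entries earlier report firstStartedPulling and lastFinishedPulling as the zero time because their images were already present on the node, which is why their SLO and E2E durations coincide.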
May 27 17:50:05.744436 containerd[1565]: time="2025-05-27T17:50:05.744392397Z" level=info msg="CreateContainer within sandbox \"0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040\"" May 27 17:50:05.744894 containerd[1565]: time="2025-05-27T17:50:05.744876341Z" level=info msg="StartContainer for \"f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040\"" May 27 17:50:05.746146 containerd[1565]: time="2025-05-27T17:50:05.746067123Z" level=info msg="connecting to shim f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040" address="unix:///run/containerd/s/da5f70ebad77abf285ef40b81453bb5a29a4090c0d74da4b3a9687671089e4e1" protocol=ttrpc version=3 May 27 17:50:05.765342 systemd[1]: Started cri-containerd-f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040.scope - libcontainer container f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040. May 27 17:50:05.796856 containerd[1565]: time="2025-05-27T17:50:05.796797435Z" level=info msg="StartContainer for \"f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040\" returns successfully" May 27 17:50:05.798900 containerd[1565]: time="2025-05-27T17:50:05.798854374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\"" May 27 17:50:07.956266 containerd[1565]: time="2025-05-27T17:50:07.956207237Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:07.957190 containerd[1565]: time="2025-05-27T17:50:07.957044670Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639" May 27 17:50:07.957885 containerd[1565]: time="2025-05-27T17:50:07.957862706Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:07.959613 containerd[1565]: time="2025-05-27T17:50:07.959560346Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 17:50:07.960209 containerd[1565]: time="2025-05-27T17:50:07.959947458Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 2.160442889s" May 27 17:50:07.960209 containerd[1565]: time="2025-05-27T17:50:07.959973687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\"" May 27 17:50:07.961333 containerd[1565]: time="2025-05-27T17:50:07.961313629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:50:07.964013 containerd[1565]: time="2025-05-27T17:50:07.963728856Z" level=info msg="CreateContainer within sandbox \"0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28\" for container 
&ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 27 17:50:07.972039 containerd[1565]: time="2025-05-27T17:50:07.972008814Z" level=info msg="Container 46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a: CDI devices from CRI Config.CDIDevices: []" May 27 17:50:07.977778 containerd[1565]: time="2025-05-27T17:50:07.977753848Z" level=info msg="CreateContainer within sandbox \"0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a\"" May 27 17:50:07.978417 containerd[1565]: time="2025-05-27T17:50:07.978402348Z" level=info msg="StartContainer for \"46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a\"" May 27 17:50:07.979794 containerd[1565]: time="2025-05-27T17:50:07.979746167Z" level=info msg="connecting to shim 46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a" address="unix:///run/containerd/s/da5f70ebad77abf285ef40b81453bb5a29a4090c0d74da4b3a9687671089e4e1" protocol=ttrpc version=3 May 27 17:50:08.005190 systemd[1]: Started cri-containerd-46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a.scope - libcontainer container 46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a. May 27 17:50:08.067737 containerd[1565]: time="2025-05-27T17:50:08.067656022Z" level=info msg="StartContainer for \"46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a\" returns successfully" May 27 17:50:08.277688 containerd[1565]: time="2025-05-27T17:50:08.277542258Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:50:08.281015 containerd[1565]: time="2025-05-27T17:50:08.278631652Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:50:08.281108 containerd[1565]: time="2025-05-27T17:50:08.278675835Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:50:08.282498 kubelet[2865]: E0527 17:50:08.282432 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:50:08.283616 kubelet[2865]: E0527 17:50:08.283579 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:50:08.291522 kubelet[2865]: E0527 17:50:08.290546 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:679f3a53bfbe4d5cacd8f8aefe926c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:50:08.304405 containerd[1565]: time="2025-05-27T17:50:08.304354945Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:50:08.518733 kubelet[2865]: I0527 17:50:08.518622 2865 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 27 17:50:08.520807 kubelet[2865]: I0527 17:50:08.520720 2865 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 27 17:50:08.526645 kubelet[2865]: I0527 17:50:08.526237 2865 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-5zw22" podStartSLOduration=24.211568352 podStartE2EDuration="34.526208674s" podCreationTimestamp="2025-05-27 17:49:34 +0000 UTC" firstStartedPulling="2025-05-27 17:49:57.645967398 +0000 UTC m=+42.497846048" lastFinishedPulling="2025-05-27 17:50:07.960607721 +0000 UTC m=+52.812486370" observedRunningTime="2025-05-27 17:50:08.525295449 +0000 UTC m=+53.377174109" watchObservedRunningTime="2025-05-27 17:50:08.526208674 +0000 UTC m=+53.378087323" May 27 17:50:08.556457 kubelet[2865]: I0527 17:50:08.556289 2865 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" May 27 17:50:08.603190 containerd[1565]: time="2025-05-27T17:50:08.602864972Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:50:08.604425 containerd[1565]: time="2025-05-27T17:50:08.604364732Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:50:08.604580 containerd[1565]: time="2025-05-27T17:50:08.604553484Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:50:08.605151 kubelet[2865]: E0527 17:50:08.605128 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:50:08.605271 kubelet[2865]: E0527 17:50:08.605254 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:50:08.605495 kubelet[2865]: E0527 17:50:08.605460 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:50:08.607254 kubelet[2865]: E0527 17:50:08.607214 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:50:10.235983 containerd[1565]: time="2025-05-27T17:50:10.235844694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 
27 17:50:10.551706 containerd[1565]: time="2025-05-27T17:50:10.551453575Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:50:10.552530 containerd[1565]: time="2025-05-27T17:50:10.552486964Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:50:10.552696 containerd[1565]: time="2025-05-27T17:50:10.552563347Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:50:10.552762 kubelet[2865]: E0527 17:50:10.552722 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:50:10.553087 kubelet[2865]: E0527 17:50:10.552770 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:50:10.553087 kubelet[2865]: E0527 17:50:10.552934 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs4qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fq45k_calico-system(8b985871-4559-42e8-9cf3-7b26e6ec2b9f): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:50:10.554417 kubelet[2865]: E0527 17:50:10.554383 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:50:21.237563 kubelet[2865]: E0527 17:50:21.237348 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:50:21.398490 kubelet[2865]: I0527 17:50:21.398333 2865 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 17:50:24.233192 kubelet[2865]: E0527 17:50:24.233107 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:50:27.526356 containerd[1565]: time="2025-05-27T17:50:27.526283619Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"ae254e48ff0762686261ebab2e11574026290bad05935512b403a9ac4b12c484\" pid:5297 exited_at:{seconds:1748368227 nanos:525855158}" May 27 17:50:33.243518 containerd[1565]: time="2025-05-27T17:50:33.243486413Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:50:33.549396 containerd[1565]: time="2025-05-27T17:50:33.549099470Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:50:33.550728 containerd[1565]: time="2025-05-27T17:50:33.550625754Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:50:33.550862 containerd[1565]: time="2025-05-27T17:50:33.550831107Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:50:33.551185 kubelet[2865]: E0527 17:50:33.551143 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:50:33.551576 kubelet[2865]: E0527 17:50:33.551280 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:50:33.552045 kubelet[2865]: E0527 17:50:33.552015 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:679f3a53bfbe4d5cacd8f8aefe926c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:50:33.554493 containerd[1565]: time="2025-05-27T17:50:33.554297219Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:50:33.868399 containerd[1565]: time="2025-05-27T17:50:33.867504585Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:50:33.869129 containerd[1565]: time="2025-05-27T17:50:33.868750594Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:50:33.869129 containerd[1565]: time="2025-05-27T17:50:33.868958884Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:50:33.869369 kubelet[2865]: E0527 17:50:33.869132 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:50:33.869369 kubelet[2865]: E0527 17:50:33.869194 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:50:33.869369 kubelet[2865]: E0527 17:50:33.869339 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:50:33.870775 kubelet[2865]: E0527 17:50:33.870673 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:50:34.530116 containerd[1565]: time="2025-05-27T17:50:34.530074055Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"2e9b9b06ccf55c22383ca172819fb9f7dd9682f468f112ccae126299ab6a6c17\" pid:5321 exited_at:{seconds:1748368234 nanos:529624294}" May 27 17:50:37.245384 containerd[1565]: time="2025-05-27T17:50:37.245231898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:50:37.558321 containerd[1565]: time="2025-05-27T17:50:37.558145062Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:50:37.559496 containerd[1565]: time="2025-05-27T17:50:37.559435355Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:50:37.559652 containerd[1565]: time="2025-05-27T17:50:37.559564306Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:50:37.560062 kubelet[2865]: E0527 17:50:37.559883 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:50:37.560062 kubelet[2865]: E0527 17:50:37.560026 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:50:37.560438 kubelet[2865]: E0527 17:50:37.560267 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs4qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fq45k_calico-system(8b985871-4559-42e8-9cf3-7b26e6ec2b9f): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:50:37.562376 kubelet[2865]: E0527 17:50:37.562311 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:50:46.234937 kubelet[2865]: E0527 17:50:46.234885 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:50:53.233164 kubelet[2865]: E0527 17:50:53.232856 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:50:57.235896 kubelet[2865]: E0527 17:50:57.235711 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:50:57.507875 containerd[1565]: 
time="2025-05-27T17:50:57.507697293Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"025f752a6c15636ea1f7f2766ccf546083d50faf2f633a8ab339f5bce46e8a33\" pid:5352 exited_at:{seconds:1748368257 nanos:507421466}" May 27 17:51:00.407384 containerd[1565]: time="2025-05-27T17:51:00.407316489Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"f1897d0acba90fc054218507e2195d6d4529e0fd0a8b1336d760a23e7a785378\" pid:5378 exited_at:{seconds:1748368260 nanos:406369106}" May 27 17:51:04.537379 containerd[1565]: time="2025-05-27T17:51:04.537345090Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"7f0acb8e3d0d91b2a459f29683c22baa3eeefddfaf08284541c31ce4d3ce04e1\" pid:5400 exited_at:{seconds:1748368264 nanos:537083440}" May 27 17:51:05.263531 kubelet[2865]: E0527 17:51:05.263462 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:51:08.233208 kubelet[2865]: E0527 17:51:08.233165 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:51:16.232572 kubelet[2865]: E0527 17:51:16.232506 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:51:20.233710 containerd[1565]: time="2025-05-27T17:51:20.233592765Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:51:20.555684 containerd[1565]: time="2025-05-27T17:51:20.555542233Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:51:20.556859 containerd[1565]: time="2025-05-27T17:51:20.556822136Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:51:20.556945 containerd[1565]: time="2025-05-27T17:51:20.556931261Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:51:20.557143 kubelet[2865]: E0527 17:51:20.557095 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:51:20.557143 kubelet[2865]: E0527 17:51:20.557150 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:51:20.557892 kubelet[2865]: E0527 17:51:20.557285 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:679f3a53bfbe4d5cacd8f8aefe926c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:51:20.560270 containerd[1565]: time="2025-05-27T17:51:20.560197025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:51:20.855672 containerd[1565]: time="2025-05-27T17:51:20.855516771Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:51:20.856852 containerd[1565]: time="2025-05-27T17:51:20.856764875Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:51:20.857050 containerd[1565]: time="2025-05-27T17:51:20.856874251Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:51:20.857134 kubelet[2865]: E0527 17:51:20.857087 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch 
anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:51:20.857277 kubelet[2865]: E0527 17:51:20.857152 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:51:20.857422 kubelet[2865]: E0527 17:51:20.857327 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:51:20.858820 kubelet[2865]: E0527 17:51:20.858747 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": 
failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:51:27.489583 containerd[1565]: time="2025-05-27T17:51:27.489524144Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"cc4ced71ec61a423956b236ca2f49e95ecf0825867981053904bee474c0a9052\" pid:5433 exited_at:{seconds:1748368287 nanos:489107701}" May 27 17:51:29.233611 containerd[1565]: time="2025-05-27T17:51:29.233368227Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:51:29.541005 containerd[1565]: time="2025-05-27T17:51:29.540856371Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:51:29.542381 containerd[1565]: time="2025-05-27T17:51:29.542295682Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:51:29.542570 containerd[1565]: time="2025-05-27T17:51:29.542409366Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:51:29.542629 kubelet[2865]: E0527 17:51:29.542555 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:51:29.542629 kubelet[2865]: E0527 17:51:29.542614 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:51:29.543077 kubelet[2865]: E0527 17:51:29.542783 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs4qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fq45k_calico-system(8b985871-4559-42e8-9cf3-7b26e6ec2b9f): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:51:29.544349 kubelet[2865]: E0527 17:51:29.544303 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:51:34.544278 containerd[1565]: time="2025-05-27T17:51:34.544071073Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"afc684013d3f11eec105393dedde6fbd24ba318e69abab496a55723f174a2898\" pid:5470 exited_at:{seconds:1748368294 nanos:543929086}" May 27 17:51:35.235262 kubelet[2865]: E0527 17:51:35.235175 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:51:43.235256 kubelet[2865]: E0527 17:51:43.235017 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:51:49.238686 kubelet[2865]: E0527 17:51:49.238198 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:51:57.233095 kubelet[2865]: E0527 17:51:57.233040 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:51:57.476882 containerd[1565]: time="2025-05-27T17:51:57.476829039Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"e6e1a6ef685f36dff986ce69a077bd07fc8d6b5094b92564cb57d34c10fff03c\" pid:5502 exited_at:{seconds:1748368317 nanos:476304135}" May 27 17:52:00.233426 kubelet[2865]: E0527 17:52:00.233295 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:52:00.401403 containerd[1565]: time="2025-05-27T17:52:00.401361207Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"2c0836e978d9ac3c4ae16edf1a34daf5b8f0d18de75939c10702e9abab62fed9\" pid:5526 exited_at:{seconds:1748368320 nanos:401159719}" May 27 17:52:04.525673 containerd[1565]: time="2025-05-27T17:52:04.525611739Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"c333fb63e079dda93c02f31ebf971b65a2df323e8f60787f517254bd52c92d9a\" pid:5550 exited_at:{seconds:1748368324 nanos:525310704}" May 27 17:52:08.234515 kubelet[2865]: E0527 17:52:08.233920 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:52:12.233687 kubelet[2865]: E0527 17:52:12.233557 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:52:22.232729 kubelet[2865]: E0527 17:52:22.232655 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:52:25.233396 kubelet[2865]: E0527 17:52:25.233341 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:52:27.532207 containerd[1565]: time="2025-05-27T17:52:27.532151616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"cc0e9a5d7f287b1df9fef3c30186e8fa2550a0d41d7ea0eb35d7ff0d0ee2c1d8\" pid:5579 exited_at:{seconds:1748368347 nanos:521179687}" May 27 17:52:34.541017 containerd[1565]: time="2025-05-27T17:52:34.540961086Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"56e432dd0c3e9e18072c60a0e8d19e96c1f0e1ec59ebb67af88cff3d44a5b1dc\" pid:5602 exited_at:{seconds:1748368354 nanos:540676212}" May 27 17:52:36.249001 kubelet[2865]: E0527 17:52:36.248912 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:52:38.234544 kubelet[2865]: E0527 17:52:38.234423 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:52:50.234110 containerd[1565]: time="2025-05-27T17:52:50.234000712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 17:52:50.533522 containerd[1565]: time="2025-05-27T17:52:50.533287655Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:52:50.535344 containerd[1565]: time="2025-05-27T17:52:50.535263557Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:52:50.535656 containerd[1565]: time="2025-05-27T17:52:50.535393071Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 17:52:50.535768 kubelet[2865]: E0527 17:52:50.535561 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:52:50.535768 kubelet[2865]: E0527 17:52:50.535619 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 17:52:50.536951 kubelet[2865]: E0527 17:52:50.535890 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:679f3a53bfbe4d5cacd8f8aefe926c7f,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed 
to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:52:50.537110 containerd[1565]: time="2025-05-27T17:52:50.536479256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 17:52:50.839924 containerd[1565]: time="2025-05-27T17:52:50.839750595Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:52:50.841504 containerd[1565]: time="2025-05-27T17:52:50.841391731Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:52:50.841814 containerd[1565]: time="2025-05-27T17:52:50.841511766Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 17:52:50.842018 kubelet[2865]: E0527 17:52:50.841694 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:52:50.842018 kubelet[2865]: E0527 17:52:50.841758 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 17:52:50.842418 kubelet[2865]: E0527 17:52:50.842043 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xs4qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fq45k_calico-system(8b985871-4559-42e8-9cf3-7b26e6ec2b9f): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:52:50.844065 kubelet[2865]: E0527 17:52:50.843805 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:52:50.844369 containerd[1565]: time="2025-05-27T17:52:50.844306524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 17:52:51.187372 containerd[1565]: time="2025-05-27T17:52:51.187273506Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 17:52:51.188864 containerd[1565]: time="2025-05-27T17:52:51.188768649Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 17:52:51.189564 containerd[1565]: time="2025-05-27T17:52:51.188896098Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 17:52:51.189643 kubelet[2865]: E0527 17:52:51.189147 2865 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:52:51.189643 kubelet[2865]: E0527 17:52:51.189256 2865 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 17:52:51.189643 kubelet[2865]: E0527 17:52:51.189427 2865 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdf7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-7c68764d8d-q7mcw_calico-system(ba03e825-a044-4684-ba50-40a1a4351879): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 17:52:51.190993 kubelet[2865]: E0527 17:52:51.190897 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:52:57.479025 containerd[1565]: time="2025-05-27T17:52:57.478807080Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"b86467a90906505e38d23c8d8b219128e98beff4ff9d89c2b2f8817380fa0bff\" pid:5633 exited_at:{seconds:1748368377 nanos:478375151}" May 27 17:53:00.401274 containerd[1565]: time="2025-05-27T17:53:00.401142504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"2d0efe66c6679e275872eba3b51776311b5ade28efeed6109245c67d0630cdc9\" pid:5658 exited_at:{seconds:1748368380 nanos:400838956}" May 27 17:53:02.233847 kubelet[2865]: E0527 17:53:02.233628 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:53:03.233816 kubelet[2865]: E0527 17:53:03.233556 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:53:04.520880 containerd[1565]: time="2025-05-27T17:53:04.520837536Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"127064fba17fbf717e688679161af1123c0b6c6257365553553bb35c40bf7a59\" pid:5694 exited_at:{seconds:1748368384 nanos:520547922}" May 27 17:53:14.234464 kubelet[2865]: E0527 17:53:14.234370 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:53:15.235966 kubelet[2865]: E0527 17:53:15.235785 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:53:27.489003 containerd[1565]: time="2025-05-27T17:53:27.488948822Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"e38c2526fa3068cee4158296b7876684070bf65657ecde3f368044949e1ed63a\" pid:5727 exited_at:{seconds:1748368407 nanos:488471198}" May 27 17:53:28.233639 kubelet[2865]: E0527 17:53:28.233577 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:53:29.239377 kubelet[2865]: E0527 17:53:29.239289 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:53:34.546729 containerd[1565]: time="2025-05-27T17:53:34.546636562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"66dca28ea866305cc640ed38c23a056876e074ca7ffc512c60fc7b94fe198089\" pid:5752 exited_at:{seconds:1748368414 nanos:546310101}" May 27 17:53:42.234556 kubelet[2865]: E0527 17:53:42.234377 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:53:43.234763 kubelet[2865]: E0527 17:53:43.233396 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:53:54.462299 update_engine[1538]: I20250527 17:53:54.462239 1538 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 27 17:53:54.462299 update_engine[1538]: I20250527 17:53:54.462288 1538 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 27 17:53:54.464101 update_engine[1538]: I20250527 17:53:54.464072 1538 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 27 17:53:54.465415 update_engine[1538]: I20250527 17:53:54.465361 1538 omaha_request_params.cc:62] Current group set to alpha May 27 17:53:54.465779 update_engine[1538]: I20250527 17:53:54.465683 1538 update_attempter.cc:499] Already updated boot flags. Skipping. May 27 17:53:54.465779 update_engine[1538]: I20250527 17:53:54.465703 1538 update_attempter.cc:643] Scheduling an action processor start. 
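Every pull failure in the entries above dies at the same step: containerd asks the ghcr.io token endpoint for an anonymous pull token for the flatcar/calico/* repositories and gets 403 Forbidden back, so no manifest or layer is ever fetched. Below is a minimal sketch (not part of the log) that replays the same anonymous token request with the URL exactly as logged; whether it still returns 403 depends on the registry side (a private or deleted package, or a blocked client), not on anything in the node's kubelet or containerd configuration.

    import urllib.request, urllib.error

    # Token endpoint and scope copied verbatim from the failing goldmane pulls above.
    url = ("https://ghcr.io/token"
           "?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io")
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            print(resp.status, resp.read()[:200])
    except urllib.error.HTTPError as err:
        print(err.code, err.reason)   # the node in this log sees: 403 Forbidden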
May 27 17:53:54.465779 update_engine[1538]: I20250527 17:53:54.465728 1538 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 17:53:54.465779 update_engine[1538]: I20250527 17:53:54.465764 1538 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 27 17:53:54.465864 update_engine[1538]: I20250527 17:53:54.465812 1538 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 17:53:54.465864 update_engine[1538]: I20250527 17:53:54.465819 1538 omaha_request_action.cc:272] Request: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: May 27 17:53:54.465864 update_engine[1538]: I20250527 17:53:54.465824 1538 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:53:54.480250 update_engine[1538]: I20250527 17:53:54.478858 1538 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:53:54.480555 update_engine[1538]: I20250527 17:53:54.480509 1538 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:53:54.480623 locksmithd[1594]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 27 17:53:54.482698 update_engine[1538]: E20250527 17:53:54.482647 1538 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:53:54.482759 update_engine[1538]: I20250527 17:53:54.482726 1538 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 27 17:53:55.246415 kubelet[2865]: E0527 17:53:55.246343 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:53:56.233531 kubelet[2865]: E0527 17:53:56.233464 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status 
from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:53:57.536097 containerd[1565]: time="2025-05-27T17:53:57.536038657Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"2b96dac118fb02256846be34d6cdf8f7fa1e27d2f9220e6c7d98839720fe2e35\" pid:5776 exited_at:{seconds:1748368437 nanos:520877656}" May 27 17:54:00.402069 containerd[1565]: time="2025-05-27T17:54:00.402008356Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"1e7c557652eea27af0f1e08217e8babde1772035796ade396c1a28585c4e99a0\" pid:5801 exited_at:{seconds:1748368440 nanos:401740665}" May 27 17:54:04.334473 update_engine[1538]: I20250527 17:54:04.334403 1538 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:54:04.334819 update_engine[1538]: I20250527 17:54:04.334615 1538 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:54:04.334867 update_engine[1538]: I20250527 17:54:04.334849 1538 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:54:04.335284 update_engine[1538]: E20250527 17:54:04.335251 1538 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:54:04.335336 update_engine[1538]: I20250527 17:54:04.335314 1538 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 27 17:54:04.527253 containerd[1565]: time="2025-05-27T17:54:04.527186313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"7522cae82c6f3bc345a303a8bb7cece69c8cab8e3f58ebd3e67a5995b2ab3c41\" pid:5822 exited_at:{seconds:1748368444 nanos:521915820}" May 27 17:54:07.233414 kubelet[2865]: E0527 17:54:07.233062 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:54:07.460665 systemd[1]: Started sshd@7-157.180.123.17:22-139.178.89.65:49360.service - OpenSSH per-connection server daemon (139.178.89.65:49360). 
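The TaskExit events interleaved with the pull errors are not the failing containers crashing. For container 70ddb3e5906d2ff3... the exited_at timestamps in the entries above are exactly 30 seconds apart, which is the signature of a short-lived exec (for example a periodic exec probe) finishing inside a container that keeps running; the goldmane container itself never starts, so these exits belong to a different, healthy container that is only identified by ID here. A quick check of the spacing, using the epoch seconds copied from the log:

    # exited_at values (seconds) copied from the TaskExit events for container
    # 70ddb3e5906d2ff3... above; the constant 30 s gaps point to a periodic exec,
    # not to the container itself dying.
    ts = [1748368287, 1748368317, 1748368347, 1748368377, 1748368407, 1748368437]
    print([b - a for a, b in zip(ts, ts[1:])])   # -> [30, 30, 30, 30, 30]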
May 27 17:54:08.233670 kubelet[2865]: E0527 17:54:08.233622 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:54:08.467423 sshd[5835]: Accepted publickey for core from 139.178.89.65 port 49360 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:08.470163 sshd-session[5835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:08.475954 systemd-logind[1534]: New session 8 of user core. May 27 17:54:08.479372 systemd[1]: Started session-8.scope - Session 8 of User core. May 27 17:54:09.690635 sshd[5837]: Connection closed by 139.178.89.65 port 49360 May 27 17:54:09.690841 sshd-session[5835]: pam_unix(sshd:session): session closed for user core May 27 17:54:09.696198 systemd[1]: sshd@7-157.180.123.17:22-139.178.89.65:49360.service: Deactivated successfully. May 27 17:54:09.699964 systemd[1]: session-8.scope: Deactivated successfully. May 27 17:54:09.702509 systemd-logind[1534]: Session 8 logged out. Waiting for processes to exit. May 27 17:54:09.704695 systemd-logind[1534]: Removed session 8. 
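The recurring "Back-off pulling image" messages follow kubelet's image pull backoff rather than fresh failures each time: after a failed pull the wait before the next real pull attempt roughly doubles, and the cached error is re-logged on every pod sync in between, which is why the same text repeats every ten seconds to a few minutes for as long as ghcr.io keeps returning 403. A minimal sketch of that schedule, assuming the commonly documented kubelet defaults of a 10 s initial period and a 300 s cap (assumed defaults, not values read from this node):

    # Doubling backoff with assumed kubelet defaults (10 s start, 300 s ceiling).
    def backoff_schedule(initial=10, cap=300, attempts=8):
        delay, schedule = initial, []
        for _ in range(attempts):
            schedule.append(delay)
            delay = min(delay * 2, cap)
        return schedule

    print(backoff_schedule())   # [10, 20, 40, 80, 160, 300, 300, 300]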
May 27 17:54:10.730694 containerd[1565]: time="2025-05-27T17:54:10.707045194Z" level=warning msg="container event discarded" container=9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a type=CONTAINER_CREATED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765706127Z" level=warning msg="container event discarded" container=9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a type=CONTAINER_STARTED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765748006Z" level=warning msg="container event discarded" container=701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895 type=CONTAINER_CREATED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765756521Z" level=warning msg="container event discarded" container=701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895 type=CONTAINER_STARTED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765761621Z" level=warning msg="container event discarded" container=302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6 type=CONTAINER_CREATED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765767162Z" level=warning msg="container event discarded" container=21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef type=CONTAINER_CREATED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765773453Z" level=warning msg="container event discarded" container=6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448 type=CONTAINER_CREATED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765778713Z" level=warning msg="container event discarded" container=6d5f736f852b8b27a94de15abaa8040cd49cba370c27e7218f1b84e905099448 type=CONTAINER_STARTED_EVENT May 27 17:54:10.765767 containerd[1565]: time="2025-05-27T17:54:10.765785976Z" level=warning msg="container event discarded" container=7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88 type=CONTAINER_CREATED_EVENT May 27 17:54:10.819080 containerd[1565]: time="2025-05-27T17:54:10.818856969Z" level=warning msg="container event discarded" container=21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef type=CONTAINER_STARTED_EVENT May 27 17:54:10.858392 containerd[1565]: time="2025-05-27T17:54:10.858339786Z" level=warning msg="container event discarded" container=302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6 type=CONTAINER_STARTED_EVENT May 27 17:54:10.876561 containerd[1565]: time="2025-05-27T17:54:10.876507688Z" level=warning msg="container event discarded" container=7c3877e48b3b0c07ab61d1983fe9a6683d468c0ac14df2779cfa3870aee3fe88 type=CONTAINER_STARTED_EVENT May 27 17:54:14.331953 update_engine[1538]: I20250527 17:54:14.331885 1538 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:54:14.332343 update_engine[1538]: I20250527 17:54:14.332129 1538 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:54:14.332374 update_engine[1538]: I20250527 17:54:14.332347 1538 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 27 17:54:14.332850 update_engine[1538]: E20250527 17:54:14.332816 1538 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:54:14.332894 update_engine[1538]: I20250527 17:54:14.332861 1538 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 27 17:54:14.855002 systemd[1]: Started sshd@8-157.180.123.17:22-139.178.89.65:54616.service - OpenSSH per-connection server daemon (139.178.89.65:54616). May 27 17:54:15.870037 sshd[5850]: Accepted publickey for core from 139.178.89.65 port 54616 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:15.874712 sshd-session[5850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:15.883004 systemd-logind[1534]: New session 9 of user core. May 27 17:54:15.888481 systemd[1]: Started session-9.scope - Session 9 of User core. May 27 17:54:16.724494 sshd[5854]: Connection closed by 139.178.89.65 port 54616 May 27 17:54:16.731658 sshd-session[5850]: pam_unix(sshd:session): session closed for user core May 27 17:54:16.750419 systemd[1]: sshd@8-157.180.123.17:22-139.178.89.65:54616.service: Deactivated successfully. May 27 17:54:16.760296 systemd[1]: session-9.scope: Deactivated successfully. May 27 17:54:16.764284 systemd-logind[1534]: Session 9 logged out. Waiting for processes to exit. May 27 17:54:16.769259 systemd-logind[1534]: Removed session 9. May 27 17:54:16.895601 systemd[1]: Started sshd@9-157.180.123.17:22-139.178.89.65:54620.service - OpenSSH per-connection server daemon (139.178.89.65:54620). May 27 17:54:17.897120 sshd[5868]: Accepted publickey for core from 139.178.89.65 port 54620 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:17.898722 sshd-session[5868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:17.904780 systemd-logind[1534]: New session 10 of user core. May 27 17:54:17.909375 systemd[1]: Started session-10.scope - Session 10 of User core. May 27 17:54:18.695024 sshd[5870]: Connection closed by 139.178.89.65 port 54620 May 27 17:54:18.695811 sshd-session[5868]: pam_unix(sshd:session): session closed for user core May 27 17:54:18.701407 systemd[1]: sshd@9-157.180.123.17:22-139.178.89.65:54620.service: Deactivated successfully. May 27 17:54:18.704570 systemd[1]: session-10.scope: Deactivated successfully. May 27 17:54:18.706428 systemd-logind[1534]: Session 10 logged out. Waiting for processes to exit. May 27 17:54:18.709660 systemd-logind[1534]: Removed session 10. May 27 17:54:18.862772 systemd[1]: Started sshd@10-157.180.123.17:22-139.178.89.65:54622.service - OpenSSH per-connection server daemon (139.178.89.65:54622). 
May 27 17:54:19.276623 kubelet[2865]: E0527 17:54:19.276568 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:54:19.846615 sshd[5880]: Accepted publickey for core from 139.178.89.65 port 54622 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:19.848186 sshd-session[5880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:19.853730 systemd-logind[1534]: New session 11 of user core. May 27 17:54:19.858386 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 17:54:20.609270 sshd[5882]: Connection closed by 139.178.89.65 port 54622 May 27 17:54:20.610035 sshd-session[5880]: pam_unix(sshd:session): session closed for user core May 27 17:54:20.613954 systemd[1]: sshd@10-157.180.123.17:22-139.178.89.65:54622.service: Deactivated successfully. May 27 17:54:20.616080 systemd[1]: session-11.scope: Deactivated successfully. May 27 17:54:20.618084 systemd-logind[1534]: Session 11 logged out. Waiting for processes to exit. May 27 17:54:20.619812 systemd-logind[1534]: Removed session 11. May 27 17:54:22.755599 containerd[1565]: time="2025-05-27T17:54:22.755484875Z" level=warning msg="container event discarded" container=cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c type=CONTAINER_CREATED_EVENT May 27 17:54:22.755599 containerd[1565]: time="2025-05-27T17:54:22.755559074Z" level=warning msg="container event discarded" container=cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c type=CONTAINER_STARTED_EVENT May 27 17:54:23.058246 containerd[1565]: time="2025-05-27T17:54:23.058091311Z" level=warning msg="container event discarded" container=78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca type=CONTAINER_CREATED_EVENT May 27 17:54:23.058246 containerd[1565]: time="2025-05-27T17:54:23.058133842Z" level=warning msg="container event discarded" container=78a6b25eb41607f8e844ae700c85070ec34f6a4fa028c7f0b203206b9475fbca type=CONTAINER_STARTED_EVENT May 27 17:54:23.082319 containerd[1565]: time="2025-05-27T17:54:23.082282374Z" level=warning msg="container event discarded" container=415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984 type=CONTAINER_CREATED_EVENT May 27 17:54:23.131350 containerd[1565]: time="2025-05-27T17:54:23.131298907Z" level=warning msg="container event discarded" container=415326fef1c9c6314109f06dc2f684ca7881cb9b9387576ce495a92c1225d984 type=CONTAINER_STARTED_EVENT May 27 17:54:23.243483 kubelet[2865]: E0527 17:54:23.243411 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous 
token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:54:24.331070 update_engine[1538]: I20250527 17:54:24.330984 1538 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:54:24.331446 update_engine[1538]: I20250527 17:54:24.331243 1538 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:54:24.331524 update_engine[1538]: I20250527 17:54:24.331490 1538 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:54:24.331938 update_engine[1538]: E20250527 17:54:24.331904 1538 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:54:24.332071 update_engine[1538]: I20250527 17:54:24.331944 1538 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 17:54:24.332071 update_engine[1538]: I20250527 17:54:24.331952 1538 omaha_request_action.cc:617] Omaha request response: May 27 17:54:24.332071 update_engine[1538]: E20250527 17:54:24.332029 1538 omaha_request_action.cc:636] Omaha request network transfer failed. May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334312 1538 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334328 1538 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334334 1538 update_attempter.cc:306] Processing Done. May 27 17:54:24.335087 update_engine[1538]: E20250527 17:54:24.334348 1538 update_attempter.cc:619] Update failed. May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334352 1538 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334357 1538 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334360 1538 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334427 1538 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334446 1538 omaha_request_action.cc:271] Posting an Omaha request to disabled May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334450 1538 omaha_request_action.cc:272] Request: May 27 17:54:24.335087 update_engine[1538]: May 27 17:54:24.335087 update_engine[1538]: May 27 17:54:24.335087 update_engine[1538]: May 27 17:54:24.335087 update_engine[1538]: May 27 17:54:24.335087 update_engine[1538]: May 27 17:54:24.335087 update_engine[1538]: May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334455 1538 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334565 1538 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 27 17:54:24.335087 update_engine[1538]: I20250527 17:54:24.334932 1538 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 27 17:54:24.335730 update_engine[1538]: E20250527 17:54:24.335685 1538 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 27 17:54:24.335777 update_engine[1538]: I20250527 17:54:24.335736 1538 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 27 17:54:24.335777 update_engine[1538]: I20250527 17:54:24.335742 1538 omaha_request_action.cc:617] Omaha request response: May 27 17:54:24.335777 update_engine[1538]: I20250527 17:54:24.335747 1538 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 17:54:24.335777 update_engine[1538]: I20250527 17:54:24.335750 1538 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 27 17:54:24.335777 update_engine[1538]: I20250527 17:54:24.335754 1538 update_attempter.cc:306] Processing Done. May 27 17:54:24.335777 update_engine[1538]: I20250527 17:54:24.335758 1538 update_attempter.cc:310] Error event sent. May 27 17:54:24.338451 update_engine[1538]: I20250527 17:54:24.336935 1538 update_check_scheduler.cc:74] Next update check in 45m40s May 27 17:54:24.338744 locksmithd[1594]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 27 17:54:24.338744 locksmithd[1594]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 27 17:54:25.426485 containerd[1565]: time="2025-05-27T17:54:25.426421642Z" level=warning msg="container event discarded" container=e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d type=CONTAINER_CREATED_EVENT May 27 17:54:25.473667 containerd[1565]: time="2025-05-27T17:54:25.473626271Z" level=warning msg="container event discarded" container=e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d type=CONTAINER_STARTED_EVENT May 27 17:54:25.783659 systemd[1]: Started sshd@11-157.180.123.17:22-139.178.89.65:34786.service - OpenSSH per-connection server daemon (139.178.89.65:34786). May 27 17:54:26.758426 sshd[5901]: Accepted publickey for core from 139.178.89.65 port 34786 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:26.759830 sshd-session[5901]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:26.764745 systemd-logind[1534]: New session 12 of user core. 
May 27 17:54:26.769344 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 17:54:27.487480 containerd[1565]: time="2025-05-27T17:54:27.487434733Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"362019ba6fb35c4b178ef88f422b62645e6eef3b53651d9c7261e1de5d55e1d3\" pid:5924 exit_status:1 exited_at:{seconds:1748368467 nanos:487087252}" May 27 17:54:27.493310 sshd[5903]: Connection closed by 139.178.89.65 port 34786 May 27 17:54:27.493388 sshd-session[5901]: pam_unix(sshd:session): session closed for user core May 27 17:54:27.498329 systemd[1]: sshd@11-157.180.123.17:22-139.178.89.65:34786.service: Deactivated successfully. May 27 17:54:27.500632 systemd[1]: session-12.scope: Deactivated successfully. May 27 17:54:27.502456 systemd-logind[1534]: Session 12 logged out. Waiting for processes to exit. May 27 17:54:27.504102 systemd-logind[1534]: Removed session 12. May 27 17:54:32.660933 systemd[1]: Started sshd@12-157.180.123.17:22-139.178.89.65:34790.service - OpenSSH per-connection server daemon (139.178.89.65:34790). May 27 17:54:33.237792 kubelet[2865]: E0527 17:54:33.237706 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:54:33.665188 sshd[5942]: Accepted publickey for core from 139.178.89.65 port 34790 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:33.666514 sshd-session[5942]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:33.674519 systemd-logind[1534]: New session 13 of user core. May 27 17:54:33.680443 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 17:54:34.388028 containerd[1565]: time="2025-05-27T17:54:34.387895049Z" level=warning msg="container event discarded" container=79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8 type=CONTAINER_CREATED_EVENT May 27 17:54:34.388028 containerd[1565]: time="2025-05-27T17:54:34.387960140Z" level=warning msg="container event discarded" container=79a9a74fe021f9513f2a50bc7653984b86f49d7fab05deb68c5cc847c62a98d8 type=CONTAINER_STARTED_EVENT May 27 17:54:34.433329 sshd[5944]: Connection closed by 139.178.89.65 port 34790 May 27 17:54:34.433962 sshd-session[5942]: pam_unix(sshd:session): session closed for user core May 27 17:54:34.438953 systemd[1]: sshd@12-157.180.123.17:22-139.178.89.65:34790.service: Deactivated successfully. May 27 17:54:34.441868 systemd[1]: session-13.scope: Deactivated successfully. May 27 17:54:34.444883 systemd-logind[1534]: Session 13 logged out. Waiting for processes to exit. May 27 17:54:34.447686 systemd-logind[1534]: Removed session 13. 
May 27 17:54:34.545541 containerd[1565]: time="2025-05-27T17:54:34.545503819Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"c48affc124d5bd37b5b437115097832c571bc334ba2805e1c88d5e3c713b02a8\" pid:5968 exited_at:{seconds:1748368474 nanos:545256837}" May 27 17:54:34.876771 containerd[1565]: time="2025-05-27T17:54:34.876702453Z" level=warning msg="container event discarded" container=74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3 type=CONTAINER_CREATED_EVENT May 27 17:54:34.876948 containerd[1565]: time="2025-05-27T17:54:34.876923116Z" level=warning msg="container event discarded" container=74d2d6f5705a1cbf153c09249e5f4d2df5359f4dbf6f7e21893020fb31e2c3a3 type=CONTAINER_STARTED_EVENT May 27 17:54:37.113350 containerd[1565]: time="2025-05-27T17:54:37.113242748Z" level=warning msg="container event discarded" container=36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78 type=CONTAINER_CREATED_EVENT May 27 17:54:37.195005 containerd[1565]: time="2025-05-27T17:54:37.194925352Z" level=warning msg="container event discarded" container=36431077d2afa3d933b514aacaf20b74817865ba00ee94d0cf6a48dcb25d5a78 type=CONTAINER_STARTED_EVENT May 27 17:54:38.234930 kubelet[2865]: E0527 17:54:38.234841 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:54:38.963723 containerd[1565]: time="2025-05-27T17:54:38.963625843Z" level=warning msg="container event discarded" container=cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7 type=CONTAINER_CREATED_EVENT May 27 17:54:39.014609 containerd[1565]: time="2025-05-27T17:54:39.014528546Z" level=warning msg="container event discarded" container=cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7 type=CONTAINER_STARTED_EVENT May 27 17:54:39.127028 containerd[1565]: time="2025-05-27T17:54:39.126916914Z" level=warning msg="container event discarded" container=cc483a90b9f1c799db8a63aad897ce4f2bd83c5840abe8ce9fc5fde0f3ee8cc7 type=CONTAINER_STOPPED_EVENT May 27 17:54:39.606171 systemd[1]: Started sshd@13-157.180.123.17:22-139.178.89.65:33816.service - OpenSSH per-connection server daemon (139.178.89.65:33816). 
May 27 17:54:40.601275 sshd[5978]: Accepted publickey for core from 139.178.89.65 port 33816 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:40.602890 sshd-session[5978]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:40.607303 systemd-logind[1534]: New session 14 of user core. May 27 17:54:40.612340 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 17:54:41.358753 sshd[5994]: Connection closed by 139.178.89.65 port 33816 May 27 17:54:41.359730 sshd-session[5978]: pam_unix(sshd:session): session closed for user core May 27 17:54:41.363916 systemd-logind[1534]: Session 14 logged out. Waiting for processes to exit. May 27 17:54:41.364677 systemd[1]: sshd@13-157.180.123.17:22-139.178.89.65:33816.service: Deactivated successfully. May 27 17:54:41.367768 systemd[1]: session-14.scope: Deactivated successfully. May 27 17:54:41.369572 systemd-logind[1534]: Removed session 14. May 27 17:54:41.525414 systemd[1]: Started sshd@14-157.180.123.17:22-139.178.89.65:33822.service - OpenSSH per-connection server daemon (139.178.89.65:33822). May 27 17:54:42.498652 sshd[6006]: Accepted publickey for core from 139.178.89.65 port 33822 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:42.501096 sshd-session[6006]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:42.511551 systemd-logind[1534]: New session 15 of user core. May 27 17:54:42.518340 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 17:54:43.471934 sshd[6008]: Connection closed by 139.178.89.65 port 33822 May 27 17:54:43.473235 sshd-session[6006]: pam_unix(sshd:session): session closed for user core May 27 17:54:43.477511 systemd[1]: sshd@14-157.180.123.17:22-139.178.89.65:33822.service: Deactivated successfully. May 27 17:54:43.478686 containerd[1565]: time="2025-05-27T17:54:43.478639101Z" level=warning msg="container event discarded" container=3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241 type=CONTAINER_CREATED_EVENT May 27 17:54:43.480057 systemd[1]: session-15.scope: Deactivated successfully. May 27 17:54:43.483188 systemd-logind[1534]: Session 15 logged out. Waiting for processes to exit. May 27 17:54:43.484477 systemd-logind[1534]: Removed session 15. May 27 17:54:43.528104 containerd[1565]: time="2025-05-27T17:54:43.528029770Z" level=warning msg="container event discarded" container=3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241 type=CONTAINER_STARTED_EVENT May 27 17:54:43.641975 systemd[1]: Started sshd@15-157.180.123.17:22-139.178.89.65:57132.service - OpenSSH per-connection server daemon (139.178.89.65:57132). May 27 17:54:43.931711 containerd[1565]: time="2025-05-27T17:54:43.931648898Z" level=warning msg="container event discarded" container=3012ee1e8d29b4ebc4ecd2ab003d555d8d8e4e4fd9cdfd74b8dd5202b6c8e241 type=CONTAINER_STOPPED_EVENT May 27 17:54:44.631839 sshd[6018]: Accepted publickey for core from 139.178.89.65 port 57132 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:44.633438 sshd-session[6018]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:44.638068 systemd-logind[1534]: New session 16 of user core. May 27 17:54:44.643391 systemd[1]: Started session-16.scope - Session 16 of User core. 
May 27 17:54:46.181836 sshd[6020]: Connection closed by 139.178.89.65 port 57132 May 27 17:54:46.186346 sshd-session[6018]: pam_unix(sshd:session): session closed for user core May 27 17:54:46.192773 systemd[1]: sshd@15-157.180.123.17:22-139.178.89.65:57132.service: Deactivated successfully. May 27 17:54:46.194342 systemd[1]: session-16.scope: Deactivated successfully. May 27 17:54:46.195649 systemd-logind[1534]: Session 16 logged out. Waiting for processes to exit. May 27 17:54:46.196925 systemd-logind[1534]: Removed session 16. May 27 17:54:46.354059 systemd[1]: Started sshd@16-157.180.123.17:22-139.178.89.65:57140.service - OpenSSH per-connection server daemon (139.178.89.65:57140). May 27 17:54:47.362825 sshd[6037]: Accepted publickey for core from 139.178.89.65 port 57140 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:47.364781 sshd-session[6037]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:47.369780 systemd-logind[1534]: New session 17 of user core. May 27 17:54:47.373358 systemd[1]: Started session-17.scope - Session 17 of User core. May 27 17:54:48.271171 kubelet[2865]: E0527 17:54:48.271117 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:54:48.326235 sshd[6039]: Connection closed by 139.178.89.65 port 57140 May 27 17:54:48.327123 sshd-session[6037]: pam_unix(sshd:session): session closed for user core May 27 17:54:48.331507 systemd[1]: sshd@16-157.180.123.17:22-139.178.89.65:57140.service: Deactivated successfully. May 27 17:54:48.333599 systemd[1]: session-17.scope: Deactivated successfully. May 27 17:54:48.335446 systemd-logind[1534]: Session 17 logged out. Waiting for processes to exit. May 27 17:54:48.337143 systemd-logind[1534]: Removed session 17. May 27 17:54:48.499758 systemd[1]: Started sshd@17-157.180.123.17:22-139.178.89.65:57142.service - OpenSSH per-connection server daemon (139.178.89.65:57142). May 27 17:54:49.507107 sshd[6056]: Accepted publickey for core from 139.178.89.65 port 57142 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:49.510021 sshd-session[6056]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:49.518763 systemd-logind[1534]: New session 18 of user core. May 27 17:54:49.527433 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 17:54:50.265930 sshd[6058]: Connection closed by 139.178.89.65 port 57142 May 27 17:54:50.266916 sshd-session[6056]: pam_unix(sshd:session): session closed for user core May 27 17:54:50.272984 systemd-logind[1534]: Session 18 logged out. Waiting for processes to exit. May 27 17:54:50.273113 systemd[1]: sshd@17-157.180.123.17:22-139.178.89.65:57142.service: Deactivated successfully. May 27 17:54:50.276035 systemd[1]: session-18.scope: Deactivated successfully. May 27 17:54:50.279410 systemd-logind[1534]: Removed session 18. 
May 27 17:54:51.798816 containerd[1565]: time="2025-05-27T17:54:51.798737040Z" level=warning msg="container event discarded" container=70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0 type=CONTAINER_CREATED_EVENT May 27 17:54:51.951230 containerd[1565]: time="2025-05-27T17:54:51.951164386Z" level=warning msg="container event discarded" container=70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0 type=CONTAINER_STARTED_EVENT May 27 17:54:52.235740 kubelet[2865]: E0527 17:54:52.235680 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:54:53.386574 containerd[1565]: time="2025-05-27T17:54:53.386513760Z" level=warning msg="container event discarded" container=d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34 type=CONTAINER_CREATED_EVENT May 27 17:54:53.386574 containerd[1565]: time="2025-05-27T17:54:53.386559767Z" level=warning msg="container event discarded" container=d4f0583df76aebe81b6b68cd8b3d216c78763a313d21eac9272dff74ae095c34 type=CONTAINER_STARTED_EVENT May 27 17:54:55.437514 systemd[1]: Started sshd@18-157.180.123.17:22-139.178.89.65:33894.service - OpenSSH per-connection server daemon (139.178.89.65:33894). 
May 27 17:54:55.513603 containerd[1565]: time="2025-05-27T17:54:55.490419506Z" level=warning msg="container event discarded" container=90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f type=CONTAINER_CREATED_EVENT May 27 17:54:55.513603 containerd[1565]: time="2025-05-27T17:54:55.513555530Z" level=warning msg="container event discarded" container=90414ce2b1120c9ec2383c53e2e3a6d28c1fa76098fd12c6891c3fd58617cb7f type=CONTAINER_STARTED_EVENT May 27 17:54:56.425627 containerd[1565]: time="2025-05-27T17:54:56.425514672Z" level=warning msg="container event discarded" container=1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac type=CONTAINER_CREATED_EVENT May 27 17:54:56.425627 containerd[1565]: time="2025-05-27T17:54:56.425601636Z" level=warning msg="container event discarded" container=1ddf36c2976f3684bd155197bd8667c21478b2792b862b16ebb62dbd527da4ac type=CONTAINER_STARTED_EVENT May 27 17:54:56.435019 sshd[6075]: Accepted publickey for core from 139.178.89.65 port 33894 ssh2: RSA SHA256:qwIifqwsO4XluOtZIsqbrQnpEfm6LyQvxlgpLQEO90E May 27 17:54:56.436916 sshd-session[6075]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 17:54:56.445078 systemd-logind[1534]: New session 19 of user core. May 27 17:54:56.450514 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 17:54:57.194895 sshd[6077]: Connection closed by 139.178.89.65 port 33894 May 27 17:54:57.195782 sshd-session[6075]: pam_unix(sshd:session): session closed for user core May 27 17:54:57.201936 systemd[1]: sshd@18-157.180.123.17:22-139.178.89.65:33894.service: Deactivated successfully. May 27 17:54:57.205559 systemd[1]: session-19.scope: Deactivated successfully. May 27 17:54:57.207405 systemd-logind[1534]: Session 19 logged out. Waiting for processes to exit. May 27 17:54:57.210098 systemd-logind[1534]: Removed session 19. 
May 27 17:54:57.610629 containerd[1565]: time="2025-05-27T17:54:57.610320418Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"ab11eb462f631735b90062d9cad1c001ed1f32ca918094f9c30a5c8f3cf6010a\" pid:6100 exited_at:{seconds:1748368497 nanos:610018391}" May 27 17:54:57.629455 containerd[1565]: time="2025-05-27T17:54:57.629422084Z" level=warning msg="container event discarded" container=f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38 type=CONTAINER_CREATED_EVENT May 27 17:54:57.629455 containerd[1565]: time="2025-05-27T17:54:57.629446871Z" level=warning msg="container event discarded" container=f0a259e941c5ce73a80470c831a01e67308ce3ae8025de5b2586d1e933506f38 type=CONTAINER_STARTED_EVENT May 27 17:54:57.654688 containerd[1565]: time="2025-05-27T17:54:57.654658251Z" level=warning msg="container event discarded" container=0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28 type=CONTAINER_CREATED_EVENT May 27 17:54:57.654688 containerd[1565]: time="2025-05-27T17:54:57.654682096Z" level=warning msg="container event discarded" container=0d4c46dd32b324d996dd4dec47a460426bb9c6aeaae517f341399ef5e81adb28 type=CONTAINER_STARTED_EVENT May 27 17:54:57.751066 containerd[1565]: time="2025-05-27T17:54:57.751001480Z" level=warning msg="container event discarded" container=656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34 type=CONTAINER_CREATED_EVENT May 27 17:54:57.751066 containerd[1565]: time="2025-05-27T17:54:57.751055391Z" level=warning msg="container event discarded" container=656ed912db74aff5447cd72acc340017d0b00e9a703d24af0c936d92bd774f34 type=CONTAINER_STARTED_EVENT May 27 17:54:57.780388 containerd[1565]: time="2025-05-27T17:54:57.780216148Z" level=warning msg="container event discarded" container=5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919 type=CONTAINER_CREATED_EVENT May 27 17:54:57.834608 containerd[1565]: time="2025-05-27T17:54:57.834549153Z" level=warning msg="container event discarded" container=5629b5164be4cef55f62a9402b8f98d6606d66fe6976406e46d4ef8b04a98919 type=CONTAINER_STARTED_EVENT May 27 17:54:58.864438 containerd[1565]: time="2025-05-27T17:54:58.864352528Z" level=warning msg="container event discarded" container=a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139 type=CONTAINER_CREATED_EVENT May 27 17:54:58.956694 containerd[1565]: time="2025-05-27T17:54:58.956638508Z" level=warning msg="container event discarded" container=a195da45e312328d3aa4c0df7a7fdda094405bc53d16f05a41533c33157fb139 type=CONTAINER_STARTED_EVENT May 27 17:54:59.546456 containerd[1565]: time="2025-05-27T17:54:59.546388321Z" level=warning msg="container event discarded" container=d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd type=CONTAINER_CREATED_EVENT May 27 17:54:59.546456 containerd[1565]: time="2025-05-27T17:54:59.546439388Z" level=warning msg="container event discarded" container=d35d4c60e0023eadcf8b1ee04531b572804ca51c597f2ebb934ce305ab300ddd type=CONTAINER_STARTED_EVENT May 27 17:54:59.586806 containerd[1565]: time="2025-05-27T17:54:59.586669988Z" level=warning msg="container event discarded" container=a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7 type=CONTAINER_CREATED_EVENT May 27 17:54:59.670480 containerd[1565]: time="2025-05-27T17:54:59.670397077Z" level=warning msg="container event discarded" container=dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d type=CONTAINER_CREATED_EVENT May 
27 17:54:59.670480 containerd[1565]: time="2025-05-27T17:54:59.670468981Z" level=warning msg="container event discarded" container=dbdb3e9fda2d561c19162076347b65b88f3a17357ca15950123485b4d9d9f21d type=CONTAINER_STARTED_EVENT May 27 17:54:59.689682 containerd[1565]: time="2025-05-27T17:54:59.689644456Z" level=warning msg="container event discarded" container=32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba type=CONTAINER_CREATED_EVENT May 27 17:54:59.725963 containerd[1565]: time="2025-05-27T17:54:59.725900593Z" level=warning msg="container event discarded" container=a65bdb2ed18bfc56dd7244b51fb78312a5b3f8c6cc54205817732ae66fb86cc7 type=CONTAINER_STARTED_EVENT May 27 17:54:59.756288 containerd[1565]: time="2025-05-27T17:54:59.756184434Z" level=warning msg="container event discarded" container=32826426ccfb961f246e0c007ccda95c8593d5343dcea181acb97a3718e44aba type=CONTAINER_STARTED_EVENT May 27 17:55:00.238790 kubelet[2865]: E0527 17:55:00.238750 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:55:00.408998 containerd[1565]: time="2025-05-27T17:55:00.408960212Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"f1422269ad8a96baaba4f3a8334779d9e7800d6814d5f72ef40c093f8b1680a6\" pid:6124 exited_at:{seconds:1748368500 nanos:408780185}" May 27 17:55:03.489209 containerd[1565]: time="2025-05-27T17:55:03.489152763Z" level=warning msg="container event discarded" container=72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948 type=CONTAINER_CREATED_EVENT May 27 17:55:03.630343 containerd[1565]: time="2025-05-27T17:55:03.630276251Z" level=warning msg="container event discarded" container=72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948 type=CONTAINER_STARTED_EVENT May 27 17:55:04.531289 containerd[1565]: time="2025-05-27T17:55:04.531141464Z" level=info msg="TaskExit event in podsandbox handler container_id:\"72c2e0ae0af1e51d52a1d2b4ee60cf6b8986602fcf3e79ed24901f24d1cc6948\" id:\"139c512d4b9b1dd0b669659c8614a66dc2825a266f6f70348364459330666e30\" pid:6149 exited_at:{seconds:1748368504 nanos:530016396}" May 27 17:55:05.754605 containerd[1565]: time="2025-05-27T17:55:05.754533963Z" level=warning msg="container event discarded" container=f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040 type=CONTAINER_CREATED_EVENT May 27 17:55:05.806893 containerd[1565]: time="2025-05-27T17:55:05.806795467Z" level=warning msg="container event discarded" container=f9c0731559011273999402a8e81e4a64a7380d9c202c90d5df2b6bae8c4f4040 type=CONTAINER_STARTED_EVENT May 27 17:55:07.233845 kubelet[2865]: E0527 17:55:07.233471 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:55:07.987909 containerd[1565]: time="2025-05-27T17:55:07.987843754Z" level=warning msg="container event discarded" container=46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a type=CONTAINER_CREATED_EVENT May 27 17:55:08.077791 containerd[1565]: time="2025-05-27T17:55:08.070181308Z" level=warning msg="container event discarded" container=46644a54b91cfdfb9268ef5c08209861710b261eba243646050a84c525024f3a type=CONTAINER_STARTED_EVENT May 27 17:55:14.232995 kubelet[2865]: E0527 17:55:14.232857 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:55:22.251287 kubelet[2865]: E0527 17:55:22.251194 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-7c68764d8d-q7mcw" podUID="ba03e825-a044-4684-ba50-40a1a4351879" May 27 17:55:22.408336 kubelet[2865]: E0527 17:55:22.408270 2865 controller.go:195] "Failed to update lease" err="Put 
\"https://157.180.123.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-0-0-a-c8f0a3e630?timeout=10s\": context deadline exceeded" May 27 17:55:22.861714 kubelet[2865]: E0527 17:55:22.861645 2865 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:56380->10.0.0.2:2379: read: connection timed out" May 27 17:55:23.254043 systemd[1]: cri-containerd-21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef.scope: Deactivated successfully. May 27 17:55:23.254462 systemd[1]: cri-containerd-21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef.scope: Consumed 4.391s CPU time, 95.7M memory peak, 87.1M read from disk. May 27 17:55:23.352039 containerd[1565]: time="2025-05-27T17:55:23.351612279Z" level=info msg="TaskExit event in podsandbox handler container_id:\"21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef\" id:\"21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef\" pid:2696 exit_status:1 exited_at:{seconds:1748368523 nanos:331614054}" May 27 17:55:23.356598 containerd[1565]: time="2025-05-27T17:55:23.356575505Z" level=info msg="received exit event container_id:\"21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef\" id:\"21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef\" pid:2696 exit_status:1 exited_at:{seconds:1748368523 nanos:331614054}" May 27 17:55:23.433312 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef-rootfs.mount: Deactivated successfully. May 27 17:55:23.803729 systemd[1]: cri-containerd-e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d.scope: Deactivated successfully. May 27 17:55:23.805270 systemd[1]: cri-containerd-e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d.scope: Consumed 12.447s CPU time, 112.8M memory peak, 54.7M read from disk. May 27 17:55:23.810588 containerd[1565]: time="2025-05-27T17:55:23.810510986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d\" id:\"e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d\" pid:3188 exit_status:1 exited_at:{seconds:1748368523 nanos:809778344}" May 27 17:55:23.810724 containerd[1565]: time="2025-05-27T17:55:23.810643715Z" level=info msg="received exit event container_id:\"e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d\" id:\"e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d\" pid:3188 exit_status:1 exited_at:{seconds:1748368523 nanos:809778344}" May 27 17:55:23.860778 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d-rootfs.mount: Deactivated successfully. 
May 27 17:55:24.477246 kubelet[2865]: I0527 17:55:24.477182 2865 scope.go:117] "RemoveContainer" containerID="e54cf70bd99c3778a7c09887ff2f1fa1e4c9b452ff837e3e06e4a872d42c175d" May 27 17:55:24.489769 kubelet[2865]: I0527 17:55:24.489495 2865 scope.go:117] "RemoveContainer" containerID="21867166cd49e0b545cdfa7a8580ea0b46d5f02f94139d815d1336cbac9fa9ef" May 27 17:55:24.560456 containerd[1565]: time="2025-05-27T17:55:24.560409931Z" level=info msg="CreateContainer within sandbox \"cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" May 27 17:55:24.560761 containerd[1565]: time="2025-05-27T17:55:24.560508777Z" level=info msg="CreateContainer within sandbox \"701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" May 27 17:55:24.661782 containerd[1565]: time="2025-05-27T17:55:24.661748289Z" level=info msg="Container c3cfd4dcf343c381029349b81e44950541e69ce42c23236dfab76416e8d8fd0d: CDI devices from CRI Config.CDIDevices: []" May 27 17:55:24.662039 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2632793197.mount: Deactivated successfully. May 27 17:55:24.672789 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3728559165.mount: Deactivated successfully. May 27 17:55:24.676728 containerd[1565]: time="2025-05-27T17:55:24.676667492Z" level=info msg="CreateContainer within sandbox \"cbd271c3934a4c61217db7ad844a25babc9fae6884f129b6a65854f8a4fb699c\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"c3cfd4dcf343c381029349b81e44950541e69ce42c23236dfab76416e8d8fd0d\"" May 27 17:55:24.678306 containerd[1565]: time="2025-05-27T17:55:24.678261377Z" level=info msg="Container cd2a2ee626e01efb47d9579e765fdadc250e415a20f1b7dda24c8854277c86e4: CDI devices from CRI Config.CDIDevices: []" May 27 17:55:24.686332 containerd[1565]: time="2025-05-27T17:55:24.686296607Z" level=info msg="CreateContainer within sandbox \"701000cf7d3f781c59dbf1e31f9fd84606a0c14c8b65b2dfe796507a746d5895\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"cd2a2ee626e01efb47d9579e765fdadc250e415a20f1b7dda24c8854277c86e4\"" May 27 17:55:24.688488 containerd[1565]: time="2025-05-27T17:55:24.688471389Z" level=info msg="StartContainer for \"c3cfd4dcf343c381029349b81e44950541e69ce42c23236dfab76416e8d8fd0d\"" May 27 17:55:24.688748 containerd[1565]: time="2025-05-27T17:55:24.688471480Z" level=info msg="StartContainer for \"cd2a2ee626e01efb47d9579e765fdadc250e415a20f1b7dda24c8854277c86e4\"" May 27 17:55:24.689392 containerd[1565]: time="2025-05-27T17:55:24.689370623Z" level=info msg="connecting to shim cd2a2ee626e01efb47d9579e765fdadc250e415a20f1b7dda24c8854277c86e4" address="unix:///run/containerd/s/d50e86bf2cb21e0261ee68ed0cdef62428dc01dc1d7d1085c2c221529644fdcb" protocol=ttrpc version=3 May 27 17:55:24.689586 containerd[1565]: time="2025-05-27T17:55:24.689380241Z" level=info msg="connecting to shim c3cfd4dcf343c381029349b81e44950541e69ce42c23236dfab76416e8d8fd0d" address="unix:///run/containerd/s/5b6428d68c592eb8502f9c01ff20cdde4f0e4dddd3534bf0bb1d4edd87de63da" protocol=ttrpc version=3 May 27 17:55:24.724927 kubelet[2865]: E0527 17:55:24.693922 2865 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:56170->10.0.0.2:2379: read: connection timed out" 
event="&Event{ObjectMeta:{goldmane-78d55f7ddc-fq45k.1843739d38a58d66 calico-system 1925 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:calico-system,Name:goldmane-78d55f7ddc-fq45k,UID:8b985871-4559-42e8-9cf3-7b26e6ec2b9f,APIVersion:v1,ResourceVersion:827,FieldPath:spec.containers{goldmane},},Reason:BackOff,Message:Back-off pulling image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\",Source:EventSource{Component:kubelet,Host:ci-4344-0-0-a-c8f0a3e630,},FirstTimestamp:2025-05-27 17:49:59 +0000 UTC,LastTimestamp:2025-05-27 17:55:14.232724463 +0000 UTC m=+359.084603144,Count:21,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-0-0-a-c8f0a3e630,}" May 27 17:55:24.726362 systemd[1]: Started cri-containerd-cd2a2ee626e01efb47d9579e765fdadc250e415a20f1b7dda24c8854277c86e4.scope - libcontainer container cd2a2ee626e01efb47d9579e765fdadc250e415a20f1b7dda24c8854277c86e4. May 27 17:55:24.729140 systemd[1]: Started cri-containerd-c3cfd4dcf343c381029349b81e44950541e69ce42c23236dfab76416e8d8fd0d.scope - libcontainer container c3cfd4dcf343c381029349b81e44950541e69ce42c23236dfab76416e8d8fd0d. May 27 17:55:24.779630 containerd[1565]: time="2025-05-27T17:55:24.779598834Z" level=info msg="StartContainer for \"c3cfd4dcf343c381029349b81e44950541e69ce42c23236dfab76416e8d8fd0d\" returns successfully" May 27 17:55:24.803920 containerd[1565]: time="2025-05-27T17:55:24.803873838Z" level=info msg="StartContainer for \"cd2a2ee626e01efb47d9579e765fdadc250e415a20f1b7dda24c8854277c86e4\" returns successfully" May 27 17:55:25.657846 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1172240509.mount: Deactivated successfully. May 27 17:55:27.481116 containerd[1565]: time="2025-05-27T17:55:27.481079661Z" level=info msg="TaskExit event in podsandbox handler container_id:\"70ddb3e5906d2ff3f84bb9249ff4cce437881aa2d1921a2cbd1edd83f0dcb9b0\" id:\"e1757355673df70128aa094ec66e2b65bede2c5f2b42c84d2f585ec5a9894c9a\" pid:6273 exited_at:{seconds:1748368527 nanos:480456133}" May 27 17:55:28.965964 systemd[1]: cri-containerd-302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6.scope: Deactivated successfully. May 27 17:55:28.966443 systemd[1]: cri-containerd-302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6.scope: Consumed 3.464s CPU time, 42.5M memory peak, 50.5M read from disk. May 27 17:55:28.968988 containerd[1565]: time="2025-05-27T17:55:28.968847597Z" level=info msg="TaskExit event in podsandbox handler container_id:\"302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6\" id:\"302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6\" pid:2690 exit_status:1 exited_at:{seconds:1748368528 nanos:968477844}" May 27 17:55:28.977343 containerd[1565]: time="2025-05-27T17:55:28.977264271Z" level=info msg="received exit event container_id:\"302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6\" id:\"302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6\" pid:2690 exit_status:1 exited_at:{seconds:1748368528 nanos:968477844}" May 27 17:55:29.003019 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6-rootfs.mount: Deactivated successfully. 
May 27 17:55:29.249977 kubelet[2865]: E0527 17:55:29.247712 2865 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fq45k" podUID="8b985871-4559-42e8-9cf3-7b26e6ec2b9f" May 27 17:55:29.492302 kubelet[2865]: I0527 17:55:29.492209 2865 scope.go:117] "RemoveContainer" containerID="302b85b05373335fa61629b02ec4219e120ffef55b045c988ce6ebe347f4e0f6" May 27 17:55:29.494536 containerd[1565]: time="2025-05-27T17:55:29.494493140Z" level=info msg="CreateContainer within sandbox \"9943345f5151c7643cdfff39141b42be5cd764524c9c46a8e518dd350f28fd0a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" May 27 17:55:29.503308 containerd[1565]: time="2025-05-27T17:55:29.501817217Z" level=info msg="Container 2504c10a9acc5a471e681f1cc4eb97ac3838e6a0407adb8b76d0a9f9dcb214d9: CDI devices from CRI Config.CDIDevices: []"